WorldWideScience

Sample records for biosystem analyzing technology

  1. Achievement report for fiscal 1997 on the development of technologies for utilizing biological resources such as complex biosystems. Development of complex biosystem analyzing technology; 1997 nendo fukugo seibutsukei nado seibutsu shigen riyo gijutsu kaihatsu seika hokokusho. Fukugo seibutsukei kaiseki gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    The aim is to utilize the sophisticated functions of complex biosystems. In the research and development of technologies for effectively utilizing unexploited resources and substances such as seaweeds and algae, seaweeds are added to seawater and left for two weeks to yield a microbial suspension; the suspension is then spread on a carrageenan culture medium, and carrageenan-decomposing microbes are obtained. In the research and development of technologies for utilizing microbe/fauna-flora complex systems, technologies for exploring and analyzing microbes are studied. For this purpose, 48 kinds of sponges and 300 kinds of bacteria symbiotic with the sponges are sampled in Malaysia; 15 of them exhibit enzyme inhibition and Artemia salina lethality activities. In the development of technologies for analyzing the functions of microbes engaged in the production of useful resources and substances for animals and plants, 150 kinds of micro-algae are subjected to screening using protease- and chitinase-inhibiting activities as the indexes, and it is found that an extract of Isochrysis galbana displays an intense inhibitory activity. The alga is cultured in quantity, the active component is isolated from 20 g of dried alga, and its constitution is determined. (NEDO)

  2. Biosystems Engineering in Portugal

    OpenAIRE

    Marques da Silva, José Rafael; Silva, Luis Leopoldo; Cruz, Vasco Fitas

    2008-01-01

    The paper presents the definition of Biosystems Engineering in Portugal; possible revisions of the core curriculum presented in the FEANI report; the current situation of Biosystems Engineering in Portugal; the impacts of the transition to Biosystems Engineering; the need for a transition to Biosystems Engineering; and opportunities for the Biosystems Engineer in the labour market.

  3. Fiscal 1997 industrial technology R and D project. Research report on development of use technology of bio- resources such as complex biosystem (Development of use and production technologies of complex biosystems); 1997 nendo fukugo seibutsukei nado seibutsu shigen riyo gijutsu kaihatsu seika hokokusho. Fukugo seibutsukei riyo seisan gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    This project aims to establish production technology for functional substances, oil degradation and purification technology, and technology for using unused oil fractions, through development of cultivation control technology for complex biosystems. For functional material production technology, specific inhibitors of marine bacteria, anti-breeding substances of microalgae, and UV-absorbing substances were isolated as functional substances. The productivity of korormicin, a specific inhibitor of marine bacteria, was improved considerably through the cultivation method. For research on molecular genetic analysis technology, a new identification technology and a simple automatic analysis system for microeucaryotes using genes were developed. For global environment purification technology, such as efficient degradation of pollutants, studies were made on cultivation control technology for phenol-degrading consortia, the population dynamics of oil-degrading microbial consortia, and a method for remediating oil pollution with complex biosystems at low temperature, and a demonstration experiment on oil degradation was carried out. (NEDO)

  4. Performance of Identifiler Direct and PowerPlex 16 HS on the Applied Biosystems 3730 DNA Analyzer for processing biological samples archived on FTA cards.

    Science.gov (United States)

    Laurin, Nancy; DeMoors, Anick; Frégeau, Chantal

    2012-09-01

    Direct amplification of STR loci from biological samples collected on FTA cards without prior DNA purification was evaluated using Identifiler Direct and PowerPlex 16 HS in conjunction with the use of a high-throughput Applied Biosystems 3730 DNA Analyzer. In order to reduce the overall sample processing cost, reduced PCR volumes combined with various FTA disk sizes were tested. Optimized STR profiles were obtained using a 0.53 mm disk size in a 10 μL PCR volume for both STR systems. These protocols proved effective in generating high-quality profiles on the 3730 DNA Analyzer from both blood and buccal FTA samples. Reproducibility, concordance, robustness, sample stability and profile quality were assessed using a collection of blood and buccal samples on FTA cards from volunteer donors as well as from convicted offenders. The newly developed protocols offer enhanced throughput capability and cost effectiveness without compromising the robustness and quality of the STR profiles obtained. These results support the use of these protocols for processing convicted offender samples submitted to the National DNA Data Bank of Canada. Similar protocols could be applied to the processing of casework reference samples or in paternity or family relationship testing. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  5. Fiscal 1997 research report on development of use technology of bio-resources such as complex biosystem (Development of use and production technologies of complex biosystems); 1997 nendo fukugo seibutsukei nado seibutsu shigen riyo gijutsu kaihatsu seika hokokusho (fukugo seibutsukei riyo seisan gijutsu no kaihatsu)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    This report summarizes the fiscal 1997 research results on production technology for functional materials, and the results of a research study on the industrial use of substance production and decomposition by complex biosystems. For the detection of microorganisms in soil, staining by CFDA-AM was suitable and allowed visualization of sulfate-reducing bacteria during culture. For research on functional analysis technology, atomic force microscope technology was studied to detect microorganisms in the environment and observe their fine structures. For research on detection, separation and cultivation technology for difficult-to-culture microorganisms, a molecular genetic analysis method for microbial communities and a method for determining their viability were selected. For functional substance production technology, studies were made on the technical utilization of bioflocculant produced by microbial consortia as an environment-friendly oil removal reagent, and on gene transfer control in microbial consortia. For the research study for the project, a survey was made on useful substances produced by complex biosystems. (NEDO)

  6. BioSystems

    Data.gov (United States)

    U.S. Department of Health & Human Services — The NCBI BioSystems Database provides integrated access to biological systems and their component genes, proteins, and small molecules, as well as literature...

  7. Human Performance and Biosystems

    Science.gov (United States)

    2013-03-08

    Overview of topic areas: Fuel Cells; Artificial Photosynthesis; Human Performance/Biosystems; Photo-Electro-Magnetic Stimulation of... (1) Electronic transport in bacterial nanowires was demonstrated using nanofabrication-enabled approaches. (2) Identified the biophysical... bacterial nanowires and outer-membrane vesicles enhancing the electron transfer and respiration of individual cells. Outlook: the first demonstration

  8. Industrial biosystems engineering and biorefinery systems.

    Science.gov (United States)

    Chen, Shulin

    2008-06-01

    The concept of Industrial Biosystems Engineering (IBsE) was suggested as a new engineering branch to be developed for meeting the needs for science, technology and professionals by the upcoming bioeconomy. With emphasis on systems, IBsE builds upon the interfaces between systems biology, bioprocessing, and systems engineering. This paper discussed the background, the suggested definition, the theoretical framework and methodologies of this new discipline as well as its challenges and future development.

  9. Phenomena of synchronized response in biosystems and the possible mechanism.

    Science.gov (United States)

    Xu, Jingjing; Yang, Fan; Han, Danhong; Xu, Shengyong

    2018-02-05

    Phenomena of synchronized response are common among organs, tissues and cells in biosystems. We have analyzed and discussed three examples of synchronization in biosystems: the direction-changing movement of paramecia, the prey behavior of flytraps, and the simultaneous discharge of electric eels. These phenomena and discussions support an electrical communication mechanism in biosystems in which the electrical signals are mainly soliton-like electromagnetic pulses, generated by the transient transmembrane ionic current through ion channels and propagated along the dielectric membrane-based soft-material waveguide network to complete synchronized responses. This transmission model implies that a uniform electrical communication mechanism might have developed naturally in biosystems. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. The NCBI BioSystems database.

    Science.gov (United States)

    Geer, Lewis Y; Marchler-Bauer, Aron; Geer, Renata C; Han, Lianyi; He, Jane; He, Siqian; Liu, Chunlei; Shi, Wenyao; Bryant, Stephen H

    2010-01-01

    The NCBI BioSystems database, found at http://www.ncbi.nlm.nih.gov/biosystems/, centralizes and cross-links existing biological systems databases, increasing their utility and target audience by integrating their pathways and systems into NCBI resources. This integration allows users of NCBI's Entrez databases to quickly categorize proteins, genes and small molecules by metabolic pathway, disease state or other BioSystem type, without requiring time-consuming inference of biological relationships from the literature or multiple experimental datasets.
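    Because BioSystems is integrated into the Entrez system, its records can in principle be queried programmatically through NCBI's E-utilities. A minimal Python sketch that only builds the query URL; treating "biosystems" as the Entrez database name for the esearch endpoint is an assumption, and the function name is illustrative:

```python
from urllib.parse import urlencode

# NCBI E-utilities search endpoint (public API).
BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def biosystems_search_url(term, retmax=20):
    """Build an esearch URL for BioSystem records matching `term`.

    Assumes "biosystems" is a valid Entrez `db` value; fetching and
    parsing the JSON response is left to the caller.
    """
    return BASE + "?" + urlencode({"db": "biosystems", "term": term,
                                   "retmax": retmax, "retmode": "json"})

print(biosystems_search_url("glycolysis"))
```

The returned URL can be fetched with any HTTP client; the response lists matching BioSystem record IDs, which can then be expanded through the related efetch/esummary endpoints.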

  11. Controlled structure and properties of silicate nanoparticle networks for incorporation of biosystem components

    International Nuclear Information System (INIS)

    Sakai-Kato, Kumiko; Kawanishi, Toru; Hasegawa, Toshiaki; Takaoka, Akio; Kato, Masaru; Toyo'oka, Toshimasa; Utsunomiya-Tate, Naoko

    2011-01-01

    Inorganic nanoparticles are of technological interest in many fields. We created silicate nanoparticle hydrogels that effectively incorporated biomolecules that are unstable and involved in complicated reactions. The size of the silicate nanoparticles strongly affected both the physical characteristics of the resulting hydrogel and the activity of biomolecules incorporated within the hydrogel. We used high-resolution transmission electron microscopy (TEM) to analyze in detail the hydrogel network patterns formed by the silicate nanoparticles. We obtained clear nanostructured images of biomolecule-nanoparticle composite hydrogels. The TEM images also showed that larger silicate nanoparticles (22 nm) formed more loosely associated silicate networks than did smaller silicate nanoparticles (7 nm). The loosely associated networks formed from larger silicate nanoparticles might facilitate substrate diffusion through the network, thus promoting the observed increased activity of the entrapped biomolecules. This doubled the activity of the incorporated biosystems compared with that of biosystems prepared by our own previously reported method. We propose a reaction scheme to explain the formation of the silicate nanoparticle networks. The successful incorporation of biomolecules into the nanoparticle hydrogels, along with the high level of activity exhibited by the biomolecules required for complicated reactions within the gels, demonstrates the nanocomposites' potential for use in medical applications.

  12. Technology for collecting and analyzing relational data

    Directory of Open Access Journals (Sweden)

    E. N. Fedorova

    2016-01-01

    To summarize the information, a data-grouping mechanism provides, for different groups of records, the number of entries and the maximum, minimum and average values. Results. The technology was tested in monitoring the requirements for additional professional education services and determining the educational needs of teachers and executives of educational organizations of the Irkutsk region. The survey involved 2,780 respondents in 36 municipalities. Creating the data model took several hours; the survey was conducted over a month. Conclusion. The proposed technology makes it possible to collect information in relational form in a short time and then analyze it without programming, with flexible assignment of the operating logic for each form.
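    The grouping mechanism described, a count plus the minimum, maximum and average values per group, can be sketched in a few lines of Python (the survey records and field names below are invented for illustration):

```python
from collections import defaultdict

def group_summary(records, key, value):
    """Group records by `key` and report count, min, max and mean of `value`."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[key]].append(rec[value])
    return {
        g: {"count": len(vals), "min": min(vals),
            "max": max(vals), "mean": sum(vals) / len(vals)}
        for g, vals in groups.items()
    }

# Hypothetical survey records; "municipality" and "age" are illustrative fields.
survey = [
    {"municipality": "A", "age": 30},
    {"municipality": "A", "age": 40},
    {"municipality": "B", "age": 50},
]
print(group_summary(survey, "municipality", "age"))
```

In a relational setting the same aggregation corresponds to a GROUP BY query; the dictionary-of-lists version above simply makes the mechanism explicit.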

  13. Analyzing Preservice Teachers' Attitudes towards Technology

    Science.gov (United States)

    Akturk, Ahmet Oguz; Izci, Kemal; Caliskan, Gurbuz; Sahin, Ismail

    2015-01-01

    Rapid developments in technology in the present age have made it necessary for communities to follow technological developments and adapt themselves to these developments. One of the fields that are most rapidly affected by these developments is undoubtedly education. Determination of the attitudes of preservice teachers, who live in an age of…

  14. The Kernel Estimation in Biosystems Engineering

    Directory of Open Access Journals (Sweden)

    Esperanza Ayuga Téllez

    2008-04-01

    In many fields of biosystems engineering, it is common to find works in which statistical information is analysed that violates the basic hypotheses necessary for conventional forecasting methods. For those situations, it is necessary to find alternative methods that allow statistical analysis in spite of those infringements. Non-parametric function estimation includes methods that fit a target function locally, using data from a small neighbourhood of the point. Weak assumptions, such as continuity and differentiability of the target function, are used rather than an "a priori" assumption of the global target function shape (e.g., linear or quadratic). In this paper a few basic decision rules are enunciated for the application of the non-parametric estimation method. These statistical rules are the first step towards building a user-method interface for the consistent application of kernel estimation by non-expert users. To reach this aim, univariate and multivariate estimation methods and density functions were analysed, as well as regression estimators. In some cases the models to be applied in different situations, based on simulations, were defined. Different biosystems engineering applications of kernel estimation are also analysed in this review.
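    As an illustration of the non-parametric approach the review surveys, a univariate Gaussian kernel density estimate can be computed directly from its definition. A minimal Python sketch (the sample data are invented, and the bandwidth follows Silverman's rule of thumb):

```python
import math

def gaussian_kde(data, x, bandwidth):
    """Kernel density estimate at point x using a Gaussian kernel."""
    n = len(data)
    s = sum(math.exp(-0.5 * ((x - xi) / bandwidth) ** 2) for xi in data)
    return s / (n * bandwidth * math.sqrt(2 * math.pi))

# Illustrative sample; Silverman's rule of thumb picks the bandwidth h.
sample = [1.0, 1.2, 0.8, 2.1, 1.9, 1.1]
n = len(sample)
mean = sum(sample) / n
sd = (sum((v - mean) ** 2 for v in sample) / (n - 1)) ** 0.5
h = 1.06 * sd * n ** (-1 / 5)
print(gaussian_kde(sample, 1.0, h))
```

The only global assumptions are smoothness of the underlying density; the shape of the estimate comes entirely from the data and the bandwidth choice, which is exactly the trade-off the paper's decision rules address.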

  15. Achievement report for fiscal 1997 on the development of technologies for utilizing biological resources such as complex biosystems. Development of technologies for producing substitute fuel for petroleum by utilizing organisms; 1997 nendo fukugo seibutsukei nado seibutsu shigen riyo gijutsu kaihatsu seika hokokusho. Seibutsu riyo sekiyu daitai nenryo seizo gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    Technologies for producing useful substances using the substance-decomposing/producing functions of complex biosystems, and methods of handling them, are developed. In the utilization of microbes in the digestive tracts of termites and longicorns, it is made clear that several kinds of termites cleave the β-O-4 ether linkage. In relation to technologies for constructing wood-decomposing complex microbial systems and developing complex vector systems, a screening system is constructed in which strains that exhibit complex actions are combined. Concerning the advanced utilization of tropical oil plants, conditions are determined for inducing callus from oil palm tissues. From oil palm sarcocarp tissues, mRNA (messenger ribonucleic acid) is isolated for the construction of a cDNA (complementary deoxyribonucleic acid) library. For the purpose of isolating a powerful promoter, a partial base sequence is determined for ubiquitin, which is frequently expressed in cells. A pathogenic bacterium afflicting the oil palm is sampled for identification, and it is inferred to be a kind of Ganoderma boninense. (NEDO)

  16. Abstracts of the 17. world congress of the International Commission of Agriculture and Biosystems Engineering (CIGR) : sustainable biosystems through engineering

    Energy Technology Data Exchange (ETDEWEB)

    Savoie, P.; Villeneuve, J.; Morisette, R. [Agriculture and Agri-Food Canada, Quebec City, PQ (Canada). Soils and Crops Research and Development Centre] (eds.)

    2010-07-01

    This international conference provided a forum to discuss methods to produce agricultural products more efficiently through improvements in engineering and technology. It was attended by engineers and scientists working from different perspectives on biosystems. Beyond food, farms and forests can provide fibre, bio-products and renewable energy. Seven sections of CIGR were organized in the following technical sessions: (1) land and water engineering, (2) farm buildings, equipment, structures and environment, (3) equipment engineering for plants, (4) energy in agriculture, (5) management, ergonomics and systems engineering, (6) post harvest technology and process engineering, and (7) information systems. The Canadian Society of Bioengineering (CSBE) merged its technical program within the 7 sections of CIGR. Four other groups also held their activities during the conference. The American Society of Agricultural and Biological Engineers (ASABE) organized its 9th international drainage symposium and the American Ecological Engineering Society (AEES) held its 10th annual meeting. The International Network for Information Technology in Agriculture (INFITA), and the 8th world congress on computers in agriculture also joined CIGR 2010.

  17. Financial options methodology for analyzing investments in new technology

    Energy Technology Data Exchange (ETDEWEB)

    Wenning, B.D. [Texas Utilities Services, Inc., Dallas, TX (United States)]

    1994-12-31

    The evaluation of investments in longer-term research and development in emerging technologies must, because of the nature of such subjects, address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies, severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options valuation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.
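    The options analogy described here can be illustrated with the standard Black-Scholes call formula, treating the present value of a project's expected cash flows as the underlying asset and the investment cost as the strike. A minimal Python sketch (the numbers are invented, and a real options valuation of an R&D program involves assumptions well beyond this closed-form proxy):

```python
import math

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def real_option_value(pv_cash_flows, investment_cost, r, sigma, t):
    """Black-Scholes call value as a proxy for a deferrable R&D investment.

    pv_cash_flows   : present value of the project's expected cash flows (the "stock")
    investment_cost : cost to commit to the project (the "strike")
    r, sigma, t     : risk-free rate, cash-flow volatility, years until the decision
    """
    d1 = (math.log(pv_cash_flows / investment_cost)
          + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return (pv_cash_flows * norm_cdf(d1)
            - investment_cost * math.exp(-r * t) * norm_cdf(d2))

# Illustrative numbers: a conventional NPV (100 - 110 < 0) would reject the
# project, yet the option to decide in two years still has substantial value.
print(round(real_option_value(100.0, 110.0, 0.05, 0.40, 2.0), 2))
```

This is the sense in which present-value methodology "penalizes" uncertain forecasts: higher volatility lowers a risk-adjusted NPV but raises the option value of a deferrable investment.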

  18. Financial options methodology for analyzing investments in new technology

    Science.gov (United States)

    Wenning, B. D.

    1995-01-01

    The evaluation of investments in longer-term research and development in emerging technologies must, because of the nature of such subjects, address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies, severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options valuation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.

  19. Financial options methodology for analyzing investments in new technology

    International Nuclear Information System (INIS)

    Wenning, B.D.

    1994-01-01

    The evaluation of investments in longer-term research and development in emerging technologies must, because of the nature of such subjects, address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies, severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options valuation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.

  20. An ISM approach for analyzing the factors in technology transfer

    Directory of Open Access Journals (Sweden)

    Mohammad Mahdavi Mazdeh

    2015-07-01

    Technology transfer from research and technology organizations (RTOs) to local industries is considered one of the important strategies for a country's industrial development. In addition to recovering the enormous costs of research and development for RTOs, successful technology transfer from RTOs to local firms builds technological foundations and develops the ability to enhance the competitiveness of firms. A better understanding of the factors influencing the process of technology transfer helps RTOs and local firms prioritize and manage their resources effectively and efficiently to maximize the success of technology transfer. This paper aims to identify the important effective factors in technology transfer from Iranian RTOs and provides a comprehensive model indicating the interactions of these factors. First, the research background is reviewed, and Cummings and Teng's model [Cummings, J. L., & Teng, B.-S. (2003). Transferring R&D knowledge: The key factors affecting knowledge transfer success. Journal of Engineering and Technology Management, 20(1-2), 39-68.] was selected as the basic model; it was modified by adding new factors identified from the literature on inter-organizational knowledge and technology transfer, and a Delphi method was applied to validate the modified model. Interpretive Structural Modeling (ISM) was then used to evaluate the relationships between the factors of the final proposed model. Results indicate twelve factors influencing the technology transfer process from Iranian RTOs to local firms, and that the intensity of absorption capability in the transferee can influence the intensity of desorption capability in the transferor.
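    The core ISM step, deriving a reachability matrix from the factors' direct-influence relations, amounts to a Boolean transitive closure of the adjacency matrix. A minimal Python sketch (the 4-factor adjacency matrix is invented for illustration; the paper's own factor set is not reproduced here):

```python
def reachability(adjacency):
    """ISM step: transitive closure (reachability matrix) of a binary
    direct-influence matrix via Warshall's algorithm, with each factor
    counted as reaching itself."""
    n = len(adjacency)
    r = [[bool(adjacency[i][j]) or i == j for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                r[i][j] = r[i][j] or (r[i][k] and r[k][j])
    return [[int(v) for v in row] for row in r]

# Hypothetical example: factor 0 influences 1, 1 influences 2, 3 influences 2.
direct = [
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 0],
    [0, 0, 1, 0],
]
print(reachability(direct))
```

From the reachability matrix, ISM then partitions factors into levels by comparing each factor's reachability set with its antecedent set; the closure above is the computational heart of that procedure.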

  1. Analyzing the Diffusion of Chinese Rice Farming Technologies in ...

    African Journals Online (AJOL)

    During the Beijing Summit of the Forum on China-Africa Cooperation in 2006, the Chinese government pledged to build 10 agro-technology demonstration centres across Africa. Since then, the figure has increased to 25 and it will likely increase again in the future. One of the main goals of the demonstration centres is to ...

  2. Lesbians and tech: Analyzing digital media technologies and lesbian experience.

    Science.gov (United States)

    Harris, Angelique; Daniels, Jessie

    2017-11-28

    The rise of the popular Internet has coincided with the increasing acceptance, even assimilation, of lesbians into mainstream society. The visible presence of lesbians in the tech industry and in digitally mediated spaces raises a set of questions about the relationship between queer identities and Internet technologies. This introduction to a special issue of Journal of Lesbian Studies explores some of these questions and provides an overview of the articles that follow.

  3. Polarized 3He Gas Circulating Technologies for Neutron Analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Watt, David [Xemed LLC, Durham, NH (United States); Hersman, Bill [Xemed LLC, Durham, NH (United States)

    2014-12-10

    We describe the development of an integrated system for quasi-continuous operation of a large volume neutron analyzer. The system consists of a non-magnetic diaphragm compressor, a prototype large volume helium polarizer, a surrogate neutron analyzer, a non-depolarizing gas storage reservoir, a non-ferrous valve manifold for handling gas distribution, a custom rubidium-vapor gas return purifier, and wire-wound transfer lines, all of which are immersed in a two-meter external magnetic field. Over the Phase II period we focused on three major tasks required for the successful deployment of these types of systems: 1) design and implementation of gas handling hardware, 2) automation for long-term operation, and 3) improvements in polarizer performance, specifically fabrication of aluminosilicate optical pumping cells. In this report we describe the design, implementation, and testing of the gas handling hardware. We describe improved polarizer performance resulting from improved cell materials and fabrication methods. These improvements yielded valved 8.5 liter cells with relaxation times greater than 12 hours. Pumping this cell with 1500 W laser power with 1.25 nm linewidth yielded peak polarizations of 60%, measured both inside and outside the polarizer. Fully narrowing this laser to 0.25 nm, demonstrated separately on one stack of the four, would have allowed 70% polarization with this cell. We demonstrated the removal of 5 liters of polarized helium from the polarizer with no measured loss of polarization. We circulated the gas through a titanium-clad compressor with polarization loss below 3% per pass. We also prepared for the next phase of development by refining the design of the polarizer so that it can be engineer-certified for pressurized operation. The performance of our system far exceeds comparable efforts elsewhere.

  4. Polarized 3He gas circulating technologies for neutron analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Watt, David W. [Xemed, LLC, Durham, NH (United States)

    2017-10-02

    We outline our project to develop a circulating polarized helium-3 system for the development of large, quasi-continuously operating neutron analyzers. The project consisted of five areas: 1) development of robust external-cavity-narrowed diode laser output with spectral line width < 0.17 nm and power of 2000 W; 2) development of large glass polarizing cells using cell surface treatments to obtain long relaxation lifetimes; 3) refinements of the circulation system with an emphasis on gas purification and materials testing; 4) design/fabrication of a new polarizer system; 5) preliminary testing of the new polarizer. 1. Developed a robust high-power narrowed laser. The optical configuration of the laser was discussed in the proposal and is reviewed in the body of this report. The external cavity is configured to mutually lock the wavelength of five 10-bar laser stacks. All the logistical milestones were met, and critical subsystems (laser stack manifold and power divider, external laser cavity, and output telescope) were assembled and tested at low power. Each individual bar is narrowed to ~0.05 nm; when combined, the laser has a cumulative spectral width of 0.17 nm across the entire beam due to variations of the bars' central wavelengths by +/- 0.1 nm, which is similar to that of Volume Bragg Grating narrowed laser bars. This configuration eliminates the free-running "pedestal" that occurs in other external cavity diode lasers. The full-scale laser was completed in 2016 and was used in both the older and newer helium polarizers. This laser was operated at 75% power for periods of up to 8 hours. Once installed, the spectrum became slightly broader (~0.25 nm) at full power; this is likely due to very slight misalignments that occurred during handling. 2. Developed the processes to create uniform sintered sol-gel coatings. Our work on cell development comprised: 1) production of large GE180 cells and exploration of different means of cell preparation, and 2) development of

  5. Fiscal 1998 industrial science and technology R and D project. Research report on R and D of genome informatics technology (Development of stable oil supply measures using complex biosystem); 1998 nendo genome informatics gijutsu kenkyu kaihtsu seika hokokusho. Fukugo seibutsukei riyo sekiyu antei kyokyu taisaku kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    This report describes the fiscal 1998 results on development of genome informatics technology. As a comparative analysis technique for genes, a combination of electrophoresis and PCR was used. To improve the throughput and reproducibility of the technique, module-shuffling primers were used and a multi(96)-arrayed capillary fragment analyzer was devised. A system for rapid detection of SNPs was also developed successfully. As analysis technology for DNA sequences using triple-stranded DNA formation, studies were made on the construction of long cDNA libraries, selective subtraction of specific sequences from libraries, and the basic technology of homologous cloning. Studies were also made on each reaction step of the IGCR technique for fast analysis, and on specifications of a fluorescence transfer monitor. As a modeling technique for genetic sequence information, simulation models were developed for gene expression regulatory networks during muscle differentiation and for feedback regulation of period genes. Support systems such as transcription factor prediction and gene regulatory network inference were developed from existing data. (NEDO)

  6. Application of ZigBee technology in the X fluorescence analyzer

    International Nuclear Information System (INIS)

    Wang Haibin; Yang Jian; Cao Bihua; Zhao Xiang

    2010-01-01

    In order to ensure safe and convenient measurement of nuclear radiation, a ZigBee-technology-based X-fluorescence analyzer is developed to be used in laboratories to make quantitative and qualitative analysis of radioactive samples. (authors)

  7. Health effects of low-dose radiation: Molecular, cellular, and biosystem response

    International Nuclear Information System (INIS)

    Pollycove, M.; Paperiello, C.J.

    1997-01-01

    Since the fifties, the prime concern of radiation protection has been protecting DNA from damage. UNSCEAR initiated a focus on biosystem response to damage with its 1994 report, "Adaptive Responses to Radiation of Cells and Organisms". The DNA damage-control biosystem operates physiologically on both metabolic and radiation-induced damage, both effected predominantly by free radicals. These adaptive responses are suppressed by high-dose and stimulated by low-dose radiation. Increased biosystem efficiency reduces the number of mutations that accumulate during a lifetime of decreasing DNA damage-control with resultant aging and malignancy. Several statistically significant epidemiologic studies have shown risk decrements of cancer mortality and mortality from all causes in populations exposed to low-dose radiation. Further biologic and epidemiologic research is needed to establish a valid threshold below which risk decrements occur. (author)

  8. Protein-based nanostructures as carriers for photo-physically active molecules in biosystems

    OpenAIRE

    Delcanale, Pietro

    2017-01-01

    In nature, many proteins function as carriers, being able to bind, transport and possibly release a ligand within a biological system. Protein-based carriers are interesting systems for drug delivery, with the remarkable advantage of being water-soluble and, as inherent components of biosystems, highly bio-compatible. This work focuses on the use of protein-based carriers for the delivery of hydrophobic photo-physically active molecules, whose structure and chemical properties lead to spontan...

  9. A Hybrid Method of Analyzing Patents for Sustainable Technology Management in Humanoid Robot Industry

    Directory of Open Access Journals (Sweden)

    Jongchan Kim

    2016-05-01

    Full Text Available A humanoid, which refers to a robot that resembles a human body, imitates a human's intelligence, behavior, senses, and interaction in order to provide various types of services to human beings. Humanoids have been studied and developed constantly in order to improve their performance. Humanoids were previously developed for simple repetitive or hard work that required significant human power. However, intelligent service robots have been developed actively these days to provide necessary information and enjoyment; these include robots manufactured for home, entertainment, and personal use. It has become generally known that artificial intelligence humanoid technology will significantly benefit civilization. On the other hand, successful Research and Development (R&D) on humanoids is possible only if they are developed in a proper direction in accordance with changes in markets and society. Therefore, it is necessary to analyze changes in technology, markets, and society in order to develop sustainable Management of Technology (MOT) strategies. In this study, patent data related to humanoids are analyzed by various data mining techniques, including topic modeling, cross-impact analysis, association rule mining, and social network analysis, to suggest sustainable strategies and methodologies for MOT.
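    Of the data mining techniques listed in this abstract, association rule mining is the most mechanical: it scores how often keyword sets co-occur across patents. The following is a minimal sketch of the support and confidence measures on toy keyword sets; the patent keywords shown are illustrative placeholders, not the study's actual corpus.

```python
# Toy patent keyword sets (illustrative only, not the study's data)
patents = [
    {"actuator", "gait", "sensor"},
    {"gait", "sensor", "vision"},
    {"actuator", "gait"},
    {"vision", "speech"},
]

def support(itemset, transactions):
    """Fraction of transactions that contain every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent, transactions):
    """Conditional frequency of `consequent` among transactions with `antecedent`."""
    return support(antecedent | consequent, transactions) / support(antecedent, transactions)

# Rule {gait} -> {sensor}: how often do gait-related patents also mention sensors?
print(support({"gait"}, patents))                 # 0.75
print(confidence({"gait"}, {"sensor"}, patents))  # ~0.667
```

    Rules with high support and confidence are the candidates a study like this would feed into further trend or network analysis.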

  10. Molecular and Cell Mechanisms of Singlet Oxygen Effect on Biosystems

    OpenAIRE

    Martusevich А.А.; Peretyagin S.P.; Martusevich А.К.

    2012-01-01

    There has been considered a poorly studied form of activated oxygen — singlet oxygen. Its physicochemical properties (electron configuration of a molecule, reactive capacity, features) are analyzed, and enzymic and nonenzymic ways of singlet oxygen generation in body are specified. There are shown in detail biological effects of the compound as a regulator of cell activity including that determining the mechanism of apoptosis initiation. The relation of singlet oxygen and photodynamic effect ...

  11. Elucidation of functions of micro-organisms and animals in forest biosystem. Shinrin seitaikei ni okeru biseibutsu oyobi dobutsu no kino no kaimei

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-25

    This paper describes a report on elucidating the functions of micro-organisms and animals in a forest biosystem. Classification of forest micro-organisms and elucidation of their physiology, ecology, and roles in the biosystem: Characteristics of the tree root putrefaction bacteria, which cause withering of windbreaks on Ishigaki Island, Japan, were elucidated, and identifying the cultured hyphae has become possible. Chemicals effective for their control were discovered, which enable their extermination. Investigations on soil molds using artificial acid rain clarified that the exterminating agents display their effects when sprinkled repeatedly over an extended period, even in low concentrations. Classification of forest animals and elucidation of their physiology, ecology, and interactions among animals: A method was developed to photograph three-dimensionally the shapes of perforations made by earthworms using soft X-rays and analyze them using a computer, which is being used for investigation. The perforation pattern is complex, and the hole diameters are proportional to the sizes of the earthworms. Taxonomic studies on the Japanese lesser grain borers are close to completion. Damping-off of certain kinds of plants exhibited parasitism of a kind of grain borer and mycobionts without exception. An artificial burrow was devised for ecological investigation of field mice. 1 tab.

  12. Data integration, systems approach and multilevel description of complex biosystems

    International Nuclear Information System (INIS)

    Hernández-Lemus, Enrique

    2013-01-01

    Recent years have witnessed the development of new quantitative approaches and theoretical tenets in the biological sciences. The advent of high throughput experiments in genomics, proteomics and electrophysiology (to cite just a few examples) have provided the researchers with unprecedented amounts of data to be analyzed. Large datasets, however can not provide the means to achieve a complete understanding of the underlying biological phenomena, unless they are supplied with a solid theoretical framework and with proper analytical tools. It is now widely accepted that by using and extending some of the paradigmatic principles of what has been called complex systems theory, some degree of advance in this direction can be attained. We will be presenting ways in which by using data integration techniques (linear, non-linear, combinatorial, graphical), multidimensional-multilevel descriptions (multifractal modeling, dimensionality reduction, computational learning), as well as an approach based in systems theory (interaction maps, probabilistic graphical models, non-equilibrium physics) have allowed us to better understand some problems in the interface of Statistical Physics and Computational Biology

  13. Avoiding inconsistencies over time and tracking difficulties in Applied Biosystems AB1700™/Panther™ probe-to-gene annotations

    Directory of Open Access Journals (Sweden)

    Benecke Arndt

    2005-12-01

    Full Text Available Abstract Background Significant inconsistencies in probe-to-gene annotations between different releases of probe set identifiers by commercial microarray platform solutions have been reported. Such inconsistencies lead to misleading or ambiguous interpretation of published gene expression results. Results We report here similar inconsistencies in the probe-to-gene annotation of Applied Biosystems AB1700 data, demonstrating that this is not an isolated concern. Moreover, the online information source PANTHER does not provide the information required to track such inconsistencies; hence, even correctly annotated datasets, when resubmitted after PANTHER was updated to a new probe-to-gene annotation release, will generate differing results without any feedback on the origin of the change. Conclusion The importance of unequivocal annotation of microarray experiments cannot be overstated. Inconsistencies greatly diminish the usefulness of the technology. Novel methods in the analysis of transcriptome profiles often rely on large disparate datasets stemming from multiple sources. The predictive and analytic power of such approaches rapidly diminishes if only least-common subsets can be used for analysis. We present here the information that needs to be provided together with the raw AB1700 data, and the information required together with the biologic interpretation of such data, to avoid inconsistencies and tracking difficulties.

  14. Analyzing the causes of urban waterlogging and sponge city technology in China

    Science.gov (United States)

    Ning, Yun-Fang; Dong, Wen-Yi; Lin, Lu-Sheng; Zhang, Qian

    2017-03-01

    With the rapid development of the social economy in China, a growing urban population and rapid urbanization have caused serious problems; for example, heavy rain in a city inevitably leads to waterlogging, which poses a great threat to livelihoods and property security. Disaster due to urban flooding is a key problem that restricts the development of urban ecology in China. The reason is the sharp increase of the impermeable surface ratio in urban areas, leading to a decrease in rainfall infiltration and an increase in surface runoff. To effectively solve urban waterlogging, China proposed the construction of sponge cities. This paper analyzes and summarizes the reasons for the formation of urban waterlogging, and introduces the concept of sponge city technology to prevent waterlogging.

  15. Analyzing the effects of information technology on supply chain integration: The role of ERP success mediator

    Directory of Open Access Journals (Sweden)

    Samaneh Alimohamadian

    2014-04-01

    Full Text Available This research analyzes the effects of Information Technology (IT) on Supply Chain Integration (SCI) through an ERP mediator by proposing a conceptual model among these components. We hypothesize that three constructs of IT influence enterprise resource planning (ERP) success and that one construct of ERP success influences SCI. To clarify the relationships among the constructs, structural equation modeling (SEM) is conducted to examine the model fit and seven hypotheses. The data were collected from three Iranian firms through a questionnaire with 23 questions adopted from past research. The results confirmed that the top management support and employees' general IT skills factors of IT enhance ERP success, and that ERP success positively influences Supply Chain Integration; thus these two IT factors influence Supply Chain Integration through ERP success. Our data did not support a negative impact of satisfaction with the legacy IT system on ERP success.

  16. Baropodometric technology used to analyze types of weight-bearing during hemiparetic upright position

    Directory of Open Access Journals (Sweden)

    Lidiane Teles de Menezes

    Full Text Available INTRODUCTION: Although baropodometric analyses have been published since the 1990s, only now is there a considerable number of studies showing its different uses in rehabilitation. OBJECTIVE: To broaden the use of this technology, this research aimed to analyze baropodometric records during the upright position of subjects with hemiparesis, describing a way to define weight-bearing profiles in this population. METHOD: 20 healthy subjects were matched by gender and age with 12 subjects with chronic spastic hemiparesis. This control group was formed to establish the limits of symmetry of weight-bearing distribution for the hemiparesis group. Next, the hemiparesis group underwent procedures to obtain baropodometric records, which provided variables related to weight-bearing distribution, the arch index and displacements of the center of pressure (CoP). Data were used to compare differences among kinds of weight-bearing distribution (symmetric, asymmetric toward the non-paretic foot, or asymmetric toward the paretic foot) and the coordinate system for CoP displacements. RESULTS: The hemiparesis group comprised eight symmetric subjects, eight asymmetric toward the non-paretic foot and four asymmetric toward the paretic foot. Significant differences in weight-bearing distribution between the non-predominantly and predominantly used foot did not produce differences in the other baropodometric records (peak and mean pressure, and support area). Significant modifications of the baropodometric records were observed mainly in asymmetry toward the non-paretic foot. CONCLUSION: Baropodometric technology can be used to analyze weight-bearing distribution during the upright position of subjects with hemiparesis, detecting different kinds of weight-bearing profiles useful for therapeutic programs and research involving subjects with this disability.

  17. Production of biofuels and biochemicals by in vitro synthetic biosystems: Opportunities and challenges.

    Science.gov (United States)

    Zhang, Yi-Heng Percival

    2015-11-15

    The largest obstacle to the cost-competitive production of low-value and high-impact biofuels and biochemicals (called biocommodities) is the high production cost of microbial catalysis, due to microbes' inherent weaknesses, such as low product yield, slow reaction rate, high separation cost, intolerance to toxic products, and so on. This predominant whole-cell platform suffers from a mismatch between the primary goal of living microbes - cell proliferation - and the desired biomanufacturing goal - desired products (not cell mass, most times). In vitro synthetic biosystems consist of numerous enzymes as building bricks, enzyme complexes as building modules, and/or (biomimetic) coenzymes, which are assembled into synthetic enzymatic pathways for implementing complicated bioreactions. They emerge as an alternative solution for accomplishing a desired biotransformation without concerns of cell proliferation, complicated cellular regulation, and side-product formation. In addition to the most important advantage - high product yield - in vitro synthetic biosystems feature several other biomanufacturing advantages, such as fast reaction rate, easy product separation, open process control, broad reaction conditions, tolerance to toxic substrates or products, and so on. In this perspective review, the general design rules of in vitro synthetic pathways are presented with eight supporting examples: hydrogen, n-butanol, isobutanol, electricity, starch, lactate, 1,3-propanediol, and poly-3-hydroxybutyrate. Also, a detailed economic analysis for enzymatic hydrogen production from carbohydrates is presented to illustrate some advantages of this system and the remaining challenges. 
Great market potentials will motivate worldwide efforts from multiple disciplines (i.e., chemistry, biology and engineering) to address the remaining obstacles pertaining to cost and stability of enzymes and coenzymes, standardized building parts and modules, biomimetic coenzymes, biosystem optimization, and scale

  18. Changes of brain structure in Parkinson's disease patients with mild cognitive impairment analyzed via VBM technology.

    Science.gov (United States)

    Gao, Yuyuan; Nie, Kun; Huang, Biao; Mei, Mingjin; Guo, Manli; Xie, Sifen; Huang, Zhiheng; Wang, Limin; Zhao, Jiehao; Zhang, Yuhu; Wang, Lijuan

    2017-09-29

    To analyze changes in cerebral grey matter volume and white matter density in non-dementia Parkinson's disease patients using voxel-based morphometry (VBM) technology; to investigate features of brain structure changes in Parkinson's disease patients with mild cognitive impairment (PD-MCI), and reveal their intrinsic pathological changes. Based on the diagnostic criteria of PD-MCI, 23 PD-MCI patients, 23 Parkinson's disease patients with normal cognition (PD-NC), and 21 age- and gender-matched healthy people were recruited for the study. Scans were performed on all subjects on a 3.0T MR scanner to obtain brain structural magnetic resonance images. Images were preprocessed using the VBM8 tool from the SPM8 software package on the Matlab R2008a platform, and data were then analyzed using the SPM statistical software package to compare the differences in grey matter volume and white matter density between groups, and to evaluate the brain structural changes corresponding to overall cognitive function. Compared to the control group, the PD-NC group showed grey matter atrophy mainly in the prefrontal lobe, limbic lobe and left temporal gyrus. The PD-MCI group showed grey matter atrophy in the frontal lobe, limbic lobe, basal ganglia and cerebellum. Compared to the PD-NC group, the PD-MCI group showed grey matter atrophy in the left-side middle temporal gyrus, inferior temporal gyrus and frontal lobe. The grey matter regions correlated with MMSE score (mainly memory-related) included the right cingulate gyrus and the limbic lobe. The grey matter regions correlated with MoCA score (mainly non-memory-related) included the frontal lobe, basal ganglia, parahippocampal gyrus, occipital lobe and the cerebellum. Additionally, overall cognitive function in non-dementia PD was mainly located in the frontal and limbic system, and was dominated by subcortical atrophy. Structural changes in PD-MCI patients are associated with overall

  19. A rapid automatic analyzer and its methodology for effective bentonite content based on image recognition technology

    Directory of Open Access Journals (Sweden)

    Wei Long

    2016-09-01

    Full Text Available Fast and accurate determination of the effective bentonite content in used clay-bonded sand is very important for selecting the correct mixing ratio and mixing process to obtain high-performance molding sand. Currently, the effective bentonite content is determined by testing the methylene blue absorbed in used clay-bonded sand, which is usually a manual operation with several disadvantages, including a complicated process, long testing time and low accuracy. A rapid automatic analyzer of the effective bentonite content in used clay-bonded sand was developed based on image recognition technology. The instrument consists of auto-stirring, auto liquid removal, auto-titration, step-rotation and image acquisition components, and a processor. The principle of the image recognition method is first to decompose the color images into three-channel gray images, based on the difference in sensitivity to the light blue and dark blue regions across the red, green and blue channels; then to perform gray-value subtraction and gray-level transformation on the gray images; and finally to extract the outer-circle light blue halo and the inner-circle blue spot and calculate their area ratio. The titration process can be judged to have reached the end-point when the area ratio is higher than the set value.

  20. Analyzing the Degree of Technology Use Occurring in Pre-Service Teacher Education

    Science.gov (United States)

    Stobaugh, Rebecca Ruth; Tassell, Janet Lynne

    2011-01-01

    Technology is not in the forefront of teacher education program thinking and planning. Yet, it is the tool dramatically changing education. Internationally, the role of technology has evolved from the role of assisting the teacher in personal management to using technology for instruction. To respond to this need, universities are altering courses…

  1. Education of indoor environmental engineering technology

    Czech Academy of Sciences Publication Activity Database

    Kic, P.; Zajíček, Milan

    2011-01-01

    Roč. 9, Spec. 1 (2011), s. 83-90 ISSN 1406-894X. [Biosystems Engineering 2011. Tartu, 12.05.2011-13.05.2011] Institutional research plan: CEZ:AV0Z10750506 Keywords : Biosystems engineering * indoor environment * study * programs Subject RIV: AM - Education http://library.utia.cas.cz/separaty/2011/VS/zajicek-education of indoor enviromental engineering technology.pdf

  2. Applying Technology Management concepts in analyzing e Waste, sustainability and technology development in Mobile Industry: A conceptual perspective

    OpenAIRE

    Lasrado, Lester Allan; Agnihothri, Subodh; Lugmayr, Artur

    2013-01-01

    In the highly globalized, competitive and technocratic world, the mobile industry is heavily focused on making itself sustainable. To achieve this, the focus should be on improving e-waste management in the industry. Currently the industry is advanced beyond market demand in delivering services to customers in terms of ICT and smart phones. This research paper tries to conceptualize the aspect of technology management by comparing the technological advancement of mobile phone technology and the ...

  3. Analyzing interdependencies between policy mixes and technological innovation systems : The case of offshore wind in Germany

    NARCIS (Netherlands)

    Reichardt, Kristin; Negro, Simona O.; Rogge, Karoline S.; Hekkert, Marko P.

    2016-01-01

    One key approach for studying emerging technologies in the field of sustainability transitions is that of technological innovation systems (TIS). While most TIS studies aim at deriving policy recommendations - typically by identifying system barriers - the actual role of these proposed policies in

  4. Research activities in the first two cycles of European Biosystems engineering university studies - Situation in the Netherlands

    NARCIS (Netherlands)

    Hofstee, J.W.

    2009-01-01

    Wageningen University has implemented the bachelor – master model by 2003. The biosystems related programmes of Wageningen University are the BSc Agrotechnology and the MSc Agricultural and Bioresource Engineering. The bachelor programme has a size of 180 credits and the master programme a size of

  5. Transitioning Submersible Chemical Analyzer Technologies for Sustained, Autonomous Observations from Profiling Moorings, Gliders and other AUVs

    National Research Council Canada - National Science Library

    Hanson, Alfred K; Donaghay, Percy L; Moore, Casey; Arrieta, Richard

    2005-01-01

    The long term goal is to transition existing prototype autonomous profiling nutrient analyzers into commercial products that can be readily deployed on autonomous profiling moorings, coastal gliders...

  6. Transitioning Submersible Chemical Analyzer Technologies for Sustained, Autonomous Observations From Profiling Moorings, Gliders and other AUVs

    National Research Council Canada - National Science Library

    Hanson, Alfred K; Donaghay, Percy L; Moore, Casey; Arrieta, Richard

    2006-01-01

    The long term goal is to transition existing prototype autonomous profiling nutrient analyzers into commercial products that can be readily deployed on autonomous profiling moorings, coastal gliders...

  7. Third cycle university studies in Europe in the field of agricultural engineering and in the emerging discipline of biosystems engineering.

    Science.gov (United States)

    Ayuga, F; Briassoulis, D; Aguado, P; Farkas, I; Griepentrog, H; Lorencowicz, E

    2010-01-01

    The main objective of the European Thematic Network entitled 'Education and Research in Agricultural and Biosystems Engineering in Europe (ERABEE-TN)' is to initiate and contribute to the structural development and the assurance of the quality assessment of the emerging discipline of Biosystems Engineering in Europe. ERABEE is co-financed by the European Community in the framework of the LLP Programme. The partnership consists of 35 participants from 27 Erasmus countries, of which 33 are Higher Education Area Institutions (EDU) and 2 are Student Associations (ASS). 13 non-Erasmus participants (e.g. Thematic Networks, Professional Associations, and Institutions from Brazil, Croatia, Russia and Serbia) are also involved in the Thematic Network through synergies. To date, very few Biosystems Engineering programs exist in Europe, and those that have been initiated are at a very early stage of development. The innovative and novel goal of the Thematic Network is to promote this critical transition, which requires major restructuring in Europe, exploiting along this direction the outcomes accomplished by its predecessor, the USAEE-TN (University Studies in Agricultural Engineering in Europe). It also aims at enhancing the compatibility among the new programmes of Biosystems Engineering, aiding their recognition and accreditation at the European and international levels, and facilitating greater mobility of skilled personnel, researchers and students. One of the technical objectives of ERABEE is mapping and promoting third-cycle studies (including European PhDs) and supporting the integration of research into the 1st and 2nd cycles of European Biosystems Engineering university studies. 
During the winter 2008 - spring 2009 period, members of ERABEE conducted a survey on the contemporary status of doctoral studies in Europe, and on a possible scheme for promotion of cooperation and synergies in the framework of the third cycle of studies and the European Doctorate

  8. Technological change and salary variation in Mexican regions: Analyzing panel data for the service sector

    Directory of Open Access Journals (Sweden)

    Mario Camberos C.

    2013-07-01

    Full Text Available In this paper, the Biased Technological Change hypothesis is tested for Mexican service-sector workers across several Mexican regions, using Economic Census microdata for 1998, 2003 and 2008. The hypothesis is tested against technological gaps, considering different indices and checking the statistical consistency of the results through panel analysis. The largest wage differences were found between the Capital region and the South: about five hundred percent in 1998, narrowing to two hundred percent by 2008. This result corresponds with a diminishing technological gap, perhaps caused by the impact of the economic crisis.

  9. A thermodynamic perspective on technologies in the Anthropocene : analyzing environmental sustainability

    NARCIS (Netherlands)

    Liao, Wenjie

    2012-01-01

    Technologies and sustainable development are interrelated from a thermodynamic perspective, with industrial ecology (IE) as a major point of access for studying the relationship in the Anthropocene. To offer insights into the potential offered by thermodynamics in the environmental sustainability

  10. Innovative CO2 Analyzer Technology for the Eddy Covariance Flux Monitor, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to build and evaluate NDIR Analyzers that can observe eddy covariance flux of CO2 from unmanned airborne platforms. For both phases, a total of four...

  11. [Adoption of new technologies by health services: the challenge of analyzing relevant factors].

    Science.gov (United States)

    Trindade, Evelinda

    2008-05-01

    The exponential increase in the incorporation of health technologies has been considered a key factor in increased expenditures by the health sector. Such decisions involve multiple levels and stakeholders. Decentralization has multiplied the decision-making levels, with numerous difficult choices and limited resources. The interrelationship between stakeholders is complex, in creative systems with multiple determinants and confounders. The current review discusses the interaction between the factors influencing the decisions to incorporate technologies by health services, and proposes a structure for their analysis. The application and intensity of these factors in decision-making and the incorporation of products and programs by health services shapes the installed capacity of local and regional networks and modifies the health system. Empirical observation of decision-making and technology incorporation in Brazilian health services poses an important challenge. The structured recognition and measurement of these variables can assist proactive planning of health services.

  12. Multimodal methodologies for analyzing preschool children’s engagements with digital technologies

    DEFF Research Database (Denmark)

    Chimirri, Niklas Alexander

    and between reality and virtuality. In its stead, it suggests understanding technologies as mediating devices or artifacts, which offer the possibility to communicate with others for the sake of negotiating and actualizing imaginations together, for the sake of transforming pedagogical practice...... and problems of children can be accounted for throughout the technologically mediated collective development of a pedagogical practice – how it can be avoided that the adults’ perspectives take the sole lead throughout this collective engagement. What lies at the heart of these challenges is the question...

  13. Frames for Learning Science: Analyzing Learner Positioning in a Technology-Enhanced Science Project

    Science.gov (United States)

    Silseth, K.; Arnseth, H. C.

    2016-01-01

    In this article, we examine the relationship between how students are positioned in social encounters and how this influences learning in a technology-supported science project. We pursue this topic by focusing on the participation trajectory of one particular learner. The analysis shows that the student cannot be interpreted as one type of…

  14. Knowledge flows : analyzing the core literature of innovation, entrepreneurship and science and technology studies

    NARCIS (Netherlands)

    Bhupatiraju, S.; Nomaler, Z.O.; Triulzi, G.; Verspagen, B.

    2012-01-01

    This paper applies network analysis to a citation database that combines the key references in the fields of Entrepreneurship (ENT), Innovation Studies (INN) and Science and Technology Studies (STS). We find that citations between the three fields are relatively scarce, as compared to citations

  15. Analyzing the Discourse of Chais Conferences for the Study of Innovation and Learning Technologies via a Data-Driven Approach

    Directory of Open Access Journals (Sweden)

    Vered Silber-Varod

    2016-12-01

    Full Text Available The current rapid technological changes confront researchers of learning technologies with the challenge of evaluating them, predicting trends, and improving their adoption and diffusion. This study utilizes a data-driven discourse analysis approach, namely culturomics, to investigate changes over time in the research of learning technologies. The patterns and changes were examined on a corpus of articles published over the past decade (2006-2014) in the proceedings of the Chais Conference for the Study of Innovation and Learning Technologies - the leading research conference on learning technologies in Israel. The interesting finding of the exhaustive process of analyzing all the words in the corpus was that the most commonly used terms (e.g., pupil, teacher, student) and the most commonly used phrases (e.g., face-to-face) in the field of learning technologies reflect a pedagogical rather than a technological aspect of learning technologies. The study also demonstrates two cases of change over time in prominent themes, such as "Facebook" and "the National Information and Communication Technology (ICT) program". Methodologically, this research demonstrates the effectiveness of a data-driven approach for identifying discourse trends over time.
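    At its core, the culturomics approach described here tracks a term's relative frequency per publication year. The following is a minimal sketch of that operation; the (year, text) pairs are illustrative placeholders, not the actual Chais proceedings.

```python
# Toy (year, abstract) pairs standing in for the conference corpus
# (illustrative text only, not the actual articles).
corpus = [
    (2006, "teacher and pupil use face-to-face learning"),
    (2010, "facebook groups support teacher learning"),
    (2014, "facebook and the national ict program for pupils"),
]

def term_trend(corpus, term):
    """Relative frequency of `term` in each year's text - the core
    operation behind a culturomics-style discourse trend analysis."""
    trend = {}
    for year, text in corpus:
        tokens = text.lower().split()
        trend[year] = tokens.count(term.lower()) / len(tokens)
    return trend

print(term_trend(corpus, "facebook"))  # rises from 0.0 in 2006
```

    Plotting such per-year frequencies for every candidate term is what surfaces emerging themes like "Facebook" against the stable pedagogical vocabulary.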

  16. Fusion of Nuclear and Emerging Technology

    International Nuclear Information System (INIS)

    Nahrul Khaer Alang Rashid

    2005-04-01

    The presentation discussed the following subjects: emerging technology; nuclear technology; fusion of emerging and nuclear technology; the progressive nature of knowledge; optically stimulated luminescence - application of luminescence technology to sediments; and biosystemics technology - the convergence of nanotechnology, ecological science, biotechnology, cognitive science and IT - with its prospective impact on materials science, the management of public systems for bio-health, eco- and food-system integrity, and disease mitigation

  17. Advancements in portable x-ray analyzer technology for real-time contaminant profiling in soil

    International Nuclear Information System (INIS)

    Piorek, S.

    1992-01-01

    During the last 4 years since the first published application of portable X-ray analyzers for on-site chemical characterization of contaminated soil, field-portable X-ray fluorescence (FPXRF) has established itself as the most promising technique for a broad range of environmental applications

  18. Analyzing the Curricula of Doctor of Philosophy in Educational Technology-Related Programs in the United States

    Science.gov (United States)

    Almaden, Abdullah; Ku, Heng-Yu

    2017-01-01

    The purpose of this study was to analyze on-campus and online PhD programs in educational technology-related fields in the United States. In particular, it sought to evaluate the most common program titles; core, elective, and research courses based on program curricula. The research design was quantitative content analysis and data were collected…

  19. The Fishbone diagram to identify, systematize and analyze the sources of general purpose technologies

    OpenAIRE

    COCCIA, Mario

    2017-01-01

    Abstract. This study suggests the fishbone diagram for technological analysis. The fishbone diagram (also called an Ishikawa diagram or cause-and-effect diagram) is a graphical technique to show the several causes of a specific event or phenomenon. In particular, a fishbone diagram (whose shape is similar to a fish skeleton) is a common tool used in cause-and-effect analysis to identify a complex interplay of causes for a specific problem or event. The fishbone diagram can be a comprehensive theo...

  20. E-health beyond technology: analyzing the paradigm shift that lies beneath.

    Science.gov (United States)

    Moerenhout, Tania; Devisch, Ignaas; Cornelis, Gustaaf C

    2018-03-01

    Information and computer technology has come to play an increasingly important role in medicine, to the extent that e-health has been described as a disruptive innovation or revolution in healthcare. The attention is very much focused on the technology itself, and advances that have been made in genetics and biology. This leads to the question: What is changing in medicine today concerning e-health? To what degree could these changes be characterized as a 'revolution'? We will apply the work of Thomas Kuhn, Larry Laudan, Michel Foucault and other philosophers-which offers an alternative understanding of progress and revolution in medicine to the classic discovery-oriented approach-to our analysis. Nowadays, the long-standing curative or reactive paradigm in medicine is facing a crisis due to an aging population, a significant increase in chronic diseases and the development of more expensive diagnostic tools and therapies. This promotes the evolution towards a new paradigm with an emphasis on preventive medicine. E-health constitutes an essential part of this new paradigm that seeks to solve the challenges presented by an aging population, skyrocketing costs and so forth. Our approach changes the focus from the technology itself toward the underlying paradigm shift in medicine. We will discuss the relevance of this approach by applying it to the surge in digital self-tracking through health apps and wearables: the recognition of the underlying paradigm shift leads to a more comprehensive understanding of self-tracking than a solely discovery-oriented or technology-focused view can provide.

  1. Analyzing nitrogen concentration using carrier illumination (CI) technology for DPN ultra-thin gate oxide

    International Nuclear Information System (INIS)

    Li, W.S.; Wu, Bill; Fan, Aki; Kuo, C.W.; Segovia, M.; Kek, H.A.

    2005-01-01

    Nitrogen concentration in the gate oxide plays a key role for 90 nm and below ULSI technology. Techniques like secondary ionization mass spectroscopy (SIMS) and X-ray photoelectron spectroscopy (XPS) are commonly used to determine the N concentration. This paper describes the application of the carrier illumination™ (CI) technique to measure the nitrogen concentration in ultra-thin gate oxides. A set of ultra-thin gate oxide wafers with different DPN (decoupled plasma nitridation) treatment conditions was measured using the CI technique. The CI signal has excellent correlation with the N concentration as measured by XPS.
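
    The reported "excellent correlation" between the CI signal and XPS nitrogen concentration can be quantified with a Pearson coefficient. The sketch below uses hypothetical paired measurements, since the paper's data are not reproduced here:

```python
import numpy as np

# Hypothetical paired measurements: CI signal (a.u.) vs. XPS nitrogen content (at.%)
ci_signal = np.array([0.12, 0.18, 0.25, 0.31, 0.40, 0.47])
xps_nitrogen = np.array([2.1, 3.0, 4.2, 5.1, 6.6, 7.8])

# Pearson correlation coefficient between the two techniques
r = np.corrcoef(ci_signal, xps_nitrogen)[0, 1]
```

    For a strongly linear relationship such as the one reported, r approaches 1.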

  2. Characterization of binding and mobility of metals and xenobiotics in continuous flow and soil biosystems

    International Nuclear Information System (INIS)

    Sunovska, A.

    2016-01-01

    The main aim of the dissertation thesis was to contribute to the development of analytical tools and approaches for characterizing the binding and mobility of heavy metals and organic compounds (xenobiotics) in continuous flow and soil biosystems. To this end, a wide range of analytical methods (gamma-spectrometry, UV-VIS spectrophotometry, AAS, X-ray fluorescence spectrometry, ion chromatography, and stripping voltammetry) and approaches (mathematical modelling - nonlinear regression and in silico prediction modelling; chemometrics and statistical analysis of the data; single-step extraction methods; and lysimetry) were applied. In the first part of the thesis, alternative sorbents of biological origin (biomass of microalgae, freshwater mosses, and waste biomass of hop) were obtained and physico-chemically characterized, mainly to predict sorption capacities for the removal of Cd and the synthetic dyes thioflavine T (TT), malachite green (MG) and methylene blue (MB) from single-component or binary aqueous solutions under batch or continuous flow conditions. For these purposes, mathematical models of adsorption isotherms and models originating from chromatographic separation methods were fitted by nonlinear regression analysis. In the second part of the work, multivariate analysis was applied to the binding of the synthetic dyes TT and MB, in order to find relationships between the sorption-desorption variables describing bond stability and the parameters defining the physico-chemical properties of river sediments and of real or model waters. 
In the last part of the work, a special laboratory lysimeter system was designed and applied to a soil biosystem defined by: soil additive (SA) derived from sewage sludge, representing the source of the microelements Zn and Cu <-> agriculturally used soil <-> soil solution <-> root
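
    As an illustration of the isotherm-fitting step, the sketch below fits the standard Langmuir model q = q_max·b·C/(1 + b·C) to synthetic equilibrium data by nonlinear regression; the thesis's actual models, units and data are not reproduced, so all numbers here are assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, q_max, b):
    """Langmuir adsorption isotherm: q = q_max * b * C / (1 + b * C)."""
    return q_max * b * c / (1.0 + b * c)

# Synthetic equilibrium data generated from q_max = 2.0 mmol/g, b = 0.5 L/mmol
c_eq = np.linspace(0.1, 20.0, 15)      # equilibrium concentration, mmol/L
q_obs = langmuir(c_eq, 2.0, 0.5)
q_obs = q_obs + 0.005 * np.sin(c_eq)   # small deterministic perturbation

# Nonlinear least-squares fit recovers the isotherm parameters
popt, pcov = curve_fit(langmuir, c_eq, q_obs, p0=[1.0, 1.0])
q_max_fit, b_fit = popt
```

    The same pattern (a model function plus `curve_fit`) extends to Freundlich or multi-component isotherm models.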

  4. Temperature variation in metal ceramic technology analyzed using time domain optical coherence tomography

    Science.gov (United States)

    Sinescu, Cosmin; Topala, Florin I.; Negrutiu, Meda Lavinia; Duma, Virgil-Florin; Podoleanu, Adrian G.

    2014-01-01

    The quality of dental prostheses is essential in providing good quality medical services. The metal ceramic technology applied in dentistry involves ceramic sintering inside a dental oven. Every ceramic material requires a specific sintering chart recommended by the producer. For a regular dental technician it is very difficult to evaluate whether the temperature inside the oven remains the same as programmed on the sintering chart. Maintaining the calibration over time is also an issue for practitioners. Metal ceramic crowns develop a very characteristic pattern in the ceramic layers depending on the temperature variation inside the oven where they are processed. Different patterns were identified in the present study for samples processed with a temperature variation of +30 °C to +50 °C and −30 °C to −50 °C, respectively. The OCT evaluations performed on the normal samples show a uniform spread of the ceramic granulation inside the ceramic material. For the samples sintered at a higher temperature, an alternation of brighter and darker areas appears between the enamel and opaque layers. For the samples sintered at a lower temperature, a decrease in the ceramic granulation from the enamel towards the opaque layer was observed. TD-OCT methods can therefore be used efficiently to detect temperature variations during ceramic sintering inside the dental oven.

  5. Proceedings of the 8. International Symposium on Microbial Ecology : microbial biosystems : new frontiers

    International Nuclear Information System (INIS)

    Bell, C.R.; Brylinsky, M.; Johnson-Green, P.

    2000-01-01

    A wide range of disciplines was represented at this conference, which reflected the importance of microbial ecology and provided an understanding of the factors that determine the growth and activities of microorganisms. The conference attracted 1444 delegates from 54 countries. The research emerging from the rapidly expanding frontier of microbial ecosystems was presented in 62 oral presentations and 817 poster presentations. The two volumes of these proceedings covered a total of 27 areas in microbial ecology, including terrestrial, aquatic, estuarine, surface and subsurface microbial ecology. Other topics included bioremediation, microbial ecology in industry and the microbial ecology of oil fields. Some of the papers highlighted the research underway to determine the feasibility of using microorganisms for enhanced oil recovery (EOR). Research has shown that microbial EOR can increase production at lower costs than conventional oil recovery. The use of bacteria has also proven to be a feasible treatment method for the biodegradation of hydrocarbons associated with oil spills. refs., tabs., figs.

  6. Electro-Quasistatic Simulations in Bio-Systems Engineering and Medical Engineering

    Directory of Open Access Journals (Sweden)

    U. van Rienen

    2005-01-01

    Slowly varying electromagnetic fields play a key role in various applications in bio-systems and medical engineering. Examples are the electric activity of neurons on neurochips used as biosensors, the stimulating electric fields of implanted electrodes used for deep brain stimulation in patients with Morbus Parkinson and the stimulation of the auditory nerves in deaf patients, respectively. In order to simulate the neuronal activity on a chip it is necessary to couple Maxwell's and Hodgkin-Huxley's equations. First numerical results for a neuron coupling to a single electrode are presented. They show a promising qualitative agreement with the experimentally recorded signals. Further, simulations are presented on electrodes for deep brain stimulation in animal experiments where the question of electrode ageing and energy deposition in the surrounding tissue are of major interest. As a last example, electric simulations for a simple cochlea model are presented comparing the field in the skull bones for different electrode types and stimulations in different positions.
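
    The coupled Maxwell/Hodgkin-Huxley problem is beyond a short sketch, but the membrane side can be illustrated with a standard single-compartment Hodgkin-Huxley model integrated by forward Euler. This is a textbook sketch with the classic squid-axon parameters, not the authors' coupled solver:

```python
import math

# Classic Hodgkin-Huxley parameters (squid axon; mV, ms, uA/cm^2, mS/cm^2)
C_m, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.387

def a_m(v): return 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
def b_m(v): return 4.0 * math.exp(-(v + 65.0) / 18.0)
def a_h(v): return 0.07 * math.exp(-(v + 65.0) / 20.0)
def b_h(v): return 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
def a_n(v): return 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
def b_n(v): return 0.125 * math.exp(-(v + 65.0) / 80.0)

def simulate(i_inj=10.0, t_end=50.0, dt=0.01):
    """Forward-Euler integration; returns the membrane-voltage trace (mV)."""
    v = -65.0
    m = a_m(v) / (a_m(v) + b_m(v))   # start the gates at steady state
    h = a_h(v) / (a_h(v) + b_h(v))
    n = a_n(v) / (a_n(v) + b_n(v))
    trace = []
    for _ in range(int(t_end / dt)):
        i_ion = (g_Na * m**3 * h * (v - E_Na)
                 + g_K * n**4 * (v - E_K)
                 + g_L * (v - E_L))
        v += dt * (i_inj - i_ion) / C_m
        m += dt * (a_m(v) * (1.0 - m) - b_m(v) * m)
        h += dt * (a_h(v) * (1.0 - h) - b_h(v) * h)
        n += dt * (a_n(v) * (1.0 - n) - b_n(v) * n)
        trace.append(v)
    return trace

v_trace = simulate()
```

    With a 10 uA/cm² current injection the model fires repetitively, which is the kind of membrane activity a neurochip electrode would pick up.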

  7. Assessing the Validity of Using Serious Game Technology to Analyze Physician Decision Making

    Science.gov (United States)

    Mohan, Deepika; Angus, Derek C.; Ricketts, Daniel; Farris, Coreen; Fischhoff, Baruch; Rosengart, Matthew R.; Yealy, Donald M.; Barnato, Amber E.

    2014-01-01

    Background Physician non-compliance with clinical practice guidelines remains a critical barrier to high quality care. Serious games (using gaming technology for serious purposes) have emerged as a method of studying physician decision making. However, little is known about their validity. Methods We created a serious game and evaluated its construct validity. We used the decision context of trauma triage in the Emergency Department of non-trauma centers, given widely accepted guidelines that recommend the transfer of severely injured patients to trauma centers. We designed cases with the premise that the representativeness heuristic influences triage (i.e. physicians make transfer decisions based on archetypes of severely injured patients rather than guidelines). We randomized a convenience sample of emergency medicine physicians to a control or cognitive load arm, and compared performance (disposition decisions, number of orders entered, time spent per case). We hypothesized that cognitive load would increase the use of heuristics, increasing the transfer of representative cases and decreasing the transfer of non-representative cases. Findings We recruited 209 physicians, of whom 168 (79%) began and 142 (68%) completed the task. Physicians transferred 31% of severely injured patients during the game, consistent with rates of transfer for severely injured patients in practice. They entered the same average number of orders in both arms (control (C): 10.9 [SD 4.8] vs. cognitive load (CL): 10.7 [SD 5.6], p = 0.74), despite spending less time per case in the control arm (C: 9.7 [SD 7.1] vs. CL: 11.7 [SD 6.7] minutes, p<0.01). Physicians were equally likely to transfer representative cases in the two arms (C: 45% vs. CL: 34%, p = 0.20), but were more likely to transfer non-representative cases in the control arm (C: 38% vs. CL: 26%, p = 0.03). We found that physicians made decisions consistent with actual practice, that we could manipulate cognitive load, and that load increased the use of heuristics, as predicted by cognitive theory. PMID:25153149

  8. Multi-sensor technologies for analyzing sinkholes in Hamedan, west Iran

    Science.gov (United States)

    Vajedian, Sanaz; Motagh, Mahdi; Hojati, Ahmad; Wetzel, Hans-Ulrich

    2017-04-01

    Dissolution of carbonate beds such as limestone, dolomite or gypsum by acidic groundwater flowing through fractures and joints in the bedrock alters the land surface and promotes the development of sinkholes. Sinkhole formation causes the surface to subside or even collapse suddenly without any prior warning, leading to extensive damage and sometimes loss of life and property, particularly in urban areas. Delineating sinkholes is critical for understanding hydrological processes and mitigating geological hazards in karst areas. The recent availability of high-resolution digital elevation models (DEMs) from the TanDEM-X (TDX) mission enables us to delineate and analyze geomorphologic features and landscape structures at an unprecedented level of detail, in comparison to previous missions such as the C-band and X-band Shuttle Radar Topography Mission (SRTM). In this study, we develop an adaptive sinkhole-delineating method based on photogrammetry techniques to detect karst sinkholes in Hamedan, west Iran, using TDX-derived DEMs. We apply automatic feature extraction using a watershed algorithm in order to detect depression areas. We show that using high-resolution TDX data from different geometries and time periods we can effectively distinguish sinkholes from other depression features of the basin. We also use the interferometric synthetic aperture radar (InSAR) technique with SAR data acquired from a variety of sensors including Envisat, ALOS, TerraSAR-X and Sentinel-1 to quantify long-term subsidence in areas prone to sinkhole formation. Our results indicate that the formation of many sinkholes is influenced by land subsidence affecting the region over 100 km, with a maximum rate of 4-5 cm/yr from 2003 to 2016.
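
    The depression-extraction step can be sketched with a simple sink-filling pass over the DEM. The pure-NumPy relaxation below is a minimal stand-in for the watershed-based workflow described above: cells whose filled elevation exceeds the original DEM by more than a threshold are flagged as depressions.

```python
import numpy as np

def fill_depressions(dem):
    """Fill closed depressions in a DEM by iterative max/min relaxation
    (a simple Planchon-Darboux-style stand-in for priority-flood filling)."""
    filled = np.full(dem.shape, np.inf)
    filled[0, :], filled[-1, :] = dem[0, :], dem[-1, :]   # borders drain freely
    filled[:, 0], filled[:, -1] = dem[:, 0], dem[:, -1]
    while True:
        padded = np.pad(filled, 1, constant_values=np.inf)
        # minimum over the 4-neighbourhood of the current fill surface
        nmin = np.minimum.reduce([padded[:-2, 1:-1], padded[2:, 1:-1],
                                  padded[1:-1, :-2], padded[1:-1, 2:]])
        new = np.maximum(dem, np.minimum(filled, nmin))
        new[0, :], new[-1, :] = dem[0, :], dem[-1, :]
        new[:, 0], new[:, -1] = dem[:, 0], dem[:, -1]
        if np.array_equal(new, filled):
            return filled
        filled = new

# Synthetic 7x7 DEM: a flat plateau with one closed pit in the middle
dem = np.ones((7, 7))
dem[3, 3] = -1.0
depth = fill_depressions(dem) - dem        # fill depth marks depressions
sink_mask = depth > 0.5                    # threshold out numerical noise
```

    On real TDX tiles, the flagged cells would then be filtered by shape and persistence across acquisition dates to separate sinkholes from other basin depressions.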

  9. Assessing the validity of using serious game technology to analyze physician decision making.

    Science.gov (United States)

    Mohan, Deepika; Angus, Derek C; Ricketts, Daniel; Farris, Coreen; Fischhoff, Baruch; Rosengart, Matthew R; Yealy, Donald M; Barnato, Amber E

    2014-01-01

    Physician non-compliance with clinical practice guidelines remains a critical barrier to high quality care. Serious games (using gaming technology for serious purposes) have emerged as a method of studying physician decision making. However, little is known about their validity. We created a serious game and evaluated its construct validity. We used the decision context of trauma triage in the Emergency Department of non-trauma centers, given widely accepted guidelines that recommend the transfer of severely injured patients to trauma centers. We designed cases with the premise that the representativeness heuristic influences triage (i.e. physicians make transfer decisions based on archetypes of severely injured patients rather than guidelines). We randomized a convenience sample of emergency medicine physicians to a control or cognitive load arm, and compared performance (disposition decisions, number of orders entered, time spent per case). We hypothesized that cognitive load would increase the use of heuristics, increasing the transfer of representative cases and decreasing the transfer of non-representative cases. We recruited 209 physicians, of whom 168 (79%) began and 142 (68%) completed the task. Physicians transferred 31% of severely injured patients during the game, consistent with rates of transfer for severely injured patients in practice. They entered the same average number of orders in both arms (control (C): 10.9 [SD 4.8] vs. cognitive load (CL): 10.7 [SD 5.6], p = 0.74), despite spending less time per case in the control arm (C: 9.7 [SD 7.1] vs. CL: 11.7 [SD 6.7] minutes, p<0.01). Physicians were equally likely to transfer representative cases in the two arms (C: 45% vs. CL: 34%, p = 0.20), but were more likely to transfer non-representative cases in the control arm (C: 38% vs. CL: 26%, p = 0.03). We found that physicians made decisions consistent with actual practice, that we could manipulate cognitive load, and that load increased the use of heuristics, as predicted by cognitive theory.

  10. Assessing the validity of using serious game technology to analyze physician decision making.

    Directory of Open Access Journals (Sweden)

    Deepika Mohan

    Physician non-compliance with clinical practice guidelines remains a critical barrier to high quality care. Serious games (using gaming technology for serious purposes) have emerged as a method of studying physician decision making. However, little is known about their validity. We created a serious game and evaluated its construct validity. We used the decision context of trauma triage in the Emergency Department of non-trauma centers, given widely accepted guidelines that recommend the transfer of severely injured patients to trauma centers. We designed cases with the premise that the representativeness heuristic influences triage (i.e. physicians make transfer decisions based on archetypes of severely injured patients rather than guidelines). We randomized a convenience sample of emergency medicine physicians to a control or cognitive load arm, and compared performance (disposition decisions, number of orders entered, time spent per case). We hypothesized that cognitive load would increase the use of heuristics, increasing the transfer of representative cases and decreasing the transfer of non-representative cases. We recruited 209 physicians, of whom 168 (79%) began and 142 (68%) completed the task. Physicians transferred 31% of severely injured patients during the game, consistent with rates of transfer for severely injured patients in practice. They entered the same average number of orders in both arms (control (C): 10.9 [SD 4.8] vs. cognitive load (CL): 10.7 [SD 5.6], p = 0.74), despite spending less time per case in the control arm (C: 9.7 [SD 7.1] vs. CL: 11.7 [SD 6.7] minutes, p<0.01). Physicians were equally likely to transfer representative cases in the two arms (C: 45% vs. CL: 34%, p = 0.20), but were more likely to transfer non-representative cases in the control arm (C: 38% vs. CL: 26%, p = 0.03). We found that physicians made decisions consistent with actual practice, that we could manipulate cognitive load, and that load increased the use of heuristics, as predicted by cognitive theory.

  11. Influence of capillary barrier effect on biogas distribution at the base of passive methane oxidation biosystems: Parametric study.

    Science.gov (United States)

    Ahoughalandari, Bahar; Cabral, Alexandre R

    2017-05-01

    The efficiency of methane oxidation in passive methane oxidation biosystems (PMOBs) is influenced by, among other things, the intensity and distribution of the CH4 loading at the base of the methane oxidation layer (MOL). Both the intensity and distribution are affected by the capillary barrier that results from the superposition of the two materials constituting the PMOB, namely the MOL and the gas distribution layer (GDL). The effect of capillary barriers on the unsaturated flow of water has been well documented in the literature. However, its effect on gas flow through PMOBs is still poorly documented. In this study, sets of numerical simulations were performed to evaluate the effect of unsaturated hydraulic characteristics of the MOL material on the value and distribution of moisture and hence, the ease and uniformity in the distribution of the upward flow of biogas along the GDL-MOL interface. The unsaturated hydraulic parameters of the materials used to construct the experimental field plot at the St-Nicephore landfill (Quebec, Canada) were adopted to build the reference simulation of the parametric study. The behavior of the upward flow of biogas for this particular material was analyzed based on its gas intrinsic permeability function, which was obtained in the laboratory. The parameters that most influenced the distribution and the ease of biogas flow at the base of the MOL were the saturated hydraulic conductivity and pore size distribution of the MOL material, whose effects were intensified as the slope of the interface increased. The effect of initial dry density was also assessed herein. Selection of the MOL material must be made bearing in mind that these three parameters are key in the effort to prevent unwanted restriction in the upward flow of biogas, which may result in the redirection of biogas towards the top of the slope, leading to high CH4 fluxes (hotspots). In a well-designed PMOB, upward flow of biogas across the GDL-MOL interface is
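
    The unsaturated hydraulic characteristics discussed above are commonly parameterized with the van Genuchten-Mualem model; the sketch below evaluates effective saturation and relative hydraulic conductivity for assumed parameter values, not the paper's calibrated ones:

```python
import math

def van_genuchten(psi, alpha=2.0, n=1.8):
    """Effective saturation Se and Mualem relative conductivity Kr for
    suction head psi (m); alpha (1/m) and n (-) are van Genuchten parameters."""
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * psi) ** n) ** (-m) if psi > 0 else 1.0
    kr = math.sqrt(se) * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2
    return se, kr

se_wet, kr_wet = van_genuchten(0.1)   # low suction: near saturation
se_dry, kr_dry = van_genuchten(5.0)   # high suction: much drier, far less conductive
```

    The sharp drop of Kr with suction is what produces the capillary-barrier contrast at the GDL-MOL interface.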

  12. Analyzing the effect of customer loyalty on viral marketing adoption based on the theory of technology acceptance model

    Directory of Open Access Journals (Sweden)

    Peyman Ghafari Ashtiani

    2016-08-01

    One of the main advantages of the internet and its expansion is probably its easy, low-cost access to unlimited information and its easy and fast information exchange. The accession of communication technology to the marketing area and the emergence of the Internet have led to the creation and development of new marketing models such as viral marketing. In fact, unlike other marketing methods, the most powerful tool for selling products and ideas is not a marketer addressing a customer, but one customer addressing another. The purpose of this research is to analyze the relationship between customers' loyalty and the acceptance of viral marketing based on the technology acceptance model (TAM) among the civil engineers and architects who are members of the Engineering Council in Isfahan (ECI). The research method is descriptive-survey and applied in purpose. The statistical population includes the 14,400 civil engineers and architects who are members of the Engineering Council in Isfahan. The sample size was determined as 762 members based on the Cochran sampling formula, and the sample was selected by convenience sampling. The data were collected by the field method and extracted from the questionnaires, then analyzed with SPSS and LISREL software. According to the results, the loyalty of the civil engineer and architect members of ECI was associated with the acceptance and practical involvement of viral marketing.
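
    The Cochran sample-size calculation referenced above can be sketched as follows. The confidence level and margin of error are our assumptions, since the abstract does not state them; with the common defaults below the formula yields about 375 for N = 14,400, so the reported n = 762 implies a tighter margin of error:

```python
import math

def cochran_n(population, z=1.96, p=0.5, e=0.05):
    """Cochran sample size with finite-population correction.
    z: normal critical value, p: expected proportion, e: margin of error."""
    n0 = z ** 2 * p * (1.0 - p) / e ** 2            # infinite-population size
    return math.ceil(n0 / (1.0 + (n0 - 1.0) / population))

n_default = cochran_n(14400)    # about 375 under the default assumptions
```

    Tightening the margin of error (smaller e) drives the required sample size up toward values like the study's 762.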

  13. Analyzing the efficiency of small and medium-sized enterprises of a national technology innovation research and development program

    OpenAIRE

    Park, Sungmin

    2014-01-01

    This study analyzes the efficiency of small and medium-sized enterprises (SMEs) of a national technology innovation research and development (R&D) program. In particular, an empirical analysis is presented that aims to answer the following question: “Is there a difference in the efficiency between R&D collaboration types and between government R&D subsidy sizes?” Methodologically, the efficiency of a government-sponsored R&D project (i.e., GSP) is measured by Data Envelopment Analysis (DEA), ...

  14. Validation of the Applied Biosystems RapidFinder Shiga Toxin-Producing E. coli (STEC) Detection Workflow.

    Science.gov (United States)

    Cloke, Jonathan; Matheny, Sharon; Swimley, Michelle; Tebbs, Robert; Burrell, Angelia; Flannery, Jonathan; Bastin, Benjamin; Bird, Patrick; Benzinger, M Joseph; Crowley, Erin; Agin, James; Goins, David; Salfinger, Yvonne; Brodsky, Michael; Fernandez, Maria Cristina

    2016-11-01

    The Applied Biosystems™ RapidFinder™ STEC Detection Workflow (Thermo Fisher Scientific) is a complete protocol for the rapid qualitative detection of Escherichia coli (E. coli) O157:H7 and the "Big 6" non-O157 Shiga-like toxin-producing E. coli (STEC) serotypes (defined as serogroups: O26, O45, O103, O111, O121, and O145). The RapidFinder STEC Detection Workflow makes use of either the automated preparation of PCR-ready DNA using the Applied Biosystems PrepSEQ™ Nucleic Acid Extraction Kit in conjunction with the Applied Biosystems MagMAX™ Express 96-well magnetic particle processor or the Applied Biosystems PrepSEQ Rapid Spin kit for manual preparation of PCR-ready DNA. Two separate assays comprise the RapidFinder STEC Detection Workflow, the Applied Biosystems RapidFinder STEC Screening Assay and the Applied Biosystems RapidFinder STEC Confirmation Assay. The RapidFinder STEC Screening Assay includes primers and probes to detect the presence of stx1 (Shiga toxin 1), stx2 (Shiga toxin 2), eae (intimin), and E. coli O157 gene targets. The RapidFinder STEC Confirmation Assay includes primers and probes for the "Big 6" non-O157 STEC and E. coli O157:H7. The use of these two assays in tandem allows a user to detect accurately the presence of the "Big 6" STECs and E. coli O157:H7. The performance of the RapidFinder STEC Detection Workflow was evaluated in a method comparison study, in inclusivity and exclusivity studies, and in a robustness evaluation. The assays were compared to the U.S. Department of Agriculture (USDA), Food Safety and Inspection Service (FSIS) Microbiology Laboratory Guidebook (MLG) 5.09: Detection, Isolation and Identification of Escherichia coli O157:H7 from Meat Products and Carcass and Environmental Sponges for raw ground beef (73% lean) and USDA/FSIS-MLG 5B.05: Detection, Isolation and Identification of Escherichia coli non-O157:H7 from Meat Products and Carcass and Environmental Sponges for raw beef trim. No statistically significant
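
    The tandem screening-then-confirmation logic can be sketched as a toy routing rule over the four screening targets. This is our illustrative simplification, not the product's actual result-calling algorithm:

```python
def triage_screening(stx1, stx2, eae, o157):
    """Toy routing rule for a screening result (one boolean per PCR target).
    Purely illustrative - the real workflow's interpretation rules differ."""
    if not (stx1 or stx2):
        return "negative: no Shiga-toxin genes detected"
    if o157:
        return "presumptive E. coli O157 STEC: run confirmation assay"
    if eae:
        return "presumptive non-O157 STEC: run confirmation assay"
    return "stx-positive, eae-negative: atypical profile, review"

result = triage_screening(stx1=True, stx2=False, eae=True, o157=False)
```

    Only samples flagged presumptive-positive by the screening assay would proceed to the confirmation assay's serogroup-specific targets.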

  15. Analyzing Department of Defense's Use of Other Transactions as a Method for Accessing Non-Traditional Technology

    National Research Council Canada - National Science Library

    Gilliland, John

    2001-01-01

    ... technological superiority. To attract advanced technology companies that normally do not participate in defense business to the defense market, Congress provided a new contracting authority, Section 845...

  16. Engaging Middle School Students with Google Earth Technology to Analyze Ocean Cores as Evidence for Sea Floor Spreading

    Science.gov (United States)

    Prouhet, T.; Cook, J.

    2006-12-01

    Google Earth's ability to captivate students' attention, its ease of use, and its high quality images give it the potential to be an extremely effective tool for earth science educators. The unique properties of Google Earth satisfy a growing demand to incorporate technology in science instruction. Google Earth is free and, unlike some other visualization software, relatively easy to use. Students often have difficulty conceptualizing and visualizing earth systems, such as deep-ocean basins, because of the complexity and dynamic nature of the processes associated with them (e.g. plate tectonics). Google Earth's combination of aerial photography, satellite images and remote sensing data brings a sense of realism to science concepts. The unobstructed view of the ocean floor provided by this technology illustrates three-dimensional subsurface features such as rift valleys, subduction zones, and sea-mounts, enabling students to better understand the seafloor's dynamic nature. Students will use Google Earth to navigate the sea floor and examine Deep Sea Drilling Project (DSDP) core locations from the Glomar Challenger Leg 3 expedition. The lesson implemented was derived from, and expands upon, the Joint Oceanographic Institutions (JOI) Learning exercise, Nannofossils Reveal Seafloor Spreading. In addition, students take on the role of scientists as they graph and analyze paleontological data against the distance from the Mid-Ocean Ridge. The integration of ocean core data in this three-dimensional view aids students' ability to draw and communicate valid conclusions from their scientific observations. A pre and post survey will be given to a sample of approximately 300 eighth grade science students to examine attitudes, self-efficacy, achievement and content mastery. The hypothesis is that the integration of Google Earth will significantly improve all areas of focus mentioned above.

  17. Analyzing the efficiency of small and medium-sized enterprises of a national technology innovation research and development program.

    Science.gov (United States)

    Park, Sungmin

    2014-01-01

    This study analyzes the efficiency of small and medium-sized enterprises (SMEs) of a national technology innovation research and development (R&D) program. In particular, an empirical analysis is presented that aims to answer the following question: "Is there a difference in the efficiency between R&D collaboration types and between government R&D subsidy sizes?" Methodologically, the efficiency of a government-sponsored R&D project (i.e., GSP) is measured by Data Envelopment Analysis (DEA), and a nonparametric analysis of variance method, the Kruskal-Wallis (KW) test is adopted to see if the efficiency differences between R&D collaboration types and between government R&D subsidy sizes are statistically significant. This study's major findings are as follows. First, contrary to our hypothesis, when we controlled the influence of government R&D subsidy size, there was no statistically significant difference in the efficiency between R&D collaboration types. However, the R&D collaboration type, "SME-University-Laboratory" Joint-Venture was superior to the others, achieving the largest median and the smallest interquartile range of DEA efficiency scores. Second, the differences in the efficiency were statistically significant between government R&D subsidy sizes, and the phenomenon of diseconomies of scale was identified on the whole. As the government R&D subsidy size increases, the central measures of DEA efficiency scores were reduced, but the dispersion measures rather tended to get larger.
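
    The Kruskal-Wallis comparison of DEA efficiency scores across government R&D subsidy sizes can be sketched as follows, using synthetic scores in place of the study's data:

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(42)

# Synthetic DEA efficiency scores (clipped to [0, 1]) for three subsidy-size
# groups; larger subsidies get a lower central efficiency, mimicking the
# diseconomies-of-scale pattern the study reports
small = np.clip(rng.normal(0.75, 0.10, 40), 0, 1)
medium = np.clip(rng.normal(0.65, 0.12, 40), 0, 1)
large = np.clip(rng.normal(0.55, 0.15, 40), 0, 1)

# Nonparametric test for a difference in distributions across the groups
stat, p_value = kruskal(small, medium, large)
```

    A small p-value would support the reported finding that efficiency differs between subsidy sizes; the same call with scores grouped by collaboration type reproduces the study's other comparison.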

  18. Contamination Analyzer

    Science.gov (United States)

    1994-01-01

    Measurement of the total organic carbon content in water is important in assessing contamination levels in high purity water for power generation, pharmaceutical production and electronics manufacture. Even trace levels of organic compounds can cause defects in manufactured products. The Sievers Model 800 Total Organic Carbon (TOC) Analyzer, based on technology developed for the Space Station, uses a strong chemical oxidizing agent and ultraviolet light to convert organic compounds in water to carbon dioxide. After ionizing the carbon dioxide, the amount of ions is determined by measuring the conductivity of the deionized water. The new technique is highly sensitive, does not require compressed gas, and maintenance is minimal.

  19. Analyzing the Discourse of Chais Conferences for the Study of Innovation and Learning Technologies via a Data-Driven Approach

    Science.gov (United States)

    Silber-Varod, Vered; Eshet-Alkalai, Yoram; Geri, Nitza

    2016-01-01

    The current rapid technological changes confront researchers of learning technologies with the challenge of evaluating them, predicting trends, and improving their adoption and diffusion. This study utilizes a data-driven discourse analysis approach, namely culturomics, to investigate changes over time in the research of learning technologies. The…

  20. Can Vrancea earthquakes be accurately predicted from unusual bio-system behavior and seismic-electromagnetic records?

    International Nuclear Information System (INIS)

    Enescu, D.; Chitaru, C.; Enescu, B.D.

    1999-01-01

    The relevance of bio-seismic research for the short-term prediction of strong Vrancea earthquakes is underscored. Unusual animal behavior before and during Vrancea earthquakes is described and illustrated for the case of the major earthquake of March 4, 1977. Several hypotheses to account for the uncommon behavior of bio-systems in relation to earthquakes in general, and strong Vrancea earthquakes in particular, are discussed in the second section. Promising preliminary results concerning the identification of seismic-electromagnetic precursor signals have been obtained in the Vrancea seismogenic area using special, highly sensitive equipment. The need to correlate bio-seismic and seismic-electromagnetic research is evident. Further investigations are suggested and urgent steps are proposed in order to achieve successful short-term prediction of strong Vrancea earthquakes. (authors)

  1. Climate technology transfer at the local, national and global levels: analyzing the relationships between multi-level structures

    NARCIS (Netherlands)

    Tessema Abissa, Fisseha; Tessema Abissa, Fisseha

    2014-01-01

    This thesis examines the relationships between multi-leveled decision structures for climate technology transfer through an analysis of top-down macro-policy and bottom-up micro-implementation. It examines how international climate technology transfer policy under the UNFCCC filters down to the

  2. Does environmental impact assessment really support technological change? Analyzing alternatives to coal-fired power stations in Denmark

    International Nuclear Information System (INIS)

    Lund, H.; Hvelplund, F.

    1997-01-01

    Danish energy policy calls for the development of decentralized, cleaner technologies to replace conventional power stations and, since 1990, has aimed to reduce CO2 emissions in 2005 to 20% below their 1988 level. These political goals require a technological change from conventional, central power stations to cleaner, decentralized technologies such as energy conservation, cogeneration, and renewable energy. In principle, environmental impact assessment (EIA) supports this change on a project-by-project basis. The EU directive on EIA was based on the preventive principle: to eliminate pollution at the source rather than attempting to counteract it subsequently. According to the Danish implementation of the directive, an EIA must review a project's main alternatives and their environmental consequences. If this were done properly, EIAs could assist Denmark in meeting its CO2 reduction goals. However, because EIA is implemented on a restricted, regional basis, it does not support technological change. Responsibility for the preparation of the EIA is given to the regional authorities through a law which does not require the assessment of alternatives that extend geographically beyond the boundaries of a regional authority. Thus, there is no certainty of serious analysis of cleaner-technology alternatives to large coal-fired power stations. This conclusion is based on an examination of three case studies using a participatory research method.

  3. Ab initio O(N) elongation-counterpoise method for BSSE-corrected interaction energy analyses in biosystems

    Energy Technology Data Exchange (ETDEWEB)

    Orimoto, Yuuichi; Xie, Peng; Liu, Kai [Department of Material Sciences, Faculty of Engineering Sciences, Kyushu University, 6-1 Kasuga-Park, Fukuoka 816-8580 (Japan); Yamamoto, Ryohei [Department of Molecular and Material Sciences, Interdisciplinary Graduate School of Engineering Sciences, Kyushu University, 6-1 Kasuga-Park, Fukuoka 816-8580 (Japan); Imamura, Akira [Hiroshima Kokusai Gakuin University, 6-20-1 Nakano, Aki-ku, Hiroshima 739-0321 (Japan); Aoki, Yuriko, E-mail: aoki.yuriko.397@m.kyushu-u.ac.jp [Department of Material Sciences, Faculty of Engineering Sciences, Kyushu University, 6-1 Kasuga-Park, Fukuoka 816-8580 (Japan); Japan Science and Technology Agency, CREST, 4-1-8 Hon-chou, Kawaguchi, Saitama 332-0012 (Japan)

    2015-03-14

    An Elongation-counterpoise (ELG-CP) method was developed for performing accurate and efficient interaction energy analysis and correcting the basis set superposition error (BSSE) in biosystems. The method was achieved by combining our ab initio O(N) elongation method with the conventional counterpoise method proposed for solving the BSSE problem. As a test, the ELG-CP method was applied to the analysis of DNA inter-strand interaction energies with respect to the alkylation-induced base-pair mismatch phenomenon that causes a transition from G⋯C to A⋯T. It was found that the ELG-CP method showed high efficiency (nearly linear scaling) and high accuracy, with a negligibly small error in the total energy calculations (on the order of 10⁻⁷–10⁻⁸ hartree/atom) compared with the conventional method during the counterpoise treatment. Furthermore, the magnitude of the BSSE was found to be ca. −290 kcal/mol for the calculation of a DNA model with 21 base pairs. This emphasizes the importance of BSSE correction when a limited-size basis set is used to study DNA models and compare small energy differences between them. In this work, we quantitatively estimated the inter-strand interaction energy for each possible step in the transition process from G⋯C to A⋯T by the ELG-CP method. It was found that the base-pair replacement in the process affects the interaction energy only in a limited area around the mismatch position, within a few adjacent base pairs. From the interaction-energy point of view, our results showed that a base-pair sliding mechanism possibly occurs after the alkylation of guanine to gain the maximum possible number of hydrogen bonds between the bases. In addition, the steps leading to the A⋯T replacement accompanying replication were found to be unfavorable processes, corresponding to a loss of ca. 10 kcal/mol in stabilization energy. The present study indicated that the ELG-CP method is promising for

  4. Advantages of using ‘omics’ technologies and bioinformatics for analyzing the impact of pathogens on sugarbeet

    Science.gov (United States)

    Throughout the history of American sugarbeet production, research has proceeded hand-in-hand with the emergence of new diseases, and sugarbeet scientists have used the technologies available to improve disease management and crop yield in the face of the emerging disease pressures. Many traditional...

  5. Does Personality Matter? Applying Holland's Typology to Analyze Students' Self-Selection into Science, Technology, Engineering, and Mathematics Majors

    Science.gov (United States)

    Chen, P. Daniel; Simpson, Patricia A.

    2015-01-01

    This study utilized John Holland's personality typology and the Social Cognitive Career Theory (SCCT) to examine the factors that may affect students' self-selection into science, technology, engineering, and mathematics (STEM) majors. Results indicated that gender, race/ethnicity, high school achievement, and personality type were statistically…

  6. Collect, analyze and data base for building up the investment reports of Center for Nuclear Science and Technology construction project

    International Nuclear Information System (INIS)

    Pham Quang Minh; Tran Chi Thanh; Cao Dinh Thanh; Mai Dinh Trung; Hoang Sy Than; Nguyen Nhi Dien; Trinh Van Giap; Le Ba Thuan; Vu Tien Ha

    2014-01-01

    Following Contract No. 19/HD/NVCB dated July 10, 2013, signed by the President of the Vietnam Atomic Energy Institute (VINATOM), an additional ministerial project was approved by Decision No. 526/QD-VNLNT dated July 8, 2013 of the VINATOM President in order to implement an important task for VINATOM. The project was carried out by the Institute for Nuclear Science and Technology (INST) in Hanoi as the managing organization, with VINATOM as the owner of the project results. The main objectives of this project, supported by the national budget, were to collect and compile general reports from previous projects relevant to CNEST and the new research reactor, IAEA guidance documents, documents provided by ROSATOM in seminars in 2010, 2012 and 2013, and reports from expert visits of the Ministry of Science and Technology, and to complete the general report on the construction project of CNEST. (author)

  7. The impact of management, technology and finance on export performance: analyzing the garment industry in Bangladesh

    OpenAIRE

    Muktadir, Zahid

    2012-01-01

    Master's thesis in Economics and Administration - University of Agder 2012. Drawing on resource-based view theory and using quantitative fieldwork, this study examines three factors influencing the export performance of Bangladeshi ready-made garment firms: financial resources, technology and managerial skill. This study also investigates the mediating effects in the relationship between these factors and export performance. Most of the previous studies about export perfo...

  8. An Update on Analyzing Differences Between Public and Private Sector Information Resource Management: Strategic Information Challenges and Critical Technologies

    Science.gov (United States)

    2004-06-01


  9. [New method for analyzing pharmacodynamic material basis of traditional Chinese medicines by using specific knockout technology with monoclonal antibodies].

    Science.gov (United States)

    Zhao, Yan; Qu, Hui-Hua; Wang, Qing-Guo

    2013-09-01

    Study of the pharmacodynamic material basis of traditional Chinese medicines is one of the key issues for the modernization of traditional Chinese medicine. Having introduced monoclonal antibody technology into the study of the pharmacodynamic material basis of traditional Chinese medicines, the authors prepared an immunoaffinity chromatography column using monoclonal antibodies against active components of traditional Chinese medicines, so as to selectively knock out a component from herbs or traditional Chinese medicine compounds while preserving all of the other components and keeping their amounts and ratios unchanged. A comparative study of pharmacokinetics and pharmacodynamics was made to explicitly reveal the correlation between the component and the main effects of the traditional Chinese medicines and compounds. Analysis of the pharmacodynamic material basis of traditional Chinese medicines using specific knockout technology with monoclonal antibodies is a new method for studying the pharmacodynamic material basis in line with the characteristics of traditional Chinese medicines. Its results can not only help study the material basis from a new perspective, but also help find the modern scientific significance in a single herb or among the compounds of traditional Chinese medicines.

  10. Analyzing the Effect of Technology-Based Intervention in Language Laboratory to Improve Listening Skills of First Year Engineering Students

    Directory of Open Access Journals (Sweden)

    Pasupathi Madhumathi

    2013-04-01

    First year students pursuing engineering education face problems with their listening skills. Most Indian schools use a bilingual method for teaching subjects from primary school through high school. Nonetheless, students entering university education develop anxiety when listening to classroom lectures in English. This article reports an exploratory study that aimed to find out whether students' listening competences improved when technology was deployed in the language laboratory. It also investigated the students' opinions about using teacher-suggested websites for acquiring listening skills. The results of the study indicated that the use of technology in a language laboratory for training students in listening competences reduced the students' anxiety when listening to English. Further, there was a significant improvement on the part of the students in acquiring listening skills through the technology-based intervention.

  11. Seismic Technology Adapted to Analyzing and Developing Geothermal Systems Below Surface-Exposed High-Velocity Rocks Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Hardage, Bob A. [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; DeAngelo, Michael V. [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Ermolaeva, Elena [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Hardage, Bob A. [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Remington, Randy [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Sava, Diana [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Wagner, Donald [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Wei, Shuijion [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology

    2013-02-01

    The objective of our research was to develop and demonstrate seismic data-acquisition and data-processing technologies that allow geothermal prospects below high-velocity rock outcrops to be evaluated. To do this, we acquired a 3-component seismic test line across an area of exposed high-velocity rocks in Brewster County, Texas, where there is high heat flow and surface conditions mimic those found at numerous geothermal prospects. Seismic contractors have not succeeded in creating good-quality seismic data in this area for companies who have acquired data for oil and gas exploitation purposes. Our test profile traversed an area where high-velocity rocks and low-velocity sediment were exposed on the surface in alternating patterns that repeated along the test line. We verified that these surface conditions cause non-ending reverberations of Love waves, Rayleigh waves, and shallow critical refractions to travel across the earth surface between the boundaries of the fast-velocity and slow-velocity material exposed on the surface. These reverberating surface waves form the high level of noise in this area that does not allow reflections from deep interfaces to be seen and utilized. Our data-acquisition method of deploying a box array of closely spaced geophones allowed us to recognize and evaluate these surface-wave noise modes regardless of the azimuth direction to the surface anomaly that backscattered the waves and caused them to return to the test-line profile. With this knowledge of the surface-wave noise, we were able to process these test-line data to create P-P and SH-SH images that were superior to those produced by a skilled seismic data-processing contractor. Compared to the P-P data acquired along the test line, the SH-SH data provided a better detection of faults and could be used to trace these faults upward to the boundaries of exposed surface rocks. We expanded our comparison of the relative value of S-wave and P-wave seismic data for geothermal

  12. Enabling personalized implant and controllable biosystem development through 3D printing.

    Science.gov (United States)

    Nagarajan, Neerajha; Dupret-Bories, Agnes; Karabulut, Erdem; Zorlutuna, Pinar; Vrana, Nihal Engin

    The impact of additive manufacturing in our lives has been increasing constantly. One of the frontiers in this change is the medical devices. 3D printing technologies not only enable the personalization of implantable devices with respect to patient-specific anatomy, pathology and biomechanical properties but they also provide new opportunities in related areas such as surgical education, minimally invasive diagnosis, medical research and disease models. In this review, we cover the recent clinical applications of 3D printing with a particular focus on implantable devices. The current technical bottlenecks in 3D printing in view of the needs in clinical applications are explained and recent advances to overcome these challenges are presented. 3D printing with cells (bioprinting); an exciting subfield of 3D printing, is covered in the context of tissue engineering and regenerative medicine and current developments in bioinks are discussed. Also emerging applications of bioprinting beyond health, such as biorobotics and soft robotics, are introduced. As the technical challenges related to printing rate, precision and cost are steadily being solved, it can be envisioned that 3D printers will become common on-site instruments in medical practice with the possibility of custom-made, on-demand implants and, eventually, tissue engineered organs with active parts developed with biorobotics techniques. Copyright © 2018 Elsevier Inc. All rights reserved.

  13. Global Impact of Energy Use in Middle East Oil Economies: A Modeling Framework for Analyzing Technology-Energy-Environment-Economy Chain

    OpenAIRE

    Hodjat Ghadimi

    2007-01-01

    To explore choices of improving energy efficiency in energy-rich countries of the Middle East, this study lays out an integrated modeling framework for analyzing the technology-energy-environment-economy chain for the case of an energy exporting country. This framework consists of an input output process-flow model (IOPM) and a computable general equilibrium (CGE) model. The former investigates the micro-level production processes and sectoral interdependencies to show how alternative technol...

  14. Provision of a draft version for standard classification structure for information of radiation technologies through analyzing their information and derivation of its applicable requirements to the information system

    International Nuclear Information System (INIS)

    Jang, Sol Ah; Kim, Joo Yeon; Yoo, Ji Yup; Shin, Woo Ho; Park, Tai Jin; Song, Myung Jae

    2015-01-01

    Radiation technology is used to develop new products or processes by applying radiation, or to create new functions in industrial, research and medical fields, and its application is increasing steadily. To secure competitiveness in this advanced technology, new added value must be created for information consumers by providing an efficient information support system, the infrastructure for research and development, that enables rapid and structured collection, analysis and use of information alongside direct R&D. A management structure for information resources is especially crucial for efficient operation of the information support system in radiation technology, so a standard classification structure for information must be developed before the system is constructed. The standard classification structure was analyzed by reviewing the definition of information resources in radiation technology and the classification structures of similar systems operated by institutes in radiation and other scientific fields. A draft version of the standard classification structure was then provided, comprising 7 large, 25 medium and 71 small classifications. The standard classification structure in radiation technology will be finalized in 2015 after review of this draft and of experts' opinions. Finally, the developed classification structure will be applied to the information support system, taking into account the plan for constructing the system and its database and the requirements for designing the system. Furthermore, the structure will be incorporated into the information search system to meet the individual needs of information consumers

  15. Provision of a draft version for standard classification structure for information of radiation technologies through analyzing their information and derivation of its applicable requirements to the information system

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Sol Ah; Kim, Joo Yeon; Yoo, Ji Yup; Shin, Woo Ho; Park, Tai Jin; Song, Myung Jae [Korean Association for Radiation Application, Seoul (Korea, Republic of)

    2015-02-15

    Radiation technology is used to develop new products or processes by applying radiation, or to create new functions in industrial, research and medical fields, and its application is increasing steadily. To secure competitiveness in this advanced technology, new added value must be created for information consumers by providing an efficient information support system, the infrastructure for research and development, that enables rapid and structured collection, analysis and use of information alongside direct R&D. A management structure for information resources is especially crucial for efficient operation of the information support system in radiation technology, so a standard classification structure for information must be developed before the system is constructed. The standard classification structure was analyzed by reviewing the definition of information resources in radiation technology and the classification structures of similar systems operated by institutes in radiation and other scientific fields. A draft version of the standard classification structure was then provided, comprising 7 large, 25 medium and 71 small classifications. The standard classification structure in radiation technology will be finalized in 2015 after review of this draft and of experts' opinions. Finally, the developed classification structure will be applied to the information support system, taking into account the plan for constructing the system and its database and the requirements for designing the system. Furthermore, the structure will be incorporated into the information search system to meet the individual needs of information consumers.

  16. Transient analyzer

    International Nuclear Information System (INIS)

    Muir, M.D.

    1975-01-01

    The design and design philosophy of a high-performance, extremely versatile transient analyzer are described. This sub-system was designed to be controlled through the data acquisition computer system, which allows hands-off operation. Thus it may be placed on the experiment side of the high-voltage safety break between the experimental device and the control room. This analyzer provides control features which are extremely useful for data acquisition from PPPL diagnostics. These include dynamic sample-rate changing, which may be intermixed with multiple post-trigger operations with variable-length blocks using normal, peak-to-peak or integrate modes. Included in the discussion are general remarks on the advantages of adding intelligence to transient analyzers, a detailed description of the characteristics of the PPPL transient analyzer, a description of the hardware, firmware, control language and operation of the PPPL transient analyzer, and general remarks on future trends in this type of instrumentation, both at PPPL and in general.

  17. Greenhouse gas (GHG) emission in organic farming. Approximate quantification of its generation at the organic garden of the School of Agricultural, Food and Biosystems Engineering (ETSIAAB) in the Technical University of Madrid (UPM)

    Science.gov (United States)

    Campos, Jorge; Barbado, Elena; Maldonado, Mariano; Andreu, Gemma; López de Fuentes, Pilar

    2016-04-01

    As is well known, agricultural soil fertilization increases the production of greenhouse gas (GHG) emissions such as CO2, CH4 and N2O. The share of this activity in climate change is currently under study, as are the possibilities for mitigation. In this context, we considered it interesting to know what this share is in the case of organic farming. To this end, a field experiment was carried out at the organic garden of the School of Agricultural, Food and Biosystems Engineering (ETSIAAB) at the Technical University of Madrid (UPM). The orchard included different management growing areas, corresponding to different schools of organic farming. Soil and gas samples were taken from these different sites. Gas samples were collected throughout the growing season from the atmosphere accumulated inside static chambers inserted into the soil; these samples were then carried to the laboratory and analyzed. The results provide an approximate estimate of how organic fertilization contributes to air pollution from greenhouse gases.

  18. An application of multiplier analysis in analyzing the role of information and communication technology sectors on Indonesian national economy: 1990-2005

    International Nuclear Information System (INIS)

    Zuhdi, Ubaidillah

    2015-01-01

    The purpose of this study is to continue previous studies focused on Indonesian Information and Communication Technology (ICT) sectors. More specifically, this study aims to analyze the role of ICT sectors in the Indonesian national economy using the simple household income multiplier, one of the analysis tools in Input-Output (IO) analysis. The analysis period of this study is 1990-2005. The results show that the sectors did not have an important role in the national economy of Indonesia over this period. The results also show that, from the point of view of the multiplier, the Indonesian national economy tended to be stable during the period.
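    The simple household income multiplier mentioned in this abstract can be illustrated with a toy two-sector input-output table. All coefficients below are invented for illustration (not Indonesian data): the multiplier for a sector is the total household income generated per unit of final demand, divided by the sector's direct income coefficient.

    ```python
    import numpy as np

    # Hypothetical 2-sector economy (e.g. "ICT" and "other"); a_ij is the
    # input from sector i required per unit output of sector j.
    A = np.array([[0.15, 0.10],
                  [0.20, 0.30]])
    h = np.array([0.25, 0.40])  # household income earned per unit output

    L = np.linalg.inv(np.eye(2) - A)      # Leontief inverse (I - A)^-1
    income_effect = h @ L                 # total income per unit final demand
    simple_multiplier = income_effect / h # type I (simple) income multiplier

    print(np.round(simple_multiplier, 3))  # → [1.774 1.587]
    ```

    A multiplier of 1.774 for the first sector means that each unit of income created directly by its final demand induces a further 0.774 units of income through inter-sectoral linkages.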

  19. Radiometric analyzer

    International Nuclear Information System (INIS)

    Arima, S.; Oda, M.; Miyashita, K.; Takada, M.

    1977-01-01

    A radiometric analyzer for measuring the characteristic values of a sample by radiation includes a number of radiation-measuring subsystems having different ratios of sensitivities to the elements of the sample, and linearizing circuits having inverse-function characteristics of the calibration functions corresponding to the radiation-measuring subsystems. A weighting adder computes a desired linear combination of the outputs of the linearizing circuits. Operators for switching between two or more different linear combinations are included.

  20. The use of real-time cell analyzer technology in drug discovery: defining optimal cell culture conditions and assay reproducibility with different adherent cellular models.

    Science.gov (United States)

    Atienzar, Franck A; Tilmant, Karen; Gerets, Helga H; Toussaint, Gaelle; Speeckaert, Sebastien; Hanon, Etienne; Depelchin, Olympe; Dhalluin, Stephane

    2011-07-01

    The use of impedance-based label-free technology in drug discovery is nowadays receiving more and more attention. Indeed, such a simple and noninvasive assay, which interferes minimally with cell morphology and function, allows one to perform kinetic measurements and to obtain information on proliferation, migration, cytotoxicity, and receptor-mediated signaling. The objective of the study was to further assess the usefulness of a real-time cell analyzer (RTCA) platform based on impedance in the context of quality control and data reproducibility. The data indicate that this technology is useful for determining the best coating and cell-density conditions for different adherent cellular models, including hepatocytes, cardiomyocytes, fibroblasts, and hybrid neuroblastoma/neuronal cells. Based on 31 independent experiments, the reproducibility of cell index data generated from HepG2 cells exposed to DMSO and to Triton X-100 was satisfactory, with a coefficient of variation close to 10%. Cell index data were also well reproduced when cardiomyocytes and fibroblasts were exposed to 21 compounds three times (correlation >0.91). This technology appears to be a powerful and reliable tool in drug discovery because of its reasonable throughput, rapid and efficient performance, technical optimization, and cell quality control.
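    The reproducibility metric quoted in this abstract, the coefficient of variation (CV), is simply the sample standard deviation expressed as a percentage of the mean. A minimal sketch, using made-up replicate cell-index readings (not data from the study):

    ```python
    from statistics import mean, stdev

    def coefficient_of_variation(values):
        """CV (%) = sample standard deviation / mean * 100."""
        return stdev(values) / mean(values) * 100

    # Hypothetical normalized cell-index readings from replicate wells
    cell_index = [1.02, 0.95, 1.10, 0.88, 1.05]
    print(round(coefficient_of_variation(cell_index), 1))  # → 8.6
    ```

    A CV near 10%, as reported for the HepG2 experiments, indicates that run-to-run variability is small relative to the signal itself.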

  1. An art report to analyze research status for the establishment of the space food development and future food system using the advanced food technology

    International Nuclear Information System (INIS)

    Lee, Ju Woon; Byun, Myung Woo; Kim, Jae Hun

    2006-12-01

    The quality of food for astronauts accomplishing missions in space is one of the most important matters, and it is time to study and develop Korean space food for Korean astronauts in space. Therefore, at the beginning of the space exploration era, it is necessary to establish a national long-term plan to study and develop Korean space food in order to provide better-quality food for astronauts accomplishing space missions. Using current food processing, preservation, and packaging technology, it is necessary to develop Korean space food, provide it to Korean astronauts staying at the International Space Station, and study the future space food systems to be used for long-term space voyages and planetary habitat bases in the era of space exploration. Space food is analyzed through nutritional analysis, sensory evaluation, storage studies, packaging evaluations, and many other methods before its final shipment on the space shuttle. Each technology developed for the advanced food system must provide the required attributes to the food system, including safety, nutrition, and acceptability. It is anticipated that the duration of exploration-class missions will be at least 2-3 years, and one of the biggest challenges for these missions will be to provide acceptable food with a shelf life of 3-5 years. The development of space food processing/preservation technology and its ripple effects will contribute to improving the nation's international standing, and the developed space food can potentially be used for combat rations and emergency/special food, as in the U.S.A. In the 21st century's era of space exploration, the development of the advanced food system and life support system at a Mars base as well as on the space shuttle will strengthen the capability to lead the future space exploration era

  2. An art report to analyze research status for the establishment of the space food development and future food system using the advanced food technology

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Ju Woon; Byun, Myung Woo; Kim, Jae Hun [KAERI, Daejeon (Korea, Republic of)

    2006-12-15

    The quality of food for astronauts accomplishing missions in space is one of the most important matters, and it is time to study and develop Korean space food for Korean astronauts in space. Therefore, at the beginning of the space exploration era, it is necessary to establish a national long-term plan to study and develop Korean space food in order to provide better-quality food for astronauts accomplishing space missions. Using current food processing, preservation, and packaging technology, it is necessary to develop Korean space food, provide it to Korean astronauts working at the International Space Station, and study the future space food systems to be used for long-term space voyages and planetary habitat bases in the era of space exploration. Space food is analyzed through nutritional analysis, sensory evaluation, storage studies, packaging evaluations, and many other methods before its final shipment on the space shuttle. Each technology developed for the advanced food system must provide the required attributes, including safety, nutrition, and acceptability. It is anticipated that exploration-class missions may last at least 2 to 3 years, and one of the biggest challenges for these missions will be to provide acceptable food with a shelf life of 3 to 5 years. The development of space food processing/preservation technology and its ripple effects will contribute to improving the nation's international standing, and the developed space food can potentially be used for combat rations and emergency/special food, as in the U.S.A. In the 21st century's era of space exploration, the development of the advanced food system and life support system for a Mars base as well as the space shuttle will strengthen the capability to lead the future space exploration era

  3. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) TESTING OF TWO HYDROGEN SULFIDE ANALYZERS: HORIBA INSTRUMENTS, INC., APSA-360 AND TELEDYNE-API MODEL 101E

    Science.gov (United States)

    The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to measure environmental quality. The ETV p...

  4. How to Incorporate Technology with Inquiry-Based Learning to Enhance the Understanding of Chemical Composition; How to Analyze Unknown Samples

    Directory of Open Access Journals (Sweden)

    Suzanne Lunsford

    2017-02-01

    The use of technology in teaching offers numerous possibilities and can be challenging for physics, chemistry and geology content courses. When incorporating technology into a science content lab, it is better to be driven by pedagogy than by technology in an inquiry-based lab setting. Students need to be introduced to real-world technology at the beginning of a first-year chemistry or physics course to ensure real-world technology concepts while assisting with content such as periodic trends on the periodic table. This article describes the use of Raman Spectroscopy, Energy Dispersive X-Ray Spectroscopy (EDS) and Fourier Transform Infrared Spectroscopy (FTIR) to research the chemical compositions of unknown real-world samples. The unknown samples utilized in this lab were clamshells (parts of clams that look like shark teeth) versus shark teeth. The data are shared to show how the students (pre-service and in-service teachers) solved the problem using technology while learning important content that will assist at the next level of chemistry, physics and even geology.

  5. (Environmental technology)

    Energy Technology Data Exchange (ETDEWEB)

    Boston, H.L.

    1990-10-12

    The traveler participated in a conference on environmental technology in Paris, sponsored by the US Embassy-Paris, the US Environmental Protection Agency (EPA), the French Environmental Ministry, and others. The traveler sat on a panel on environmental aspects of energy technology and made a presentation on the potential contributions of Oak Ridge National Laboratory (ORNL) to a planned French-American Environmental Technologies Institute in Chattanooga, Tennessee, and Evry, France. This institute would provide opportunities for international cooperation on environmental issues and technology transfer related to environmental protection, monitoring, and restoration at US Department of Energy (DOE) facilities. The traveler also attended the Fourth International Conference on Environmental Contamination in Barcelona. Conference topics included environmental chemistry, land disposal of wastes, treatment of toxic wastes, micropollutants, trace organics, artificial radionuclides in the environment, and the use of biomonitoring and biosystems for environmental assessment. The traveler presented a paper on "The Fate of Radionuclides in Sewage Sludge Applied to Land." Those findings corresponded well with results from studies addressing the fate of fallout radionuclides from the Chernobyl nuclear accident. There was an exchange of new information on a number of topics of interest to DOE waste management and environmental restoration needs.

  6. Waste Not, Want Not: Analyzing the Economic and Environmental Viability of Waste-to-Energy (WTE) Technology for Site-Specific Optimization of Renewable Energy Options

    Energy Technology Data Exchange (ETDEWEB)

    Funk, K.; Milford, J.; Simpkins, T.

    2013-02-01

    Waste-to-energy (WTE) technology burns municipal solid waste (MSW) in an environmentally safe combustion system to generate electricity, provide district heat, and reduce the need for landfill disposal. While this technology has gained acceptance in Europe, it has yet to be commonly recognized as an option in the United States. Section 1 of this report provides an overview of WTE as a renewable energy technology and describes a high-level model developed to assess the feasibility of WTE at a site. Section 2 reviews results from previous life cycle assessment (LCA) studies of WTE, and then uses an LCA inventory tool to perform a screening-level analysis of cost, net energy production, greenhouse gas (GHG) emissions, and conventional air pollution impacts of WTE for residual MSW in Boulder, Colorado. Section 3 of this report describes the federal regulations that govern the permitting, monitoring, and operating practices of MSW combustors and provides emissions limits for WTE projects.

  7. Context Matters: The Value of Analyzing Human Factors within Educational Contexts as a Way of Informing Technology-Related Decisions within Design Research

    Science.gov (United States)

    MacKinnon, Kim

    2012-01-01

    While design research can be useful for designing effective technology integrations within complex social settings, it currently fails to provide concrete methodological guidelines for gathering and organizing information about the research context, or for determining how such analyses ought to guide the iterative design and innovation process. A…

  8. Biosystems Study of the Molecular Networks Underlying Hippocampal Aging Progression and Anti-aging Treatment in Mice

    Directory of Open Access Journals (Sweden)

    Jiao Wang

    2017-12-01

    Aging progression is a process that an individual encounters as they become older, and usually results from a series of normal physiological changes over time. The hippocampus, which contributes to the loss of spatial and episodic memory and learning in older people, is closely related to the detrimental effects of aging at the morphological and molecular levels. However, age-related genetic changes in hippocampal molecular mechanisms are not yet well-established. To provide additional insight into the aging process, differentially-expressed genes of 3- versus 24- and 29-month-old mice were re-analyzed. The results revealed that a large number of immune and inflammatory response-related genes were up-regulated in the aged hippocampus, and membrane receptor-associated genes were down-regulated. The down-regulation of transmembrane receptors may indicate a weaker perception of environmental exposure in older people, since many transmembrane proteins participate in signal transduction. In addition, molecular interaction analysis of the up-regulated immune genes indicated that the hub gene, Ywhae, may play essential roles in immune and inflammatory responses during aging progression, as well as during hippocampal development. Our biological experiments confirmed the conserved roles of Ywhae and its partners between human and mouse. Furthermore, comparison of microarray data between advanced-age mice treated with human umbilical cord blood plasma protein and the phosphate-buffered saline control showed that the genes that contribute to the revitalization of advanced-age mice are different from the genes induced by aging. These results implied that the revitalization of advanced-age mice is not a simple reverse process of normal aging progression. Our data assigned novel roles to genes during aging progression and provided further theoretic evidence for future studies exploring the underlying mechanisms of aging and anti-aging-related disease

  9. Analyzing the complexity of nanotechnology

    NARCIS (Netherlands)

    Vries, de M.J.; Schummer, J.; Baird, D.

    2006-01-01

    Nanotechnology is a highly complex technological development due to many uncertainties in our knowledge about it. The Dutch philosopher Herman Dooyeweerd has developed a conceptual framework that can be used (1) to analyze the complexity of technological developments and (2) to see how priorities

  10. Online Work Force Analyzes Social Media to Identify Consequences of an Unplanned School Closure – Using Technology to Prepare for the Next Pandemic

    Science.gov (United States)

    Rainey, Jeanette J.; Kenney, Jasmine; Wilburn, Ben; Putman, Ami; Zheteyeva, Yenlik; O’Sullivan, Megan

    2016-01-01

    Background: During an influenza pandemic, the United States Centers for Disease Control and Prevention (CDC) may recommend school closures. These closures could have unintended consequences for students and their families. Publicly available social media could be analyzed to identify the consequences of an unplanned school closure. Methods: As a proxy for an unplanned, pandemic-related school closure, we used the district-wide school closure due to the September 10–18, 2012 teachers’ strike in Chicago, Illinois. We captured social media posts about the school closure using the Radian6 social media-monitoring platform. An online workforce from Amazon Mechanical Turk categorized each post into one of two groups. The first group included relevant posts that described the impact of the closure on students and their families. The second group included irrelevant posts that described the political aspects of the strike or topics unrelated to the school closure. All relevant posts were further categorized as expressing a positive, negative, or neutral sentiment. We analyzed patterns of relevant posts and sentiment over time and compared our findings to household surveys conducted after other unplanned school closures. Results: We captured 4,546 social media posts about the district-wide school closure using our search criteria. Of these, 930 (20%) were categorized as relevant by the online workforce. Of the relevant posts, 619 (67%) expressed a negative sentiment, 51 (5%) expressed a positive sentiment, and 260 (28%) were neutral. The number of relevant posts, and especially those with a negative sentiment, peaked on day 1 of the strike. Negative sentiment expressed concerns about childcare, missed school lunches, and the lack of class time for students. This was consistent with findings from previously conducted household surveys. Conclusion: Social media are publicly available and can readily provide information on the impact of an unplanned school closure on students

  11. Online Work Force Analyzes Social Media to Identify Consequences of an Unplanned School Closure - Using Technology to Prepare for the Next Pandemic.

    Science.gov (United States)

    Rainey, Jeanette J; Kenney, Jasmine; Wilburn, Ben; Putman, Ami; Zheteyeva, Yenlik; O'Sullivan, Megan

    During an influenza pandemic, the United States Centers for Disease Control and Prevention (CDC) may recommend school closures. These closures could have unintended consequences for students and their families. Publicly available social media could be analyzed to identify the consequences of an unplanned school closure. As a proxy for an unplanned, pandemic-related school closure, we used the district-wide school closure due to the September 10-18, 2012 teachers' strike in Chicago, Illinois. We captured social media posts about the school closure using the Radian6 social media-monitoring platform. An online workforce from Amazon Mechanical Turk categorized each post into one of two groups. The first group included relevant posts that described the impact of the closure on students and their families. The second group included irrelevant posts that described the political aspects of the strike or topics unrelated to the school closure. All relevant posts were further categorized as expressing a positive, negative, or neutral sentiment. We analyzed patterns of relevant posts and sentiment over time and compared our findings to household surveys conducted after other unplanned school closures. We captured 4,546 social media posts about the district-wide school closure using our search criteria. Of these, 930 (20%) were categorized as relevant by the online workforce. Of the relevant posts, 619 (67%) expressed a negative sentiment, 51 (5%) expressed a positive sentiment, and 260 (28%) were neutral. The number of relevant posts, and especially those with a negative sentiment, peaked on day 1 of the strike. Negative sentiment expressed concerns about childcare, missed school lunches, and the lack of class time for students. This was consistent with findings from previously conducted household surveys. Social media are publicly available and can readily provide information on the impact of an unplanned school closure on students and their families. 
Using social media to
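The sentiment proportions reported above (67% negative, 5% positive, 28% neutral of 930 relevant posts) follow directly from the category tallies. A minimal sketch of that tally step in Python; the function name and labels are illustrative, not from the study:

```python
# Sketch of the post-categorization tally described in the abstract.
# Counts are taken from the reported results; labels are assumptions.
from collections import Counter

def sentiment_breakdown(labels):
    """Return each sentiment's share (percent, rounded) of relevant posts."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {label: round(100 * n / total) for label, n in counts.items()}

# Reported tallies: 619 negative, 51 positive, 260 neutral (930 relevant posts).
labels = ["negative"] * 619 + ["positive"] * 51 + ["neutral"] * 260
shares = sentiment_breakdown(labels)
print(shares)  # {'negative': 67, 'positive': 5, 'neutral': 28}
```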

  12. Fiscal 1999 project for research and development of industrial and scientific technologies. Report on the achievements on the 'research and development of an ultimate atom and molecule manipulation technology' (Development of a technology to analyze and manipulate DNAs at high efficiency); 1999 nendo genshi bunshi kyokugen sosa gijutsu no kenkyu kaihatsu seika hokokusho. DNA nado kokoritsu kaiseki sosa gijutsu kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    In the 'research and development of an ultimate atom and molecule manipulation technology', research was conducted on an organic atom and molecule identification and manipulation technology and on a dynamic organic molecule simulation technology. This paper summarizes the achievements in fiscal 1999. For the magnetic-force-controlled AFM for force spectroscopy, aimed at non-destructive atom and molecule identification, a prototype cantilever was fabricated that can excite and detect displacement in the lateral direction and is suitable for friction measurement. The SrO and TiO2 surfaces of SrTiO{sub 3} were observed using a carbon nano-tube as a probe. In addition, the molecule-inserting SAM technology was used to develop a technology to measure electric conductivity within and between molecules. With the aim of realizing a high-speed DNA base sequence analysis method, research is being performed focusing on the single-molecule approach, based on the optical measurement method using single-molecule imaging, and on the scanning probe microscope method. For the dynamic organic molecule simulation technology, theoretical analysis of methanol synthesis on a copper surface was advanced. (NEDO)

  13. THE USE OF GIS TECHNOLOGIES IN ANALYZING CHALLENGES AND OPPORTUNITIES FOR THE MANAGEMENT OF URBAN GREEN SPACES IN KIGALI CITY, RWANDA

    Directory of Open Access Journals (Sweden)

    G. Rwanyiziri

    2014-01-01

    This study assessed the challenges and opportunities for managing green spaces (GS) in the urban part of Kigali City (UPKC). To identify the GS classes and their threats, land use classes were mapped using GIS technologies. This output was complemented by field visits, a questionnaire survey, informal interviews, and a desk review of the existing environmental and biodiversity policies and laws. The land use assessment showed that built-up areas are the most predominant class, occupying 74.3% of the total area of UPKC, while green spaces occupy only 25.3%. Among the GS classes identified in UPKC, wetlands occupy about 62.6% of the total GS area, forests 25%, and gardens (a combination of roadside trees, roundabouts, and playgrounds) 12.4%, while seasonal and perennial crop areas are not significant in the city. In addition, results showed that GS play different roles in the city, among others the beautification of the city, air purification and refreshment, wastewater treatment, heat reduction, and mind refreshment, and they act as habitat, food sources, and corridors for a good number of animals. Even though there is no specific law or policy on urban GS management and protection, the Government of Rwanda (GoR) has put in place a good number of opportunities that take GS into consideration. These include (1) governmental policies such as the Environmental Policy, Biodiversity Policy, and Forest Policy; (2) laws such as the Organic Environmental Law; and (3) plans such as the master plans for the three districts that make up Kigali City. Despite these opportunities, the management of GS in Kigali City still faces some challenges that the Kigali City authorities are trying to address. These include the lack of policies on GS management, a low level of awareness of GS management among local people, and the demographic pressure particularly caused by the

  14. Evolutionary Models for Simple Biosystems

    Science.gov (United States)

    Bagnoli, Franco

    The concept of evolutionary development of structures constituted a real revolution in biology: it made it possible to understand how the very complex structures of life can arise in an out-of-equilibrium system. The investigation of such systems has shown that, indeed, systems under a flux of energy or matter can self-organize into complex patterns; think, for instance, of Rayleigh-Bénard convection, Liesegang rings, or the patterns formed by granular systems under shear. Following this line, one could characterize life as a state of matter marked by the slow, continuous process that we call evolution. In this paper we try to identify the organizational levels of life, which span several orders of magnitude from the elementary constituents to whole ecosystems. Although similar structures can be found in other contexts, like ideas (memes) in neural systems and self-replicating elements (computer viruses, worms, etc.) in computer systems, we shall concentrate on biological evolutionary structures and try to highlight the role and the emergence of network structure in such systems.

  15. "First generation" automated DNA sequencing technology.

    Science.gov (United States)

    Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M

    2011-10-01

    Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.

  16. A 40 GHz fully integrated circuit with a vector network analyzer and a coplanar-line-based detection area for circulating tumor cell analysis using 65 nm CMOS technology

    Science.gov (United States)

    Nakanishi, Taiki; Matsunaga, Maya; Kobayashi, Atsuki; Nakazato, Kazuo; Niitsu, Kiichi

    2018-03-01

    A 40-GHz fully integrated CMOS-based circuit for circulating tumor cell (CTC) analysis, consisting of an on-chip vector network analyzer (VNA) and a highly sensitive coplanar-line-based detection area, is presented in this paper. In this work, we introduce a fully integrated architecture that eliminates unwanted parasitic effects. The proposed analyzer was designed using 65 nm CMOS technology, and SPICE and MWS simulations were used to validate its operation. The simulations confirmed that the proposed circuit can measure S-parameter shifts resulting from the addition of various types of tumor cells to the detection area, the data for which are provided in a previous study: the |S21| values for HepG2, A549, and HEC-1-A cells are -0.683, -0.580, and -0.623 dB, respectively. Additionally, measurement demonstrated an S-parameter reduction of -25.7% when a silicone resin was placed on the circuit. Hence, the proposed system is expected to contribute to cancer diagnosis.
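The |S21| values above are quoted in decibels; for a transmission S-parameter, the dB figure relates to the linear magnitude through a factor of 20·log10. A short sketch of that standard conversion, using the cell-line values from the abstract (the function names are illustrative):

```python
import math

def s21_db(magnitude):
    """Convert a linear |S21| transmission magnitude to decibels."""
    return 20 * math.log10(magnitude)

def s21_linear(db):
    """Inverse: recover the linear magnitude from a dB value."""
    return 10 ** (db / 20)

# Reported |S21| values for the three tumor cell lines (dB).
for cell, db in [("HepG2", -0.683), ("A549", -0.580), ("HEC-1-A", -0.623)]:
    print(f"{cell}: |S21| = {s21_linear(db):.4f} (linear)")
```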

  17. Web server attack analyzer

    OpenAIRE

    Mižišin, Michal

    2013-01-01

    Web server attack analyzer - Abstract The goal of this work was to create a prototype analyzer of injection-flaw attacks on a web server. The proposed solution combines the capabilities of a web application firewall and a web server log analyzer. Analysis is based on configurable signatures defined by regular expressions. This paper begins with a summary of web attacks, followed by an analysis of detection techniques on web servers and a description and justification of the selected implementation. In the end are charact...
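The signature-based detection the abstract describes, configurable regular expressions matched against web server log lines, can be sketched as follows. The signatures shown are illustrative stand-ins, not the thesis's actual rule set:

```python
# Minimal sketch of signature-based injection-attack detection over web
# server log lines. These signatures are simplified examples for
# illustration; a real rule set would be far more extensive.
import re

SIGNATURES = {
    "sql_injection": re.compile(r"('|%27)\s*(or|and)\s+\d+\s*=\s*\d+", re.I),
    "xss": re.compile(r"<\s*script\b", re.I),
    "path_traversal": re.compile(r"\.\./|%2e%2e%2f", re.I),
}

def scan_line(line):
    """Return the names of all signatures matching a single log line."""
    return [name for name, rx in SIGNATURES.items() if rx.search(line)]

print(scan_line("GET /item?id=1' OR 1=1-- HTTP/1.1"))   # ['sql_injection']
print(scan_line("GET /q?s=<script>alert(1)</script>"))   # ['xss']
print(scan_line("GET /static/logo.png HTTP/1.1"))        # []
```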

  18. Electron attachment analyzer

    International Nuclear Information System (INIS)

    Popp, P.; Grosse, H.J.; Leonhardt, J.; Mothes, S.; Oppermann, G.

    1984-01-01

    The invention concerns an electron attachment analyzer for detecting traces of electroaffine substances in electronegative gases, especially in air. The analyzer can be used for monitoring working places, e. g., in operating theatres. The analyzer consists of two electrodes inserted in a base frame of insulating material (quartz or ceramics) and a high-temperature resistant radiation source (85Kr, 3H, or 63Ni)

  19. Remote Laser Diffraction PSD Analyzer

    International Nuclear Information System (INIS)

    Batcheller, T.A.; Huestis, G.M.; Bolton, S.M.

    2000-01-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified off-the-shelf classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model La-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a hot cell (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable, making this technology far superior to the traditional methods used before. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the gain of tremendously useful fundamental engineering data. Successful acquisition of these data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives

  20. Remote Laser Diffraction PSD Analyzer

    International Nuclear Information System (INIS)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2000-01-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified "off-the-shelf" classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model La-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a "hot cell" (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable, making this technology far superior to the traditional methods used before. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the gain of tremendously useful fundamental engineering data. Successful acquisition of these data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives
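Laser diffraction instruments of this kind report a binned volume distribution, from which summary statistics such as the median particle size (D50) are derived. A minimal sketch of that derivation under made-up example data (the bin edges simply span the instrument's stated 0.1 to 600 micron range, and linear interpolation is used for simplicity; log-scale interpolation is also common):

```python
# Sketch of deriving a median particle size (D50) from binned laser
# diffraction output. Bin edges and volume fractions are made-up example
# data, not measurements from the paper.
import numpy as np

def d50(bin_edges_um, volume_fractions):
    """Linearly interpolate the size at which cumulative volume reaches 50%."""
    cum = np.cumsum(volume_fractions)
    cum = cum / cum[-1]
    # the upper edge of each bin carries that bin's cumulative value
    return float(np.interp(0.5, np.concatenate(([0.0], cum)), bin_edges_um))

edges = [0.1, 1.0, 10.0, 100.0, 600.0]   # bin boundaries, microns
fractions = [0.05, 0.25, 0.40, 0.30]     # volume fraction per bin
print(f"D50 = {d50(edges, fractions):.1f} um")  # D50 = 55.0 um
```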

  1. Nuclear power plant analyzer

    International Nuclear Information System (INIS)

    Stritar, A.

    1986-01-01

    The development of Nuclear Power Plant Analyzers in the USA is described. There are two different types of Analyzers under development in the USA: the first at the Idaho and Los Alamos National Laboratories, the second at Brookhaven National Laboratory. The latter is described in detail. The computer hardware and the mathematical models of the reactor vessel thermal hydraulics are described. (author)

  2. Analyzing Peace Pedagogies

    Science.gov (United States)

    Haavelsrud, Magnus; Stenberg, Oddbjorn

    2012-01-01

    Eleven articles on peace education published in the first volume of the Journal of Peace Education are analyzed. This selection comprises peace education programs that have been planned or carried out in different contexts. In analyzing peace pedagogies as proposed in the 11 contributions, we have chosen network analysis as our method--enabling…

  3. Analyzing in the present

    DEFF Research Database (Denmark)

    Revsbæk, Line; Tanggaard, Lene

    2015-01-01

    The article presents a notion of “analyzing in the present” as a source of inspiration in analyzing qualitative research materials. The term emerged from extensive listening to interview recordings during everyday commuting to university campus. Paying attention to the way different parts of vari...

  4. Gearbox vibration diagnostic analyzer

    Science.gov (United States)

    1992-01-01

    This report describes the Gearbox Vibration Diagnostic Analyzer installed in the NASA Lewis Research Center's 500 HP Helicopter Transmission Test Stand to monitor gearbox testing. The vibration of the gearbox is analyzed using diagnostic algorithms to calculate a parameter indicating damaged components.

  5. Technology.

    Science.gov (United States)

    Online-Offline, 1998

    1998-01-01

    Focuses on technology, on advances in such areas as aeronautics, electronics, physics, the space sciences, as well as computers and the attendant progress in medicine, robotics, and artificial intelligence. Describes educational resources for elementary and middle school students, including Web sites, CD-ROMs and software, videotapes, books,…

  6. Miniature mass analyzer

    CERN Document Server

    Cuna, C; Lupsa, N; Cuna, S; Tuzson, B

    2003-01-01

    The paper presents the concepts of different mass analyzers that were specifically designed as small-dimension instruments able to detect the main environmental pollutants with great sensitivity and accuracy. Mass spectrometers are well-suited instruments for the chemical and isotopic analysis needed in environmental surveillance. Usually, this is done by sampling the soil, air or water followed by laboratory analysis. To avoid drawbacks caused by sample alteration during the sampling process and transport, 'in situ' analysis is preferred. Theoretically, any type of mass analyzer can be miniaturized, but some are more appropriate than others. Quadrupole mass filters and traps, magnetic sector, time-of-flight and ion cyclotron mass analyzers can be successfully shrunk; for each of them some performance is sacrificed, but one must know which parameters need to be kept unchanged. To satisfy the miniaturization criteria of the analyzer, it is necessary to use asymmetrical geometries, with ion beam obl...

  7. Changes in intraocular pressure values measured with noncontact tonometer (NCT), ocular response analyzer (ORA) and corvis scheimpflug technology tonometer (CST) in the early phase after small incision lenticule extraction (SMILE).

    Science.gov (United States)

    Shen, Yang; Su, Xiangjian; Liu, Xiu; Miao, Huamao; Fang, Xuejun; Zhou, Xingtao

    2016-11-18

    Corneal biomechanical properties are always compromised after corneal refractive surgeries, leading to underestimated intraocular pressure (IOP) readings that complicate the management of IOP. We investigated the changes in the postoperative baseline of IOP values measured with a noncontact tonometer (NCT), an ocular response analyzer (ORA) and Corvis Scheimpflug technology (CST) in the early phase after small incision lenticule extraction (SMILE). Twenty-two eyes (-6.76 ± 1.39 D) of 22 moderate and high myopes (28.36 ± 7.14 years, 12 male and 10 female) were involved in this prospective study. IOP values were measured using a non-contact tonometer (NCT-IOP), an ocular response analyzer (corneal-compensated IOP, IOPcc, and Goldmann-correlated IOP, IOPg) and a Corvis Scheimpflug technology tonometer (CST-IOP) preoperatively and at 20 min and 24 h postoperatively. Repeated measures analysis of variance (RM-ANOVA), Pearson's correlation analysis and multiple linear regression models (stepwise) were performed. The cut-off P value was 0.05. Except for IOPcc, the NCT-IOP, IOPg, and CST-IOP values significantly decreased after the SMILE procedure (all P values < 0.05). Multiple linear regression models (stepwise) showed that the practical post-operative IOP value was the main predictor of the theoretical post-operative NCT-IOP, IOPcc and IOPg values (all P values <0.001). The postoperative applanation time 1 (AT1) value (B = 8.079, t = 4.866, P < 0.001), preoperative central corneal thickness (CCT) value (B = 0.035, t = 2.732, P = 0.014) and postoperative peak distance (PD) value (B = 0.515, t = 2.176, P = 0.043) were the main predictors of the theoretical post-operative CST-IOP value. IOP values are underestimated when assessed after SMILE using NCT-IOP, IOPg and CST-IOP. The practical postoperative IOPcc value and theoretical post-operative CST-IOP value may be preferable for IOP assessment in the early phase after SMILE. Current Controlled

  8. Technology

    Directory of Open Access Journals (Sweden)

    Xu Jing

    2016-01-01

    Full Text Available The traditional answer-card reading method uses OMR (Optical Mark Reader) equipment, which requires special-purpose cards, is not very versatile, and has a high cost. Aiming at these existing problems, an answer-card identification method based on pattern recognition is proposed. A Line Segment Detector is used to detect the tilt of the image; where tilt exists, the image is corrected by rotation, eventually achieving positioning and detection of the answers on the answer sheet. Automatic reading with pattern recognition technology achieves high accuracy and faster detection.
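
    The tilt-correction step described above can be sketched as follows (illustrative only, not the paper's code; the endpoint coordinates are invented, and a real pipeline would obtain them from a line-segment detector such as OpenCV's LSD):

```python
import numpy as np

# Endpoints (x1, y1, x2, y2) of detected, nominally horizontal line segments.
# These numbers are made up for illustration.
segments = np.array([
    [10.0, 20.0, 210.0, 27.0],
    [12.0, 80.0, 212.0, 87.0],
])

# Angle of each segment; the median is a robust estimate of document skew.
angles = np.arctan2(segments[:, 3] - segments[:, 1],
                    segments[:, 2] - segments[:, 0])
skew = float(np.median(angles))          # radians

# Rotation by -skew deskews the image coordinates.
c, s = np.cos(-skew), np.sin(-skew)
R = np.array([[c, -s], [s, c]])

# Applying R to a segment's direction vector makes it horizontal again.
v = segments[0, 2:] - segments[0, :2]
v_corrected = R @ v
```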

  9. Extraction spectrophotometric analyzer

    International Nuclear Information System (INIS)

    Batik, J.; Vitha, F.

    1985-01-01

    Automation is discussed of extraction spectrophotometric determination of uranium in a solution. Uranium is extracted from accompanying elements in an HCl medium with a solution of tributyl phosphate in benzene. The determination is performed by measuring absorbance at 655 nm in a single-phase ethanol-water-benzene-tributyl phosphate medium. The design is described of an analyzer consisting of an analytical unit and a control unit. The analyzer performance promises increased productivity of labour, improved operating and hygiene conditions, and mainly more accurate results of analyses. (J.C.)

  10. Achievement report for fiscal 1997. Technological development for practical application of a solar energy power generation system (development of technology to manufacture thin film solar cells (surveys and researches on analyzing practical application )). Volume 1; 1997 nendo taiyoko hatsuden system jitsuyoka gijutsu kaihatsu seika hokokusho. Usumaku taiyo denchi no seizo gijutsu kaihatsu (jitsuyoka kaiseki ni kansuru chosa kenkyu)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    It is intended to identify and analyze quickly and accurately the technological trends inside and outside the country on thin film solar cells, to reflect the results effectively on research and development of practical application of the thin film solar cells for power use, and to aid the research on practical application of the technology to manufacture the thin film solar cells. This fiscal year introduced the new project of researching and developing the poly-crystal silicon-based thin film solar cells. Discussions were given on designing the solar cells, including setting of thickness of an active layer required to improve efficiency of the silicon-based thin film solar cells, the light confining technology, and surface passivation. Comparisons and discussions were given on the new amorphous/poly-crystal silicon thin film manufacturing method and the conventional plasma CVD process. A research development program was introduced for a super laboratory to aid establishing the practical application technology for the silicon-based thin film solar cells. Chalcopyrite compounds including CuInSe2, and CdTe have not shown deterioration even in a long-term outdoor exposure test, hence they are noted as materials for high-efficiency solar cells and studied actively. Although still small in area, the net conversion efficiency was found in the order of 17%. Technological development has started to search mass production processes and commercialization possibility in the future. (NEDO)

  11. Fuel analyzer; Analisador de combustiveis

    Energy Technology Data Exchange (ETDEWEB)

    Cozzolino, Roberval [RS Motors, Indaiatuba, SP (Brazil)

    2008-07-01

    The current technology, 'COMBUSTIMETRO', aims to examine fuel through the performance of an engine: the role of the fuel is to produce energy for the combustion engine, and the energy produced is directly proportional to the quality and type of the fuel. The 'COMBUSTIMETRO' has an engine that always keeps the same intake of air and fuel and a fixed ignition point. Its operation is monitored by sensors (lambda probe, RPM and gas analyzer) connected to a processor that performs calculations, records the information, and generates reports and graphs. (author)

  12. American options analyzed differently

    NARCIS (Netherlands)

    Nieuwenhuis, J.W.

    2003-01-01

    In this note we analyze American options in a discrete-time context with a finite outcome space, starting from the idea that every tradable should be a martingale under a certain measure. We believe that in this way American options become more understandable to people with a good working

  13. Analyzing Political Television Advertisements.

    Science.gov (United States)

    Burson, George

    1992-01-01

    Presents a lesson plan to help students understand that political advertisements often mislead, lie, or appeal to emotion. Suggests that the lesson will enable students to examine political advertisements analytically. Includes a worksheet to be used by students to analyze individual political advertisements. (DK)

  14. Centrifugal analyzer development

    International Nuclear Information System (INIS)

    Burtis, C.A.; Bauer, M.L.; Bostick, W.D.

    1976-01-01

    The development of the centrifuge fast analyzer (CFA) is reviewed. The development of a miniature CFA with computer data analysis is reported and applications for automated diagnostic chemical and hematological assays are discussed. A portable CFA system with microprocessor was adapted for field assays of air and water samples for environmental pollutants, including ammonia, nitrates, nitrites, phosphates, sulfates, and silica. 83 references

  15. Soft Decision Analyzer

    Science.gov (United States)

    Lansdowne, Chatwin; Steele, Glen; Zucha, Joan; Schlesinger, Adam

    2013-01-01

    We describe the benefit of using closed-loop measurements for a radio receiver paired with a counterpart transmitter. We show that real-time analysis of the soft decision output of a receiver can provide rich and relevant insight far beyond the traditional hard-decision bit error rate (BER) test statistic. We describe a Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in live-time during the development of software defined radios. This test technique gains importance as modern receivers are providing soft decision symbol synchronization as radio links are challenged to push more data and more protocol overhead through noisier channels, and software-defined radios (SDRs) use error-correction codes that approach Shannon's theoretical limit of performance.
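
    The contrast the abstract draws between the hard-decision BER statistic and the richer information in soft decisions can be sketched on a simulated BPSK link (a toy model, not the SDA implementation):

```python
import numpy as np

# Simulated BPSK link with additive Gaussian noise (illustrative assumptions).
rng = np.random.default_rng(1)
n = 100_000
bits = rng.integers(0, 2, n)
symbols = 2.0 * bits - 1.0                         # 0 -> -1, 1 -> +1
noise_sigma = 0.5
soft = symbols + rng.normal(0.0, noise_sigma, n)   # soft decisions from receiver

# Traditional hard-decision BER: slice at 0 and count errors.
hard = (soft > 0).astype(int)
ber = float(np.mean(hard != bits))

# Soft-decision statistics expose *margin*, not just errors: distance of each
# sample from the decision threshold on the correct side of the constellation.
margin = soft * symbols            # positive = correct side of threshold
mean_margin = float(margin.mean()) # ~1.0 for this channel
est_sigma = float(np.std(soft - symbols))  # noise estimate from soft outputs
```

Margin and noise statistics like these let an analyzer quantify implementation loss even when the BER alone looks acceptable.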

  16. KWU Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    Bennewitz, F.; Hummel, R.; Oelmann, K.

    1986-01-01

    The KWU Nuclear Plant Analyzer is a real time engineering simulator based on the KWU computer programs used in plant transient analysis and licensing. The primary goal is to promote the understanding of the technical and physical processes of a nuclear power plant at an on-site training facility. Thus the KWU Nuclear Plant Analyzer is available at comparably low cost right at the time when technical questions or training needs arise. This has been achieved by (1) application of the transient code NLOOP; (2) unrestricted operator interaction including all simulator functions; (3) using the mainframe computer Control Data Cyber 176 in the KWU computing center; (4) four color graphic displays controlled by a dedicated graphic computer, no control room equipment; and (5) coupling of computers by telecommunication via telephone

  17. Analyzed Using Statistical Moments

    International Nuclear Information System (INIS)

    Oltulu, O.

    2004-01-01

    Diffraction enhanced imaging (DEI) is a new x-ray imaging method derived from radiography. The method uses a monochromatic x-ray beam and introduces an analyzer crystal between the object and the detector. The narrow angular acceptance of the analyzer crystal generates improved contrast over conventional radiography. While standard radiography can produce an 'absorption image', DEI produces 'apparent absorption' and 'apparent refraction' images with superior quality. Objects with similar absorption properties may not be distinguished with conventional techniques due to close absorption coefficients. This problem becomes more dominant when an object has scattering properties. A simple approach is introduced to utilize scattered radiation to obtain 'pure absorption' and 'pure refraction' images

  18. Emission spectrometric isotope analyzer

    International Nuclear Information System (INIS)

    Mauersberger, K.; Meier, G.; Nitschke, W.; Rose, W.; Schmidt, G.; Rahm, N.; Andrae, G.; Krieg, D.; Kuefner, W.; Tamme, G.; Wichlacz, D.

    1982-01-01

    An emission spectrometric isotope analyzer has been designed for determining relative abundances of stable isotopes in gaseous samples in discharge tubes, in liquid samples, and in flowing gaseous samples. It consists of a high-frequency generator, a device for defined positioning of discharge tubes, a grating monochromator with oscillating slit and signal converter, signal generator, window discriminator, AND connection, read-out display, oscillograph, gas dosing device and chemical conversion system with carrier gas source and vacuum pump

  19. PhosphoSiteAnalyzer

    DEFF Research Database (Denmark)

    Bennetzen, Martin V; Cox, Jürgen; Mann, Matthias

    2012-01-01

    Phosphoproteomic experiments are routinely conducted in laboratories worldwide, and because of the fast development of mass spectrometric techniques and efficient phosphopeptide enrichment methods, researchers frequently end up having lists with tens of thousands of phosphorylation sites. The software uses an algorithm to retrieve kinase predictions from the public NetworKIN webpage in a semiautomated way and hereafter applies advanced statistics to facilitate a user-tailored in-depth analysis of the phosphoproteomic data sets. The interface of the software provides a high degree of analytical flexibility and is designed to be intuitive for most users. PhosphoSiteAnalyzer is a freeware program available at http://phosphosite.sourceforge.net ....

  20. Electrodynamic thermogravimetric analyzer

    International Nuclear Information System (INIS)

    Spjut, R.E.; Bar-Ziv, E.; Sarofim, A.F.; Longwell, J.P.

    1986-01-01

    The design and operation of a new device for studying single-aerosol-particle kinetics at elevated temperatures, the electrodynamic thermogravimetric analyzer (EDTGA), were examined theoretically and experimentally. The completed device consists of an electrodynamic balance modified to permit particle heating by a CO2 laser, temperature measurement by a three-color infrared-pyrometry system, and continuous weighing by a position-control system. In this paper, the position-control, particle-weight-measurement, heating, and temperature-measurement systems are described and their limitations examined

  1. Analyzing Chinese Financial Reporting

    Institute of Scientific and Technical Information of China (English)

    SABRINA; ZHANG

    2008-01-01

    If the world’s capital markets could use a harmonized accounting framework, it would not be necessary to compare two or more sets of accounting standards. However, there is much to do before this becomes reality. This article aims to present a general overview of China’s Generally Accepted Accounting Principles (GAAP), U.S. Generally Accepted Accounting Principles and International Financial Reporting Standards (IFRS), and to analyze the differences among IFRS, U.S. GAAP and China GAAP using fixed assets as an example.

  2. Inductive dielectric analyzer

    International Nuclear Information System (INIS)

    Agranovich, Daniel; Popov, Ivan; Ben Ishai, Paul; Feldman, Yuri; Polygalov, Eugene

    2017-01-01

    One of the approaches to bypass the problem of electrode polarization in dielectric measurements is the free electrode method. The advantage of this technique is that, the probing electric field in the material is not supplied by contact electrodes, but rather by electromagnetic induction. We have designed an inductive dielectric analyzer based on a sensor comprising two concentric toroidal coils. In this work, we present an analytic derivation of the relationship between the impedance measured by the sensor and the complex dielectric permittivity of the sample. The obtained relationship was successfully employed to measure the dielectric permittivity and conductivity of various alcohols and aqueous salt solutions. (paper)

  3. Plutonium solution analyzer

    International Nuclear Information System (INIS)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded)

  4. Multiple capillary biochemical analyzer

    Science.gov (United States)

    Dovichi, N.J.; Zhang, J.Z.

    1995-08-08

    A multiple capillary analyzer allows detection of light from multiple capillaries with a reduced number of interfaces through which light must pass in detecting light emitted from a sample being analyzed, using a modified sheath flow cuvette. A linear or rectangular array of capillaries is introduced into a rectangular flow chamber. Sheath fluid draws individual sample streams through the cuvette. The capillaries are closely and evenly spaced and held by a transparent retainer in a fixed position in relation to an optical detection system. Collimated sample excitation radiation is applied simultaneously across the ends of the capillaries in the retainer. Light emitted from the excited sample is detected by the optical detection system. The retainer is provided by a transparent chamber having inward slanting end walls. The capillaries are wedged into the chamber. One sideways dimension of the chamber is equal to the diameter of the capillaries and one end to end dimension varies from, at the top of the chamber, slightly greater than the sum of the diameters of the capillaries to, at the bottom of the chamber, slightly smaller than the sum of the diameters of the capillaries. The optical system utilizes optic fibers to deliver light to individual photodetectors, one for each capillary tube. A filter or wavelength division demultiplexer may be used for isolating fluorescence at particular bands. 21 figs.

  5. Plutonium solution analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).

  6. Trace impurity analyzer

    International Nuclear Information System (INIS)

    Schneider, W.J.; Edwards, D. Jr.

    1979-01-01

    The desirability for long-term reliability of large scale helium refrigerator systems used on superconducting accelerator magnets has necessitated detection of impurities to levels of a few ppM. An analyzer that measures trace impurity levels of condensable contaminants in concentrations of less than a ppM in 15 atm of He is described. The instrument makes use of the desorption temperature at an indicated pressure of the various impurities to determine the type of contaminant. The pressure rise at that temperature yields a measure of the contaminant level of the impurity. An LN2 cryogenic charcoal trap is also employed to measure air impurities (nitrogen and oxygen) to obtain the full range of contaminant possibilities. The results of this detector, which will be in use on the research and development helium refrigerator of the ISABELLE First-Cell, are described

  7. Analyzing Water's Optical Absorption

    Science.gov (United States)

    2002-01-01

    A cooperative agreement between World Precision Instruments (WPI), Inc., and Stennis Space Center has led to the UltraPath(TM) device, which provides a more efficient method for analyzing the optical absorption of water samples at sea. UltraPath is a unique, high-performance absorbance spectrophotometer with user-selectable light path lengths. It is an ideal tool for any study requiring precise and highly sensitive spectroscopic determination of analytes, either in the laboratory or the field. As a low-cost, rugged, and portable system capable of high-sensitivity measurements in widely divergent waters, UltraPath will help scientists examine the role that coastal ocean environments play in the global carbon cycle. UltraPath(TM) is a trademark of World Precision Instruments, Inc. LWCC(TM) is a trademark of World Precision Instruments, Inc.

  8. PDA: Pooled DNA analyzer

    Directory of Open Access Journals (Sweden)

    Lin Chin-Yu

    2006-04-01

    Full Text Available Abstract Background Association mapping using abundant single nucleotide polymorphisms is a powerful tool for identifying disease susceptibility genes for complex traits and exploring possible genetic diversity. Genotyping large numbers of SNPs individually is performed routinely but is cost prohibitive for large-scale genetic studies. DNA pooling is a reliable and cost-saving alternative genotyping method. However, no software has been developed for complete pooled-DNA analyses, including data standardization, allele frequency estimation, and single/multipoint DNA pooling association tests. This motivated the development of the software 'PDA' (Pooled DNA Analyzer) to analyze pooled DNA data. Results We developed the software, PDA, for the analysis of pooled-DNA data. PDA was originally implemented in the MATLAB® language, but it can also be executed on a Windows system without installing MATLAB®. PDA provides estimates of the coefficient of preferential amplification and allele frequency. PDA considers an extended single-point association test, which can compare allele frequencies between two DNA pools constructed under different experimental conditions. Moreover, PDA also provides novel chromosome-wide multipoint association tests based on p-value combinations and a sliding-window concept. This new multipoint testing procedure overcomes a computational bottleneck of conventional haplotype-oriented multipoint methods in DNA pooling analyses and can handle data sets having a large pool size and/or large numbers of polymorphic markers. All of the PDA functions are illustrated in four bona fide examples. Conclusion PDA is simple to operate and does not require that users have a strong statistical background. The software is available at http://www.ibms.sinica.edu.tw/%7Ecsjfann/first%20flow/pda.htm.
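
    One idea PDA implements, chromosome-wide multipoint tests based on p-value combination over a sliding window, can be sketched with Fisher's method (a minimal illustration, not PDA's exact statistic; the p-values below are invented). For even degrees of freedom 2k, the chi-square survival function has the closed form sf(x) = exp(-x/2) * sum_{i<k} (x/2)^i / i!, so no statistics library is needed:

```python
import math

def fisher_combine(pvals):
    """Fisher's method: -2 * sum(ln p_i) ~ chi-square with 2*len(pvals) df."""
    x = -2.0 * sum(math.log(p) for p in pvals)
    k = len(pvals)
    half = x / 2.0
    term, total = 1.0, 1.0
    for i in range(1, k):          # closed-form chi-square sf for 2k df
        term *= half / i
        total += term
    return math.exp(-half) * total

def sliding_window(pvals, width):
    """Combined p-value for each window of adjacent markers."""
    return [fisher_combine(pvals[i:i + width])
            for i in range(len(pvals) - width + 1)]

# Invented per-marker p-values along a chromosome; a run of small values
# around markers 2-4 should produce the smallest combined window p-value.
pv = [0.9, 0.8, 0.01, 0.02, 0.03, 0.7]
combined = sliding_window(pv, 3)
```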

  9. A neutron activation analyzer

    International Nuclear Information System (INIS)

    Westphal, G.P.; Lemmel, H.; Grass, F.; De Regge, P.P.; Burns, K.; Markowicz, A.

    2005-01-01

    Dubbed 'Analyzer' because of its simplicity, a neutron activation analysis facility for short-lived isomeric transitions is based on a low-cost rabbit system and an adaptive digital filter which are controlled by a software performing irradiation control, loss-free gamma-spectrometry, spectra evaluation, nuclide identification and calculation of concentrations in a fully automatic flow of operations. Designed for TRIGA reactors and constructed from inexpensive plastic tubing and an aluminum in-core part, the rabbit system features samples of 5 ml and 10 ml with sample separation at 150 ms and 200 ms transport time, or 25 ml samples without separation at a transport time of 300 ms. By automatically adapting shaping times to pulse intervals the preloaded digital filter gives best throughput at best resolution up to input counting rates of 10^6 cps. Loss-free counting enables quantitative correction of counting losses of up to 99%. As a test of system reproducibility in sample separation geometry, K, Cl, Mn, Mg, Ca, Sc, and V have been determined in various reference materials in excellent agreement with consensus values. (author)

  10. Downhole Fluid Analyzer Development

    Energy Technology Data Exchange (ETDEWEB)

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components--light sources, sensors, detection electronics and software--will be located at the surface, and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with a less than 5 percent standard error. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.
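
    The core inference step above, recovering oil/water/gas fractions from a measured spectrum, can be sketched as linear unmixing against known pure-component spectra (a strong simplification of NIR spectroscopy plus fluorescence; all spectra and fractions below are invented for illustration):

```python
import numpy as np

# Pure-component absorbance spectra at 6 wavelengths (rows: oil, water, gas).
# Values are made up; a real system would calibrate these per fluid.
pure = np.array([
    [0.90, 0.70, 0.20, 0.10, 0.40, 0.80],   # oil
    [0.10, 0.20, 0.90, 0.80, 0.30, 0.20],   # water
    [0.05, 0.10, 0.10, 0.20, 0.90, 0.10],   # gas
])

# A mixture spectrum is approximately a fraction-weighted sum of the pure
# spectra (Beer-Lambert linearity assumed, no noise in this sketch).
true_frac = np.array([0.6, 0.3, 0.1])
measured = true_frac @ pure

# Least squares recovers the phase fractions from the measured spectrum.
frac, *_ = np.linalg.lstsq(pure.T, measured, rcond=None)
```

A production system would use non-negative, sum-to-one constrained fitting and handle noise, but the linear-unmixing idea is the same.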

  11. Climate Model Diagnostic Analyzer

    Science.gov (United States)

    Lee, Seungwon; Pan, Lei; Zhai, Chengxing; Tang, Benyang; Kubar, Terry; Zhang, Zia; Wang, Wei

    2015-01-01

    The comprehensive and innovative evaluation of climate models with newly available global observations is critically needed for the improvement of climate model current-state representation and future-state predictability. A climate model diagnostic evaluation process requires physics-based multi-variable analyses that typically involve large-volume and heterogeneous datasets, making them both computation- and data-intensive. With an exploratory nature of climate data analyses and an explosive growth of datasets and service tools, scientists are struggling to keep track of their datasets, tools, and execution/study history, let alone sharing them with others. In response, we have developed a cloud-enabled, provenance-supported, web-service system called Climate Model Diagnostic Analyzer (CMDA). CMDA enables the physics-based, multivariable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. At the same time, CMDA provides a crowd-sourcing space where scientists can organize their work efficiently and share their work with others. CMDA is empowered by many current state-of-the-art software packages in web service, provenance, and semantic search.

  12. Analyzing Visibility Configurations.

    Science.gov (United States)

    Dachsbacher, C

    2011-04-01

    Many algorithms, such as level of detail rendering and occlusion culling methods, make decisions based on the degree of visibility of an object, but do not analyze the distribution, or structure, of the visible and occluded regions across surfaces. We present an efficient method to classify different visibility configurations and show how this can be used on top of existing methods based on visibility determination. We adapt co-occurrence matrices for visibility analysis and generalize them to operate on clusters of triangular surfaces instead of pixels. We employ machine learning techniques to reliably classify the thus extracted feature vectors. Our method allows perceptually motivated level of detail methods for real-time rendering applications by detecting configurations with expected visual masking. We exemplify the versatility of our method with an analysis of area light visibility configurations in ray tracing and an area-to-area visibility analysis suitable for hierarchical radiosity refinement. Initial results demonstrate the robustness, simplicity, and performance of our method in synthetic scenes, as well as real applications.
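
    Generalizing co-occurrence matrices from pixels to clusters of triangles can be sketched as counting visibility-label pairs over cluster adjacencies (toy data, not the paper's implementation):

```python
import numpy as np

# Visibility label per surface cluster: 0 = occluded, 1 = visible.
# Labels and adjacency below are invented for illustration.
labels = np.array([1, 1, 0, 0, 1, 0])
neighbors = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]  # adjacent cluster pairs

# Symmetric 2x2 co-occurrence matrix over neighboring clusters.
cooc = np.zeros((2, 2), dtype=int)
for a, b in neighbors:
    cooc[labels[a], labels[b]] += 1
    cooc[labels[b], labels[a]] += 1

# Normalized, the matrix is a feature vector describing the visibility
# configuration (e.g. many 0-1 transitions = finely interleaved occlusion),
# which a classifier can then consume.
feature = (cooc / cooc.sum()).ravel()
```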

  13. Analyzing Pseudophosphatase Function.

    Science.gov (United States)

    Hinton, Shantá D

    2016-01-01

    Pseudophosphatases regulate signal transduction cascades, but their mechanisms of action remain enigmatic. Reflecting this mystery, the prototypical pseudophosphatase STYX (phospho-serine-threonine/tyrosine-binding protein) was named with allusion to the river of the dead in Greek mythology to emphasize that these molecules are "dead" phosphatases. Although proteins with STYX domains do not catalyze dephosphorylation, this in no way precludes their having other functions as integral elements of signaling networks. Thus, understanding their roles in signaling pathways may mark them as potential novel drug targets. This chapter outlines common strategies used to characterize the functions of pseudophosphatases, using as an example MK-STYX [mitogen-activated protein kinase (MAPK) phospho-serine-threonine/tyrosine binding], which has been linked to tumorigenesis, apoptosis, and neuronal differentiation. We start with the importance of "restoring" (when possible) phosphatase activity in a pseudophosphatase so that the active mutant may be used as a comparison control throughout immunoprecipitation and mass spectrometry analyses. To this end, we provide protocols for site-directed mutagenesis, mammalian cell transfection, co-immunoprecipitation, phosphatase activity assays, and immunoblotting that we have used to investigate MK-STYX and the active mutant MK-STYXactive. We also highlight the importance of utilizing RNA interference (RNAi) "knockdown" technology to determine a cellular phenotype in various cell lines. Therefore, we outline our protocols for introducing short hairpin RNA (shRNA) expression plasmids into mammalian cells and quantifying knockdown of gene expression with real-time quantitative PCR (qPCR). A combination of cellular, molecular, biochemical, and proteomic techniques has served as powerful tools in identifying novel functions of the pseudophosphatase MK-STYX. Likewise, the information provided here should be a helpful guide to elucidating the
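
    The qPCR quantification of knockdown mentioned above is commonly done with the 2^-ΔΔCt method; a minimal sketch with invented Ct values (the abstract does not give its numbers):

```python
# Relative expression by the standard 2^-delta-delta-Ct method:
# normalize the target gene's Ct to a reference (housekeeping) gene in both
# knockdown and control samples, then compare the two normalized values.
def ddct_relative_expression(ct_target_kd, ct_ref_kd,
                             ct_target_ctrl, ct_ref_ctrl):
    d_ct_kd = ct_target_kd - ct_ref_kd       # delta-Ct in knockdown sample
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl # delta-Ct in control sample
    return 2.0 ** -(d_ct_kd - d_ct_ctrl)     # 2^-ddCt

# Invented example: target amplifies 2 cycles later after shRNA knockdown,
# reference gene unchanged -> 25% residual expression (75% knockdown).
residual = ddct_relative_expression(26.0, 18.0, 24.0, 18.0)
```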

  14. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for whom drawing blood for continuous tests can be life-threatening in its own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites, was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a., lab-on-a-printed-circuit board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central lab-based chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  15. Some Aspects of Nonlinearity and Self-Organization In Biosystems on Examples of Localized Excitations in the DNA Molecule and Generalized Fisher–KPP Model

    Directory of Open Access Journals (Sweden)

    A. V. Shapovalov

    2018-02-01

    Full Text Available This review deals with ideas and approaches to nonlinear phenomena, based on different branches of physics and related to biological systems, that focus on how small impacts can significantly change the state of the system at large spatial scales. This problem is very extensive, and it cannot be fully resolved in this paper. Instead, some selected physical effects are briefly reviewed. We consider sine-Gordon solitons and nonlinear Schrödinger solitons in some models of DNA as examples of self-organization at the molecular level, as well as examine features of their formation and dynamics under external influences. In addition, the formation of patterns in the generalized Fisher–KPP model is viewed as a simple example of self-organization in a system with nonlocal interaction at the cellular level. Symmetries of the model equations are employed to analyze the considered nonlinear phenomena. In this context the possible relations between the phenomena considered and the released activity effect, which is assessed differently in the literature, are discussed.
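
    The sine-Gordon kink mentioned above has the well-known closed form phi(x, t) = 4 arctan(exp((x - v t) / sqrt(1 - v^2))); a quick numerical check by finite differences (illustrative only, parameters chosen arbitrarily) confirms that it satisfies phi_tt - phi_xx + sin(phi) = 0:

```python
import numpy as np

# Moving sine-Gordon kink (units with c = 1); v is the kink velocity.
def kink(x, t, v=0.3):
    return 4.0 * np.arctan(np.exp((x - v * t) / np.sqrt(1.0 - v * v)))

# Second-order central differences for the second derivatives.
h = 1e-3
x = np.linspace(-5.0, 5.0, 101)
t = 0.7

phi_tt = (kink(x, t + h) - 2 * kink(x, t) + kink(x, t - h)) / h**2
phi_xx = (kink(x + h, t) - 2 * kink(x, t) + kink(x - h, t)) / h**2

# Residual of the sine-Gordon equation; should vanish up to discretization
# and roundoff error everywhere on the grid.
residual = phi_tt - phi_xx + np.sin(kink(x, t))
```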

  16. Systems Analyze Water Quality in Real Time

    Science.gov (United States)

    2010-01-01

    A water analyzer developed under Small Business Innovation Research (SBIR) contracts with Kennedy Space Center now monitors treatment processes at water and wastewater facilities around the world. Originally designed to provide real-time detection of nutrient levels in hydroponic solutions for growing plants in space, the ChemScan analyzer, produced by ASA Analytics Inc., of Waukesha, Wisconsin, utilizes spectrometry and chemometric algorithms to automatically analyze multiple parameters in the water treatment process with little need for maintenance, calibration, or operator intervention. The company has experienced a compound annual growth rate of 40 percent over its 15-year history as a direct result of the technology's success.

  17. Analyzing the Biology on the System Level

    OpenAIRE

    Tong, Wei

    2016-01-01

    Although various genome projects have provided us enormous static sequence information, understanding of the sophisticated biology continues to require integrating the computational modeling, system analysis, technology development for experiments, and quantitative experiments all together to analyze the biology architecture on various levels, which is just the origin of systems biology subject. This review discusses the object, its characteristics, and research attentions in systems biology,...

  18. Portable Programmable Multifunction Body Fluids Analyzer, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced Liquid Logic proposes to develop a very capable analyzer based on its digital microfluidic technology. Such an analyzer would be:  Capable of both simple...

  19. Clustering biomass-based technologies towards zero emissions - a tool how the Earth's resources can be shifted back to sustainability

    International Nuclear Information System (INIS)

    Gravitis, J.; Pauli, G.

    2001-01-01

    The Zero Emissions Research Initiative (ZERI) was founded on the fundamental concept that, in order to achieve environmentally sustainable development, industries must maximize the use of available raw materials and utilize their own wastes and by-products to the fullest extent possible so as to eliminate all emissions into the air, water and soil. Research focuses on what are considered to be four central components of zero emissions biobased industries: (I) integrated biosystems, (II) materials separation technologies, (III) biorefinery, and (IV) zero emissions systems design. In this way, industries may be organized into clusters within one single system, or in interdependent sets of industries. (authors)

  20. Multichannel analyzer development in CAMAC

    International Nuclear Information System (INIS)

    Nagy, J.Z.; Zarandy, A.

    1988-01-01

    For data acquisition in TOKAMAK experiments, several CAMAC modules have been developed. The modules are the following: a 64 K analyzer memory, a 32 K analyzer memory, and a 6-channel pulse peak analyzer which contains the 32 K analyzer memory and eight AD converters.

  1. Human Performance and Biosystems (Spring Review)

    Science.gov (United States)

    2014-03-01

    Approved for public release; distribution is unlimited. Recoverable topics from the briefing slides: areas of emphasis include biofilms and nanowires (microbe communication, extracellular electron transfer, cyborg cells), artificial photosynthesis, algal oil generation, tDCS, and biomarkers; program interactions include BRI magnetic navigation, microbes/nanowires, tDCS/cyborg cells, and synthetic biology.

  2. Cancer: a profit-driven biosystem?

    Science.gov (United States)

    Deisboeck, Thomas S

    2008-08-01

    The argument is made that solid malignant tumors behave as profit-driven biological systems in that they expand their nutrient-uptaking surface to increase energetic revenue, at a comparably low metabolic cost. Within this conceptual framework, cancer cell migration is a critical mechanism as it maximizes systemic surface expansion while minimizing diffusion distance. Treating these tumor systems with adjuvant anti-proliferative regimen only should increase the energetic net gain of the viable cancer cells left behind, hence would facilitate tumor recurrence. Therapeutic attempts to better control tumor (re)growth should therefore aim primarily at containing its surface expansion, thus reducing its energetic revenue, or increasing its metabolic costs or better yet, both.

  3. Department of Agricultural and Biosystems Engineering

    African Journals Online (AJOL)

    USER

    2015-01-03

    Jan 3, 2015 ... probability that an implement will complete a specific task under ... workers. Frequency of breakdown of equipment and high cost of purchasing new ... adjustment on tillage implement during ... Cumulative Distribution.

  4. A tandem parallel plate analyzer

    International Nuclear Information System (INIS)

    Hamada, Y.; Fujisawa, A.; Iguchi, H.; Nishizawa, A.; Kawasumi, Y.

    1996-11-01

    By a new modification of a parallel plate analyzer, the second-order focus is obtained at an arbitrary injection angle. This kind of analyzer with a small injection angle will have the advantage of a small operating voltage, compared to the Proca and Green analyzer, where the injection angle is 30 degrees. Thus, the newly proposed analyzer will be very useful for precise energy measurements of high-energy particles in the MeV range. (author)

  5. BWR plant analyzer development at BNL

    International Nuclear Information System (INIS)

    Cheng, H.S.; Wulff, W.; Mallen, A.N.; Lekach, S.V.; Stritar, A.; Cerbone, R.J.

    1985-01-01

    Advanced technology for high-speed interactive nuclear power plant simulations is of great value for timely resolution of safety issues, for plant monitoring, and for computer-aided emergency responses to an accident. Presented is the methodology employed at BNL to develop a BWR plant analyzer capable of simulating severe plant transients at much faster than real-time process speeds. Five modeling principles are established and a criterion is given for selecting numerical procedures and efficient computers to achieve the very high simulation speeds. Typical results are shown to demonstrate the modeling fidelity of the BWR plant analyzer

  6. Development of pulse neutron coal analyzer

    International Nuclear Information System (INIS)

    Jing Shiwie; Gu Deshan; Qiao Shuang; Liu Yuren; Liu Linmao; Jing Shiwei

    2005-01-01

    This article describes the development of a pulsed neutron coal analyzer based on pulsed fast-thermal neutron analysis technology at the Radiation Technology Institute of Northeast Normal University. A 14 MeV pulsed neutron generator, a bismuth germanate detector and a 4096-channel analyzer were applied in this system. Multiple linear regression was employed to process the data, resolving the problem of interference among multiple elements. The prototype (model MZ-MKFY) had been applied in the Changshan and Jilin power plants for about a year. The results of measuring the main parameters of coal, such as calorific value, total moisture, ash content, volatile matter, and sulfur content, with precision acceptable to the coal industry, are presented.
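The interference correction described above amounts to an ordinary multiple linear regression of a coal parameter on several elemental signals. A minimal, self-contained sketch (all counts, coefficients, and the three-element setup are synthetic assumptions, not data from the MZ-MKFY instrument):

```python
import numpy as np

# Synthetic illustration of interference correction by multiple linear
# regression: estimate a coal parameter (say, calorific value) from
# gamma peak counts of several elements. All numbers are made up.
rng = np.random.default_rng(0)
X = rng.uniform(100.0, 1000.0, size=(50, 3))     # peak counts for 3 elements
true_coef = np.array([0.02, 0.05, -0.01])        # assumed sensitivities
y = X @ true_coef + 1.5 + rng.normal(0.0, 0.05, size=50)  # + intercept + noise

# Least-squares fit with an intercept column appended to the design matrix
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
residual = y - A @ coef
```

With enough calibration samples, the fitted coefficients recover the per-element sensitivities despite the overlapping signals, which is the essence of the multi-element interference problem the abstract mentions.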

  7. Evaluation of the Air Void Analyzer

    Science.gov (United States)

    2013-07-01

    concrete using image analysis: Petrography of cementitious materials. ASTM STP 1215. S.M. DeHayes and D. Stark, eds. Philadelphia, PA: American... Administration (FHWA). 2006. Priority, market-ready technologies and innovations: Air Void Analyzer. Washington D.C. PDF file. Germann Instruments (GI). 2011... tests and properties of concrete and concrete-making materials. STP 169D. West Conshohocken, PA: ASTM International. Magura, D.D. 1996. Air void

  8. Research on environmental bioecosensing technology using ecological information; Seitaikei joho ni yoru kankyo bio eco sensing gijutsu ni kansuru chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-03-01

    Bioecosensing technology, which detects and identifies faint signals generated by biosystem communication in the wider biological environment, was studied. The following were reported as current notable environmental biosensing technologies: quick measurement of environmental contaminants using immunological methods, analysis of the ecological state of microorganisms using DNA probes, observation of ecosystems by bioluminescent systems, measurement of environmental changes and contaminants using higher animals and plants, and detection of chemical contaminants using the chemotaxis of microorganisms. As a result, a new bioecosensing/monitoring technology at the molecular level was suggested for identifying comprehensive environmental changes which could not be measured by previous physical and chemical methods, as changes in ecosystems corresponding to environmental changes. As a wide-area remote sensing technology for environmental ecological information, sensing from the ground, aircraft and satellites was also discussed. 247 refs., 55 figs., 17 tabs.

  9. Digital Multi Channel Analyzer Enhancement

    International Nuclear Information System (INIS)

    Gonen, E.; Marcus, E.; Wengrowicz, U.; Beck, A.; Nir, J.; Sheinfeld, M.; Broide, A.; Tirosh, D.

    2002-01-01

    A cement analyzing system based on radiation spectroscopy had been developed [1], using a novel digital approach for a real-time, high-throughput and low-cost Multi Channel Analyzer. The performance of the developed system had a severe problem: the resulting spectrum suffered from a lack of smoothness; it was very noisy and full of spikes and surges, and it was therefore impossible to use this spectrum for analyzing the cement substance. This paper describes the work carried out to improve the system performance.

  10. PM 3655 PHILIPS Logic analyzer

    CERN Multimedia

    A logic analyzer is an electronic instrument that captures and displays multiple signals from a digital system or digital circuit. A logic analyzer may convert the captured data into timing diagrams, protocol decodes, state machine traces, assembly language, or may correlate assembly with source-level software. Logic Analyzers have advanced triggering capabilities, and are useful when a user needs to see the timing relationships between many signals in a digital system.

  11. Multichannel analyzer type CMA-3

    International Nuclear Information System (INIS)

    Czermak, A.; Jablonski, J.; Ostrowicz, A.

    1978-01-01

    Multichannel analyzer CMA-3 is designed for two-parametric analysis with operator-controlled logical windows. It is implemented in the CAMAC standard. A single crate contains all the required modules and is controlled by a PDP-11/10 minicomputer. The configuration of CMA-3 is shown. CMA-3 is the next version of the multichannel analyzer described in report No. 958/E-8. (author)

  12. Analyzing data files in SWAN

    CERN Document Server

    Gajam, Niharika

    2016-01-01

    Traditionally analyzing data happens via batch-processing and interactive work on the terminal. The project aims to provide another way of analyzing data files: A cloud-based approach. It aims to make it a productive and interactive environment through the combination of FCC and SWAN software.

  13. A framework to analyze emissions implications of ...

    Science.gov (United States)

    Future year emissions depend highly on the evolution of the economy, technology, and current and future regulatory drivers. A scenario framework was adopted to analyze various technology development pathways and societal changes while considering existing regulations and future uncertainty in regulations, and to evaluate the resulting emissions growth patterns. The framework integrates EPA's energy systems model with an economic Input-Output (I/O) Life Cycle Assessment model. The EPAUS9r MARKAL database is assembled from a set of technologies to represent the U.S. energy system within the MARKAL bottom-up, technology-rich energy modeling framework. The general state of the economy and the consequent demands for goods and services from these sectors are taken exogenously in MARKAL. It is important to characterize exogenous inputs about the economy to appropriately represent the industrial sector outlook for each of the scenarios and case studies evaluated. An economic input-output (I/O) model of the US economy is constructed to link up with MARKAL. The I/O model enables users to change input requirements (e.g. energy intensity) for different sectors or the share of consumer income expended on a given good. This gives end-users a mechanism for modeling change in the two dimensions of technological progress and consumer preferences that define the future scenarios. The framework will then be extended to include an environmental I/O framework to track life cycle emissions associated
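At its core, the economic I/O model described above is a Leontief system: total sector output x solves x = Ax + d for a technical-coefficients matrix A and final demand d, so changing an input requirement means editing an entry of A. A toy two-sector sketch (sector names and all coefficients are invented for illustration, not from the EPA framework):

```python
import numpy as np

# A[i, j] = input from sector i required per unit output of sector j
A = np.array([[0.1, 0.2],     # energy -> (energy, manufacturing)
              [0.3, 0.1]])    # manufacturing -> (energy, manufacturing)
d = np.array([100.0, 200.0])  # final demand for each sector's output

# Total output needed to satisfy final demand: x = (I - A)^-1 d
x = np.linalg.solve(np.eye(2) - A, d)

# Lowering the energy intensity of manufacturing (entry A[0, 1]) reduces
# the total energy output required -- the kind of user-editable lever
# the abstract describes.
A2 = A.copy()
A2[0, 1] = 0.1
x2 = np.linalg.solve(np.eye(2) - A2, d)
```

Life-cycle emissions then follow by multiplying the total outputs x by per-sector emission factors, which is how an I/O model tracks indirect (supply-chain) emissions alongside direct ones.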

  14. [Automated analyzer of enzyme immunoassay].

    Science.gov (United States)

    Osawa, S

    1995-09-01

    Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibodies or enzymes, the detection method, the number of tests per unit time, and the analytical time and speed per run. In practice, it is important to consider several points, such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I will describe recent advances in automated analyzers, reviewing their labeling antibodies and enzymes, detection methods, the number of tests per unit time, and analytical time and speed per test.

  15. DEMorphy, German Language Morphological Analyzer

    OpenAIRE

    Altinok, Duygu

    2018-01-01

    DEMorphy is a morphological analyzer for German. It is built on large, compactified lexicons from the German Morphological Dictionary. A guesser based on German declension suffixes is also provided. For German, we provide a state-of-the-art morphological analyzer. DEMorphy is implemented in Python with ease of usability and accompanying documentation. The package is suitable for both academic and commercial purposes with a permissive license.

  16. A Categorization of Dynamic Analyzers

    Science.gov (United States)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model and monitor programs provides a framework for understanding a program's dynamic behavior from different perspectives through analysis of the input
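A tiny example of the first category, a run-time "metrics" analyzer, can be sketched in Python: counting function calls by installing the interpreter's global trace hook. The use of `sys.settrace` and the traced `fib` function are illustrative assumptions, not tooling from the paper.

```python
import sys
from collections import Counter

calls = Counter()  # function name -> number of observed calls

def tracer(frame, event, arg):
    # The global trace hook fires on every Python-level function call
    if event == "call":
        calls[frame.f_code.co_name] += 1
    return None  # no per-line tracing needed for a call counter

def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

sys.settrace(tracer)
result = fib(5)
sys.settrace(None)
# calls["fib"] now holds the observed run-time call count for fib
```

Because the count depends entirely on the program's execution with a particular input, no static inspection of the source could produce it, which is exactly the distinction the paper draws between static and dynamic analysis.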

  17. CSTT Update: Fuel Quality Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Brosha, Eric L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lujan, Roger W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mukundan, Rangachary [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockward, Tommy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Romero, Christopher J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Stefan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wilson, Mahlon S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-06

    These are slides from a presentation. The following topics are covered: project background (scope and approach), developing the prototype (timeline), update on intellectual property, analyzer comparisons (improving humidification, stabilizing the baseline, applying clean-up strategy, impact of ionomer content and improving clean-up), proposed operating mode, considerations for testing in real-world conditions (Gen 1 analyzer electronics development, testing partner identified, field trial planning), summary, and future work.

  18. Using GIS technology to analyze and understand wet meadow ecosystems

    Science.gov (United States)

    Joy Rosen; Roy Jemison; David Pawelek; Daniel Neary

    1999-01-01

    A Cibola National Forest wet meadow restoration was implemented as part of the Forest Road 49 enhancement near Grants, New Mexico. An Arc/View 3.0 Geographic Information System (GIS) was used to track the recovery of this ecosystem. Layers on topography, hydrology, vegetation, soils and human alterations were compiled using a GPS and commonly available data....

  19. M-Learning and Technological Literacy: Analyzing Benefits for Apprenticeship

    Science.gov (United States)

    Cortés, Carlos Manuel Pacheco; Cortés, Adriana Margarita Pacheco

    2014-01-01

    The following study consists of a comparative literature review conducted by several researchers and instructional designers, for a wide comprehension of Mobile Learning (abbreviated "M-Learning") as an educational platform to provide "anytime-anywhere" access to interactions and resources on-line, and "Technological…

  20. Standard Analyzer of VHDL Applications for Next Generation Technology (SAVANT)

    National Research Council Canada - National Science Library

    Hirsch, Herbert

    1995-01-01

    ... (IF) for the exchange of VHDL encoded electronic data among CAD systems. This absence has severely constrained basic research environments, and has precipitated the current sub-optimal nature of CAD in VHDL tool development...

  1. Achievement report for fiscal 1997 on developing a silicon manufacturing process with reduced energy consumption. Investigation and research on analyzing practical application of a technology to manufacture solar cell silicon raw materials; 1997 nendo energy shiyo gorika silicon seizo process kaihatsu. Taiyo denchi silicon genryo seizo gijutsu no jitsuyoka kaiseki ni kansuru chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    This paper describes the fiscal 1997 achievements in analyzing the practical application of a technology to manufacture solar cell silicon raw materials. Silicon consumption for solar cells in fiscal 1997 increased to the 2,000-ton level, and the supply has been very tight. For drastic improvement in the supply and demand situation, development of SOG-Si manufacturing technology and its early practical application are desired. The development of the NEDO mass-production technology using melting and refining will complete construction of the process facilities in fiscal 1998 and enter the stage of operational research. However, a lack of basic data about the behavior of impurities is inhibiting the development. In the substrate manufacturing technology, discussions have progressed on the use of diverse off-specification silicons via the electromagnetic casting process. For slicing and processing the substrates, development of high-performance slicing equipment and an automatic rough rinsing machine is under way. The properties required of silicon raw materials vary considerably because of differences in cell making systems and conditions, which is attributable to unknown impurity behavior. When 1 GW production is assumed, the cell module manufacturing cost is calculated as 137 yen/W, for which low-cost mass production, enhanced slicing productivity, and further cost reduction are required. The paper also describes site surveys in overseas countries. (NEDO)

  2. On-Demand Urine Analyzer

    Science.gov (United States)

    Farquharson, Stuart; Inscore, Frank; Shende, Chetan

    2010-01-01

    A lab-on-a-chip was developed that is capable of extracting biochemical indicators from urine samples and generating their surface-enhanced Raman spectra (SERS) so that the indicators can be quantified and identified. The development was motivated by the need to monitor and assess the effects of extended weightlessness, which include space motion sickness and loss of bone and muscle mass. The results may lead to developments of effective exercise programs and drug regimes that would maintain astronaut health. The analyzer containing the lab-on-a-chip includes materials to extract 3-methylhistidine (a muscle-loss indicator) and Risedronate (a bone-loss indicator) from the urine sample and detect them at the required concentrations using a Raman analyzer. The lab-on-a-chip has both an extractive material and a SERS-active material. The analyzer could be used to monitor the onset of diseases, such as osteoporosis.

  3. Device for analyzing a solution

    International Nuclear Information System (INIS)

    Marchand, Joseph.

    1978-01-01

    The device enables a solution containing an antigen to be analyzed by the radio-immunology technique without coming up against the problem of separating the antigen-antibody complex from the free antigen. This device, for analyzing a solution containing a biological compound capable of reacting with an antagonistic compound specific to the biological compound, features a tube closed at its bottom end and a component set and immobilized in the bottom of the tube so as to leave a capacity between the bottom of the tube and its lower end. The component has a large developed surface and is so shaped that it allows the solution to be analyzed to have access to the bottom of the tube; it is made of a material having some elastic deformation and able to take up a given quantity of the biological compound or of the antagonistic compound specific to the biological compound [fr

  4. Multichannel analyzer embedded in FPGA

    International Nuclear Information System (INIS)

    Garcia D, A.; Hernandez D, V. M.; Vega C, H. R.; Ordaz G, O. O.; Bravo M, I.

    2017-10-01

    Ionizing radiation has different applications, so it is a very significant and useful tool, which in turn can be dangerous for living beings if they are exposed to uncontrolled doses. However, due to its characteristics, it cannot be perceived by any of the human senses, so radiation detectors and additional devices are required to detect, quantify and classify it. A multichannel analyzer is responsible for sorting the different pulse heights that are generated in the detectors into a certain number of channels, according to the number of bits of the analog-to-digital converter. The objective of the work was to design and implement a multichannel analyzer and its associated virtual instrument for nuclear spectrometry. The components of the multichannel analyzer were created in the VHDL hardware description language and packaged in the Xilinx Vivado design suite, making use of resources such as the ARM processing core contained in the Zynq System on Chip; the virtual instrument was developed on the LabView graphical programming platform. The first phase was to design the hardware architecture to be embedded in the FPGA, and for the internal control of the multichannel analyzer the application for the ARM processor was generated in C language. In the second phase, the virtual instrument was developed for the management, control and visualization of the results. The data obtained as a result of the development of the system were displayed graphically in a histogram showing the measured spectrum. The design of the multichannel analyzer embedded in the FPGA was tested with two different radiation detection systems (hyper-pure germanium and scintillation), which showed that the spectra obtained are comparable with those of commercial multichannel analyzers. (Author)
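The core operation the abstract describes, sorting pulse heights into 2^n channels set by the ADC resolution, can be sketched in a few lines. A hedged software illustration only (the synthetic Gaussian "photopeak", the 10-bit resolution, and the 0-1 V input range are all assumptions, not details of the FPGA design):

```python
import numpy as np

# Sketch of the heart of a multichannel analyzer: pulse heights are
# binned into 2**n_bits histogram channels (synthetic data).
n_bits = 10                       # assumed ADC resolution -> 1024 channels
n_channels = 2 ** n_bits
rng = np.random.default_rng(1)
# Synthetic detector pulses: a Gaussian peak at 0.662 (in a 0-1 V range),
# loosely evoking a 662 keV photopeak
pulse_heights = rng.normal(0.662, 0.02, size=10_000)

# Map each height in [0, 1) onto a channel number and accumulate counts
channels = np.clip((pulse_heights * n_channels).astype(int), 0, n_channels - 1)
spectrum = np.bincount(channels, minlength=n_channels)
peak_channel = int(spectrum.argmax())
```

Plotting `spectrum` against channel number gives exactly the kind of histogram the virtual instrument displays; energy calibration then maps channel numbers to keV.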

  5. Analyzing petabytes of data with Hadoop

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    Abstract The open source Apache Hadoop project provides a powerful suite of tools for storing and analyzing petabytes of data using commodity hardware. After several years of production use inside of web companies like Yahoo! and Facebook and nearly a year of commercial support and development by Cloudera, the technology is spreading rapidly through other disciplines, from financial services and government to life sciences and high energy physics. The talk will motivate the design of Hadoop and discuss some key implementation details in depth. It will also cover the major subprojects in the Hadoop ecosystem, go over some example applications, highlight best practices for deploying Hadoop in your environment, discuss plans for the future of the technology, and provide pointers to the many resources available for learning more. In addition to providing more information about the Hadoop platform, a major goal of this talk is to begin a dialogue with the ATLAS research team on how the tools commonly used in t...

  6. Loviisa nuclear power plant analyzer

    International Nuclear Information System (INIS)

    Porkholm, K.; Nurmilaukas, P.; Tiihonen, O.; Haenninen, M.; Puska, E.

    1992-12-01

    The APROS Simulation Environment has been developed since 1986 by Imatran Voima Oy (IVO) and the Technical Research Centre of Finland (VTT). It provides tools, solution algorithms and process components for use in different simulation systems for design, analysis and training purposes. One of its main nuclear applications is the Loviisa Nuclear Power Plant Analyzer (LPA). The Loviisa Plant Analyzer includes all the important plant components both in the primary and in the secondary circuits. In addition, all the main control systems, the protection system and the high voltage electrical systems are included. (orig.)

  7. Remote Laser Diffraction Particle Size Distribution Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2001-03-01

    In support of a radioactive slurry sampling and physical characterization task, an “off-the-shelf” laser diffraction (classical light scattering) particle size analyzer was utilized for remote particle size distribution (PSD) analysis. Spent nuclear fuel was previously reprocessed at the Idaho Nuclear Technology and Engineering Center (INTEC, formerly the Idaho Chemical Processing Plant), which is on DOE’s INEEL site. The acidic, radioactive aqueous raffinate streams from these processes were transferred to 300,000-gallon stainless steel storage vessels located in the INTEC Tank Farm area. Due to the transfer piping configuration in these vessels, complete removal of the liquid cannot be achieved. Consequently, a “heel” slurry remains at the bottom of an “emptied” vessel. Particle size distribution characterization of the settled solids in this remaining heel slurry, as well as of suspended solids in the tank liquid, is the goal of this remote PSD analyzer task. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a “hot cell” (gamma radiation) environment. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable, making this technology far superior to the traditional methods used. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.
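Laser-diffraction PSDs are commonly summarized by percentile diameters (D10, D50, D90) read off the cumulative volume distribution over size bins. A generic sketch of that calculation (the bin edges and volume percentages are invented, and linear interpolation in diameter is a simplification; this is not the LA-300's internal algorithm):

```python
import numpy as np

# Size bins (microns) spanning roughly the LA-300's 0.1-600 micron range,
# and a made-up volume percentage per bin
bin_edges = np.array([0.1, 1.0, 10.0, 100.0, 600.0])
volume_pct = np.array([5.0, 30.0, 50.0, 15.0])

# Cumulative volume % evaluated at the bin edges
cum = np.concatenate([[0.0], np.cumsum(volume_pct)])

def d_percentile(p):
    """Diameter below which p% of the sample volume lies
    (linear interpolation of diameter vs cumulative volume %)."""
    return float(np.interp(p, cum, bin_edges))

d10, d50, d90 = d_percentile(10), d_percentile(50), d_percentile(90)
```

D50 is the volume-median diameter; the spread (D90 - D10)/D50 is a common single-number measure of polydispersity for settled-solids characterization.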

  8. The security analyzer: A security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Densley, P.J.

    1986-09-01

    The Security Analyzer is a software tool capable of analyzing the effectiveness of a facility's security system. It is written in the Prolog logic programming language, using entity-relationship data modeling techniques. The program performs the following functions: (1) provides descriptive, locational and operational status information about intrusion detectors and assessment devices (i.e., "sensors" and "cameras") upon request; (2) provides for storage and retrieval of maintenance history information for various components of the security system (including intrusion detectors), and allows for changing that information as desired; (3) provides a "search" mode, wherein all paths are found from any specified physical location to another specified location which satisfy user-chosen "intruder detection" probability and elapsed time criteria (i.e., the program finds the "weakest paths" from a security point of view). The first two of these functions can be provided fairly easily with a conventional database program; the third function could be provided using Fortran or some similar language, though with substantial difficulty. In the Security Analyzer program, all these functions are provided in a simple and straightforward manner. This simplicity is possible because the program is written in the symbolic (as opposed to numeric) processing language Prolog, and because the knowledge base is structured according to entity-relationship modeling principles. Also, the use of Prolog and the entity-relationship modeling technique allows the capabilities of the Security Analyzer program, both for knowledge base interrogation and for searching-type operations, to be easily expanded in ways that would be very difficult for a numeric and more algorithmically deterministic language such as Fortran to duplicate. 4 refs

  9. Grid and Data Analyzing and Security

    Directory of Open Access Journals (Sweden)

    Fatemeh SHOKRI

    2012-12-01

    This paper examines the importance of secure structures in the process of analyzing and distributing information with the aid of Grid-based technologies. The advent of distributed networks has provided many practical opportunities for detecting and recording the times of events, and has prompted efforts to identify events and to solve the problems of storing information, such as keeping it up-to-date and documented. In this regard, data distribution systems in a network environment should be accurate. As a consequence, a series of continuous and updated data must be at hand. In this case, Grid is the best answer for using the data and resources of organizations through common processing.

  10. The kpx, a program analyzer for parallelization

    International Nuclear Information System (INIS)

    Matsuyama, Yuji; Orii, Shigeo; Ota, Toshiro; Kume, Etsuo; Aikawa, Hiroshi.

    1997-03-01

    The kpx is a program analyzer, developed as a common technological basis for promoting parallel processing. The kpx consists of three tools. The first is ktool, which shows how much execution time is spent in program segments. The second is ptool, which shows parallelization overhead on the Paragon system. The last is xtool, which shows parallelization overhead on the VPP system. The kpx, designed to work with any FORTRAN code on any UNIX computer, was confirmed to work well after testing on Paragon, SP2, SR2201, VPP500, VPP300, Monte-4, SX-4 and T90. (author)

  11. Methods of analyzing crude oil

    Science.gov (United States)

    Cooks, Robert Graham; Jjunju, Fred Paul Mark; Li, Anyin; Rogan, Iman S.

    2017-08-15

    The invention generally relates to methods of analyzing crude oil. In certain embodiments, methods of the invention involve obtaining a crude oil sample, and subjecting the crude oil sample to mass spectrometry analysis. In certain embodiments, the method is performed without any sample pre-purification steps.

  12. Therapy Talk: Analyzing Therapeutic Discourse

    Science.gov (United States)

    Leahy, Margaret M.

    2004-01-01

    Therapeutic discourse is the talk-in-interaction that represents the social practice between clinician and client. This article invites speech-language pathologists to apply their knowledge of language to analyzing therapy talk and to learn how talking practices shape clinical roles and identities. A range of qualitative research approaches,…

  13. The Convertible Arbitrage Strategy Analyzed

    NARCIS (Netherlands)

    Loncarski, I.; Ter Horst, J.R.; Veld, C.H.

    2006-01-01

    This paper analyzes convertible bond arbitrage on the Canadian market for the period 1998 to 2004.Convertible bond arbitrage is the combination of a long position in convertible bonds and a short position in the underlying stocks. Convertible arbitrage has been one of the most successful strategies

  14. Proton-beam energy analyzer

    International Nuclear Information System (INIS)

    Belan, V.N.; Bolotin, L.I.; Kiselev, V.A.; Linnik, A.F.; Uskov, V.V.

    1989-01-01

    The authors describe a magnetic analyzer for measurement of proton-beam energy in the range from 100 keV to 25 MeV. The beam is deflected in a uniform transverse magnetic field and is registered by photographing a scintillation screen. The energy spectrum of the beam is constructed by microphotometry of the photographic film

  15. The Photo-Pneumatic CO2 Analyzer, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We are proposing to build a new technology, the photo-pneumatic analyzer. It is small, solid-state, inexpensive, and appropriate for observations of atmospheric...

  16. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with the synergistic use of global satellite observations in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented the methodology as a web-service based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution function, conditional sampling, and time-lagged correlation map.
We have implemented the

  17. Analyzer for gamma cameras diagnostic

    International Nuclear Information System (INIS)

    Oramas Polo, I.; Osorio Deliz, J. F.; Diaz Garcia, A.

    2013-01-01

    This research work was carried out to develop an analyzer for gamma camera diagnostics. It is composed of an electronic system that includes hardware and software capabilities, and operates from the acquisition of the 4 head position signals of a gamma camera detector. The result is the spectrum of the energy delivered by nuclear radiation coming from the camera detector head. This system includes analog processing of position signals from the camera, digitization and the subsequent processing of the energy signal in a multichannel analyzer, sending data to a computer via a standard USB port and processing of data in a personal computer to obtain the final histogram. The circuits are composed of an analog processing board and a universal kit with microcontroller and programmable gate array. (Author)
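    The processing chain described above (sum the four head position signals event by event to form the energy signal, then bin it in a multichannel analyzer) can be sketched as follows; the function name, channel count and energy range are illustrative assumptions, not the instrument's actual parameters:

```python
def energy_spectrum(x_plus, x_minus, y_plus, y_minus,
                    n_channels=1024, e_max=2048.0):
    """Sum the four detector-head position signals for each event to get
    the energy signal, then accumulate a multichannel histogram (the
    spectrum). Events outside [0, e_max) are discarded."""
    counts = [0] * n_channels
    for a, b, c, d in zip(x_plus, x_minus, y_plus, y_minus):
        energy = a + b + c + d          # Anger-camera energy signal
        if 0.0 <= energy < e_max:
            channel = int(energy * n_channels / e_max)
            counts[channel] += 1        # one count in that channel
    return counts
```

    In the real instrument the summation happens in analog circuitry before digitization; the sketch performs it digitally only to make the histogramming step explicit.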

  18. Methods for Analyzing Social Media

    DEFF Research Database (Denmark)

    Jensen, Jakob Linaa

    2013-01-01

    Social media is becoming increasingly attractive for users. It is a fast way to communicate ideas and a key source of information. It is therefore one of the most influential mediums of communication of our time and an important area for audience research. The growth of social media invites many new questions such as: How can we analyze social media? Can we use traditional audience research methods and apply them to online content? Which new research strategies have been developed? Which ethical research issues and controversies do we have to pay attention to? This book focuses on research strategies and methods for analyzing social media and will be of interest to researchers and practitioners using social media, as well as those wanting to keep up to date with the subject.

  19. New approach to analyzing vulnerability

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.; Carlson, R.L.; Riedeman, G.W.

    1986-01-01

    The Westinghouse Hanford Company (WHC) has recently completed construction of the Fuel Cycle Plant (FCP) at Richland, Washington. At start-up the facility will fabricate driver fuel for the Fast Flux Test Facility in the Secure Automated Fabrication line. After construction completion, but before facility certification, the Department of Energy (DOE) Richland Operation Office requested that a vulnerability analysis be performed which assumed multiple insiders as a threat to the security system. A unique method of analyzing facility vulnerabilities was developed at the Security Applications Center (SAC), which is managed by WHC for DOE. The method that was developed verifies a previous vulnerability assessment, as well as introducing a modeling technique which analyzes security alarms in relation to delaying factors and possible insider activities. With this information it is possible to assess the relative strength or weakness of various possible routes to and from a target within a facility

  20. Analyzing the Facebook Friendship Graph

    OpenAIRE

    Catanese, Salvatore; De Meo, Pasquale; Ferrara, Emilio; Fiumara, Giacomo

    2010-01-01

    Online Social Networks (OSN) during last years acquired a huge and increasing popularity as one of the most important emerging Web phenomena, deeply modifying the behavior of users and contributing to build a solid substrate of connections and relationships among people using the Web. In this preliminary work paper, our purpose is to analyze Facebook, considering a significant sample of data reflecting relationships among subscribed users. Our goal is to extract, from this platform, relevant ...

  1. Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, Stephen R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-05-01

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules, which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically, SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort Software. However, this software is somewhat cumbersome and inflexible. Brookhaven National Laboratory (BNL) has written an interface program in National Instruments LabVIEW that both controls the Model 43i-TLE Analyzer and queries the unit for all measurement and housekeeping data. The LabVIEW vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS and described more fully in Section 6.0 below.
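    Since the emitted UV light is stated to be proportional to the SO2 concentration, converting raw fluorescence counts to a mixing ratio reduces to a linear calibration. The sketch below assumes a simple two-point zero/span calibration; the names and numbers are hypothetical and are not taken from the Model 43i-TLE documentation:

```python
def calibrate(span_signal, span_conc_ppb, zero_signal=0.0):
    """Build a fluorescence-counts -> SO2 (ppb) converter from the linear
    relation stated above (emitted light proportional to [SO2]), using a
    zero-air reading and a span gas of known concentration."""
    slope = span_conc_ppb / (span_signal - zero_signal)
    return lambda signal: slope * (signal - zero_signal)
```

    For example, a span gas of 50 ppb reading 5000 counts against a 200-count zero baseline yields a converter that maps any subsequent reading to ppb by subtracting the baseline and scaling.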

  2. A new uranium automatic analyzer

    International Nuclear Information System (INIS)

    Xia Buyun; Zhu Yaokun; Wang Bin; Cong Peiyuan; Zhang Lan

    1993-01-01

    A new uranium automatic analyzer based on the flow injection analysis (FIA) principle has been developed. It consists of a multichannel peristaltic pump, an injection valve, a photometric detector, a single-chip microprocessor system and electronic circuitry. The newly designed multifunctional auto-injection valve can automatically change the injection volume of the sample and the channels, so that the determination ranges and items can easily be changed; it can also vary the FIA operation modes, so that the instrument functions as a universal instrument. A chromatographic column with extractant-containing resin was installed in the manifold of the analyzer for the concentration and separation of trace uranium. 2-(5-bromo-2-pyridylazo)-5-diethyl-aminophenol (Br-PADAP) was used as the colour reagent. Uranium was determined in aqueous solution by adding cetylpyridinium bromide (CPB). Uranium in solution in the range 0.02-500 mg·L⁻¹ can be directly determined without any pretreatment. A sample throughput rate of 30-90 h⁻¹ and reproducibility of 1-2% were obtained. The analyzer has been satisfactorily applied in the laboratory and the plant.

  3. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

    Full Text Available Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  4. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  5. Analyzing the User Behavior toward Electronic Commerce Stimuli

    OpenAIRE

    Lorenzo-Romero, Carlota; Alarcón-del-Amo, María-del-Carmen; Gómez-Borja, Miguel-Ángel

    2016-01-01

    Based on the Stimulus-Organism-Response paradigm this research analyzes the main differences between the effects of two types of web technologies: Verbal web technology (i.e., navigational structure as utilitarian stimulus) versus non-verbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not been examined yet as separate variables and their impact on internal and behavioral responses seems unknown. Therefore, the objective of this res...

  6. Analyzing the user behavior towards Electronic Commerce stimuli

    OpenAIRE

    Carlota Lorenzo-Romero; María-del-Carmen Alarcón-del-Amo

    2016-01-01

    Based on the Stimulus-Organism-Response paradigm this research analyzes the main differences between the effects of two types of web technologies: Verbal web technology (i.e. navigational structure as utilitarian stimulus) versus nonverbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not been examined yet as separate variables and their impact on internal and behavioral responses seems unknown. Therefore, the objective of this resea...

  7. The security analyzer, a security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Densley, P.J.; Carlson, R.L.

    1987-01-01

    A technique has been developed to characterize a nuclear facility and measure the strengths and weaknesses of the physical protection system. It utilizes the artificial intelligence capabilities available in the Prolog programming language to probe a facility's defenses and find potential attack paths that meet designated search criteria. As sensors or barriers become inactive due to maintenance, failure, or inclement weather conditions, the protection system can rapidly be reanalyzed to discover weaknesses that would need to be strengthened by alternative means. Conversely, proposed upgrades and enhancements can be easily entered into the database and their effect measured against a variety of potential adversary attacks. Thus the Security Analyzer is a tool that aids the protection planner as well as the protection operations staff

  8. The Aqueduct Global Flood Analyzer

    Science.gov (United States)

    Iceland, Charles

    2015-04-01

    As population growth and economic growth take place, and as climate change accelerates, many regions across the globe are finding themselves increasingly vulnerable to flooding. A recent OECD study of the exposure of the world's large port cities to coastal flooding found that 40 million people were exposed to a 1 in 100 year coastal flood event in 2005, and the total value of exposed assets was about US$3,000 billion, or 5% of global GDP. By the 2070s, those numbers were estimated to increase to 150 million people and US$35,000 billion, or roughly 9% of projected global GDP. Impoverished people in developing countries are particularly at risk because they often live in flood-prone areas and lack the resources to respond. WRI and its Dutch partners - Deltares, IVM-VU University Amsterdam, Utrecht University, and PBL Netherlands Environmental Assessment Agency - are in the initial stages of developing a robust set of river flood and coastal storm surge risk measures that show the extent of flooding under a variety of scenarios (both current and future), together with the projected human and economic impacts of these flood scenarios. These flood risk data and information will be accessible via an online, easy-to-use Aqueduct Global Flood Analyzer. We will also investigate the viability, benefits, and costs of a wide array of flood risk reduction measures that could be implemented in a variety of geographic and socio-economic settings. Together, the activities we propose have the potential for saving hundreds of thousands of lives and strengthening the resiliency and security of many millions more, especially those who are most vulnerable. Mr. Iceland will present Version 1.0 of the Aqueduct Global Flood Analyzer and provide a preview of additional elements of the Analyzer to be released in the coming years.

  9. Computer-based multi-channel analyzer based on internet

    International Nuclear Information System (INIS)

    Zhou Xinzhi; Ning Jiaoxian

    2001-01-01

    Combining Internet technology with a computer-based multi-channel analyzer, a new kind of computer-based multi-channel analyzer system, based on a web browser, is presented. Its framework and principle, as well as its implementation, are discussed

  10. Compact Microwave Fourier Spectrum Analyzer

    Science.gov (United States)

    Savchenkov, Anatoliy; Matsko, Andrey; Strekalov, Dmitry

    2009-01-01

    A compact photonic microwave Fourier spectrum analyzer [a Fourier-transform microwave spectrometer (FTMWS)] with no moving parts has been proposed for use in remote sensing of weak, natural microwave emissions from the surfaces and atmospheres of planets, to enable remote analysis and determination of chemical composition and abundances of critical molecular constituents in space. The instrument is based on Bessel-beam (light modes with non-zero angular momenta) fiber-optic elements. It features low power consumption, low mass, and high resolution, without a need for any cryogenics, beyond what is achievable by the current state of the art in space instruments. The instrument can also be used in a wide-band scatterometer mode in active radar systems.

  11. Charge Analyzer Responsive Local Oscillations

    Science.gov (United States)

    Krause, Linda Habash; Thornton, Gary

    2015-01-01

    The first transatlantic radio transmission, demonstrated by Marconi in December of 1901, revealed the essential role of the ionosphere for radio communications. This ionized layer of the upper atmosphere controls the amount of radio power transmitted through, reflected off of, and absorbed by the atmospheric medium. Low-frequency radio signals can propagate long distances around the globe via repeated reflections off of the ionosphere and the Earth's surface. Higher frequency radio signals can punch through the ionosphere to be received at orbiting satellites. However, any turbulence in the ionosphere can distort these signals, compromising the performance or even availability of space-based communication and navigation systems. The physics associated with this distortion effect is analogous to the situation when underwater images are distorted by convecting air bubbles. In fact, these ionospheric features are often called 'plasma bubbles' since they exhibit some of the same behavior as underwater air bubbles. These events, instigated by solar and geomagnetic storms, can cause communication and navigation outages that last for hours. To help understand and predict these outages, a world-wide community of space scientists and technologists is devoted to researching this topic. One aspect of this research is to develop instruments capable of measuring the ionospheric plasma bubbles. Figure 1 shows a photo of the Charge Analyzer Responsive to Local Oscillations (CARLO), a new instrument under development at NASA Marshall Space Flight Center (MSFC). It is a frequency-domain ion spectrum analyzer designed to measure the distributions of ionospheric turbulence from 1 Hz to 10 kHz (i.e., spatial scales from a few kilometers down to a few centimeters). This frequency range is important since it focuses on turbulence scales that affect VHF/UHF satellite communications, GPS systems, and over-the-horizon radar systems. CARLO is based on the flight-proven Plasma Local

  12. Radiation energy detector and analyzer

    International Nuclear Information System (INIS)

    Roberts, T.G.

    1981-01-01

    A radiation detector array and a method for measuring the spectral content of radiation. The radiation sensor or detector is an array or stack of thin solid-electrolyte batteries. The batteries, arranged in a stack, may be composed of independent battery cells or may be arranged so that adjacent cells share a common terminal surface. This common surface is possible since the polarity of the batteries with respect to an adjacent battery is unrestricted, allowing a reduction in component parts of the assembly and reducing the overall stack length. Additionally, a test jig or chamber for allowing rapid measurement of the voltage across each battery is disclosed. A multichannel recorder and display may be used to indicate the voltage gradient change across the cells, or a small computer may be used for rapidly converting these voltage readings to a graph of radiation intensity versus wavelength or energy. The behavior of the batteries when used as a radiation detector and analyzer is such that the voltage measurements can be made at leisure after the detector array has been exposed to the radiation, and it is not necessary to make rapid measurements as is now done

  13. Nuclear plant analyzer desktop workstation

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1990-01-01

    In 1983 the U.S. Nuclear Regulatory Commission (USNRC) commissioned the Idaho National Engineering Laboratory (INEL) to develop a Nuclear Plant Analyzer (NPA). The NPA was envisioned as a graphical aid to assist reactor safety analysts in comprehending the results of thermal-hydraulic code calculations. The development was to proceed in three distinct phases culminating in a desktop reactor safety workstation. The desktop NPA is now complete. The desktop NPA is a microcomputer based reactor transient simulation, visualization and analysis tool developed at INEL to assist an analyst in evaluating the transient behavior of nuclear power plants by means of graphic displays. The NPA desktop workstation integrates advanced reactor simulation codes with online computer graphics allowing reactor plant transient simulation and graphical presentation of results. The graphics software, written exclusively in ANSI standard C and FORTRAN 77 and implemented over the UNIX/X-windows operating environment, is modular and is designed to interface to the NRC's suite of advanced thermal-hydraulic codes to the extent allowed by that code. Currently, full, interactive, desktop NPA capabilities are realized only with RELAP5

  14. Analyzing endocrine system conservation and evolution.

    Science.gov (United States)

    Bonett, Ronald M

    2016-08-01

    Analyzing variation in rates of evolution can provide important insights into the factors that constrain trait evolution, as well as those that promote diversification. Metazoan endocrine systems exhibit apparent variation in evolutionary rates of their constituent components at multiple levels, yet relatively few studies have quantified these patterns and analyzed them in a phylogenetic context. This may be in part due to historical and current data limitations for many endocrine components and taxonomic groups. However, recent technological advancements such as high-throughput sequencing provide the opportunity to collect large-scale comparative data sets for even non-model species. Such ventures will produce a fertile data landscape for evolutionary analyses of nucleic acid and amino acid based endocrine components. Here I summarize evolutionary rate analyses that can be applied to categorical and continuous endocrine traits, and also those for nucleic acid and protein-based components. I emphasize analyses that could be used to test whether other variables (e.g., ecology, ontogenetic timing of expression, etc.) are related to patterns of rate variation and endocrine component diversification. The application of phylogenetic-based rate analyses to comparative endocrine data will greatly enhance our understanding of the factors that have shaped endocrine system evolution. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Development in fiscal 1999 of technologies to put photovoltaic power generation systems into practical use. Volume 1. Development of thin film solar cell manufacturing technologies (Development of technologies to manufacture low-cost large-area modules and survey and research on analyzing how to put products into practical use); 1999 nendo taiyoko hatsuden system jitsuyoka gijutsu kaihatsu seika hokokusho. Usumaku taiyo denchi no seizo gijutsu kaihatsu (tei cost daimenseki module seizo gijutsu kaihatsu (jitsuyoka kaiseki ni kansuru chosa kenkyu 1))

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    With an objective to assist research and development to put thin film solar cells for power use into practical use and a research to put thin film solar cell manufacturing technologies into practical use, survey and research have been performed on trends in the technologies inside and outside the country. Characteristic points in thin film solar cells during the current fiscal year include: expansion of production scale of amorphous silicon solar cells, rapid progress in poly-crystalline silicon thin film solar cell technologies, and enhancement of performance in large-area modules in the a-Si, CIGS, and CdTe systems. In the trends in research and development of amorphous systems, expectation is heightening on elucidation of optical deterioration phenomena, and establishment of suppression technologies thereof. Although the highest efficiency was not renewed in thin film solar cells of small areas, progress was seen in the post-stabilization efficiency in large-area modules. A thin film solar cell manufacturing plant having an annual production capacity of 20 MW was put into operation in October in Japan. Micro (poly) crystalline silicon based solar cells have high possibility of being compatible in cost reduction and performance improvement, and energetic researches are being carried out on them in recent years as the most promising candidate of the next generation solar cells. (NEDO)

  16. Analyzing the development of Indonesia shrimp industry

    Science.gov (United States)

    Wati, L. A.

    2018-04-01

    This research aimed to analyze the development of the shrimp industry in Indonesia, using Porter's Diamond theory for the analysis. Porter's Diamond theory is a framework for industry analysis and business strategy development. It identifies five forces that determine the competitive intensity in an industry, namely (1) the threat of substitute products, (2) the threat of competition, (3) the threat of new entrants, (4) the bargaining power of suppliers, and (5) the bargaining power of consumers. The development of the Indonesian shrimp industry is fairly good, as explained by the Porter's Diamond analysis. The analysis covers four main components, namely factor conditions; demand conditions; related and supporting industries; and firm strategy, structure and rivalry, coupled with two supporting components (government regulation and chance). The results of this research show that the two supporting components (government regulation and chance) have a positive effect; related and supporting industries have a negative effect; firm strategy and structure have a negative effect; rivalry has a positive effect; and factor conditions have a positive effect (except for science and technology resources).

  17. Analyzing Music Services Positioning Through Qualitative Research

    Directory of Open Access Journals (Sweden)

    Manuel Cuadrado

    2015-12-01

    Full Text Available Information technologies have produced new ways of distributing and consuming music, mainly by youth, in relation to both goods and services. In the case of goods, there has been a dramatic shift from traditional ways of buying and listening to music to new digital platforms. There has also been an evolution in relation to music services. In this sense, live music concerts have been losing their audiences over the past few years, as have music radio stations, in favor of streaming platforms. Curious about this phenomenon, we conducted an exploratory research in order to analyze how all these services, both traditional and new ones, were perceived. Specifically, we aimed to study youth's assessment of the three most relevant music service categories: music radio stations, digital streaming platforms, and pop-rock music festivals. To do so, we used the projective technique of image association to gather information. The population of the study consisted of individuals between 18 and 25 years of age. Our results, after using content analysis, were poor due to spontaneous recall. Therefore, we duplicated the study, but in a more focus-oriented way. Information gathered this time allowed us not only to better know how all these organizations are positioned but also to obtain a list of descriptors to be used in a subsequent descriptive research study.

  18. Analyzing Innovation Systems (Burkina Faso) | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Under the supervision of the national centre for scientific and technological research (CNRST), the forum on scientific research and technological innovation (FRSIT) will identify the principal players in the national system of ... Journal articles.

  19. Update on the USNRC's Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    Laats, E.T.

    1987-01-01

    The Nuclear Plant Analyzer (NPA) is the US Nuclear Regulatory Commission's (NRC's) state-of-the-art nuclear reactor simulation capability. This computer software package integrates high fidelity nuclear reactor simulation codes such as the TRAC and RELAP5 series of codes with color graphics display techniques and advanced workstation hardware. An overview of this program was given at the 1984 Summer Computer Simulation Conference (SCSC), with selected topics discussed at the 1985 and 1986 SCSCs. Since the 1984 presentation, major redirections of this NRC program have been taken. The original NPA system was developed for operation on a Control Data Corporation CYBER 176 computer, technology that is some 10 to 15 years old. The NPA system has recently been implemented on Class VI computers to gain increased computational capabilities, and is now being implemented on super-minicomputers for use by the scientific community and possibly by the commercial nuclear power plant simulator community. This paper addresses these activities and related experiences. First, the Class VI computer implementation is discussed. The trade-offs between gaining significantly greater computational speed and central memory, with the loss of performance due to many more simultaneous users is shown. Second, the goal of the super-minicomputer implementation is to produce a very cost-effective system that utilizes advanced (multi-dimensional, two-phase coolant) simulation capabilities at real wall-clock simulation times. Benchmarking of the initial super-minicomputer implementation is discussed. Finally, the technical and economic feasibility is addressed for implementing the super-minicomputer version of the NPA with the RELAP5 simulation code onto the Black Fox full scope nuclear power plant simulator

  20. Development of remote controlled electron probe micro analyzer with crystal orientation analyzer

    International Nuclear Information System (INIS)

    Honda, Junichi; Matsui, Hiroki; Harada, Akio; Obata, Hiroki; Tomita, Takeshi

    2012-07-01

    The advanced utilization of Light Water Reactor (LWR) fuel is progressing in Japan to reduce power generating costs and the volume of nuclear waste. The electric power companies have continued their efforts toward burnup extension and thermal power uprates of commercial fuel. The government should accumulate detailed information on the newest technologies in order to establish regulations and guidelines for the safety of advanced nuclear fuels. A remote controlled Electron Probe Micro Analyzer (EPMA) fitted with a crystal orientation analyzer has been developed at the Japan Atomic Energy Agency (JAEA) to study the behavior of high burnup fuels under accident conditions. The effects of the cladding microstructure on fuel behavior will be evaluated more conveniently and quantitatively with this EPMA. A commercial model of the EPMA has been modified to be airtight and earthquake resistant, in compliance with the government safety regulations for handling highly radioactive materials. This paper describes the specifications of the EPMA, which were specialized for post-irradiation examination, and the results of cold mock-up tests that confirmed its performance and reliability. (author)

  1. Methyl-Analyzer--whole genome DNA methylation profiling.

    Science.gov (United States)

    Xin, Yurong; Ge, Yongchao; Haghighi, Fatemeh G

    2011-08-15

    Methyl-Analyzer is a Python package that analyzes genome-wide DNA methylation data produced by the Methyl-MAPS (methylation mapping analysis by paired-end sequencing) method. Methyl-MAPS is an enzymatic-based method that uses both methylation-sensitive and -dependent enzymes covering >80% of CpG dinucleotides within mammalian genomes. It combines enzymatic-based approaches with high-throughput next-generation sequencing technology to provide whole genome DNA methylation profiles. Methyl-Analyzer processes and integrates sequencing reads from methylated and unmethylated compartments and estimates CpG methylation probabilities at single base resolution. Methyl-Analyzer is available at http://github.com/epigenomics/methylmaps. A sample dataset is available for download at http://epigenomicspub.columbia.edu/methylanalyzer_data.html. Contact: fgh3@columbia.edu. Supplementary data are available at Bioinformatics online.
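    The core estimate this record describes (a per-CpG methylation probability from read counts in the methylated versus unmethylated compartments) can be sketched in a few lines. This is an illustrative reconstruction, not the Methyl-Analyzer API; the pseudocount smoothing is an assumption.

```python
# Illustrative sketch (not the Methyl-Analyzer API): estimate per-CpG
# methylation probability from read counts in the methylated and
# unmethylated sequencing compartments of Methyl-MAPS-style data.

def methylation_probability(meth_reads, unmeth_reads, pseudocount=1.0):
    """Posterior-mean style estimate with a symmetric pseudocount."""
    total = meth_reads + unmeth_reads + 2 * pseudocount
    return (meth_reads + pseudocount) / total

def profile(counts):
    """counts: dict mapping CpG position -> (meth, unmeth) read counts."""
    return {pos: methylation_probability(m, u) for pos, (m, u) in counts.items()}
```

    For example, a CpG covered by 18 methylated and 2 unmethylated reads gets probability 19/22 ≈ 0.86 with the default pseudocount; an uncovered CpG falls back to 0.5.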

  2. Analyzing the attributes of Indiana's STEM schools

    Science.gov (United States)

    Eltz, Jeremy

    "Primary and secondary schools do not seem able to produce enough students with the interest, motivation, knowledge, and skills they will need to compete and prosper in the emerging world" (National Academy of Sciences [NAS], 2007a, p. 94). This quote indicated that there are changing expectations for today's students which have ultimately led to new models of education, such as charters, online and blended programs, career and technical centers, and for the purposes of this research, STEM schools. STEM education as defined in this study is a non-traditional model of teaching and learning intended to "equip them [students] with critical thinking, problem solving, creative and collaborative skills, and ultimately establishes connections between the school, work place, community and the global economy" (Science Foundation Arizona, 2014, p. 1). Focusing on science, technology, engineering, and math (STEM) education is believed by many educational stakeholders to be the solution for the deficits many students hold as they move on to college and careers. The National Governors Association (NGA; 2011) believes that building STEM skills in the nation's students will lead to the ability to compete globally with a new workforce that has the capacity to innovate and will in turn spur economic growth. In order to accomplish the STEM model of education, a group of educators and business leaders from Indiana developed a comprehensive plan for STEM education as an option for schools to use in order to close this gap. This plan has been promoted by the Indiana Department of Education (IDOE, 2014a) with the goal of increasing STEM schools throughout Indiana. To determine what Indiana's elementary STEM schools are doing, this study analyzed two of the elementary schools that were certified STEM by the IDOE. This qualitative case study described the findings and themes from two elementary STEM schools. Specifically, the research looked at the vital components to accomplish STEM

  3. ADAM: Analyzer for Dialectal Arabic Morphology

    Directory of Open Access Journals (Sweden)

    Wael Salloum

    2014-12-01

    Full Text Available While Modern Standard Arabic (MSA) has many resources, Arabic Dialects, the primarily spoken local varieties of Arabic, are quite impoverished in this regard. In this article, we present ADAM (Analyzer for Dialectal Arabic Morphology). ADAM is a poor man’s solution to quickly develop morphological analyzers for dialectal Arabic. ADAM has roughly half the out-of-vocabulary rate of a state-of-the-art MSA analyzer and is comparable in its recall performance to an Egyptian dialectal morphological analyzer that took years and expensive resources to build.

  4. Analyzing the User Behavior toward Electronic Commerce Stimuli.

    Science.gov (United States)

    Lorenzo-Romero, Carlota; Alarcón-Del-Amo, María-Del-Carmen; Gómez-Borja, Miguel-Ángel

    2016-01-01

    Based on the Stimulus-Organism-Response paradigm this research analyzes the main differences between the effects of two types of web technologies: Verbal web technology (i.e., navigational structure as utilitarian stimulus) versus non-verbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not been examined yet as separate variables and their impact on internal and behavioral responses seems unknown. Therefore, the objective of this research consists in analyzing the impact of these web technologies -which constitute the web atmosphere or webmosphere of a website- on shopping human behavior (i.e., users' internal states -affective, cognitive, and satisfaction- and behavioral responses - approach responses, and real shopping outcomes-) within the retail online store created by computer, taking into account some mediator variables (i.e., involvement, atmospheric responsiveness, and perceived risk). A 2 ("free" versus "hierarchical" navigational structure) × 2 ("on" versus "off" music) × 2 ("moving" versus "static" images) between-subjects computer experimental design is used to test empirically this research. In addition, an integrated methodology was developed allowing the simulation, tracking and recording of virtual user behavior within an online shopping environment. As main conclusion, this study suggests that the positive responses of online consumers might increase when they are allowed to freely navigate the online stores and their experience is enriched by animated GIFs and background music. The effect caused by mediator variables modifies relatively the final shopping human behavior.

  5. Analyzing the user behavior towards Electronic Commerce stimuli

    Directory of Open Access Journals (Sweden)

    Carlota Lorenzo-Romero

    2016-11-01

    Full Text Available Based on the Stimulus-Organism-Response paradigm this research analyzes the main differences between the effects of two types of web technologies: verbal web technology (i.e. navigational structure as utilitarian stimulus) versus nonverbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not been examined yet as separate variables and their impact on internal and behavioral responses seems unknown. Therefore, the objective of this research consists in analyzing the impact of these web technologies –which constitute the web atmosphere or webmosphere of a website– on shopping human behavior (i.e. users’ internal states -affective, cognitive, and satisfaction- and behavioral responses - approach responses, and real shopping outcomes-) within the retail online store created by computer, taking into account some mediator variables (i.e. involvement, atmospheric responsiveness, and perceived risk). A 2 (free versus hierarchical navigational structure) x 2 (on versus off music) x 2 (moving versus static images) between-subjects computer experimental design is used to test this research empirically. In addition, an integrated methodology was developed allowing the simulation, tracking and recording of virtual user behavior within an online shopping environment. As its main conclusion, this study suggests that the positive responses of online consumers might increase when they are allowed to freely navigate the online stores and their experience is enriched by animated GIFs and background music. The effect caused by the mediator variables relatively modifies the final shopping behavior.

  6. Social Media: A Phenomenon to be Analyzed

    Directory of Open Access Journals (Sweden)

    danah boyd

    2015-04-01

    Full Text Available The phenomenon of “social media” has more to do with its cultural positioning than its technological affordances. Rooted in the broader “Web 2.0” landscape, social media helped engineers, entrepreneurs, and everyday people reimagine the role that technology could play in information dissemination, community development, and communication. While the technologies invoked by the phrase social media have a long history, what unfolded in the 2000s reconfigured socio-technical practices in significant ways. Reflecting on the brief history of social media, this essay argues for the need to better understand this phenomenon.

  7. Mediamorphosis: Analyzing the Convergence of Digital Media ...

    African Journals Online (AJOL)

    User

    2011-04-19

    Apr 19, 2011 ... Mass Communication and media technologies started when Gutenberg ... to newspaper, telephone system, broadcasting, film as well as the internet”, .... entertainment and education even in societies that have been seriously.

  8. Time-delay analyzer with continuous discretization

    International Nuclear Information System (INIS)

    Bayatyan, G.L.; Darbinyan, K.T.; Mkrtchyan, K.K.; Stepanyan, S.S.

    1988-01-01

    A time-delay analyzer is described which, when triggered by a start pulse of adjustable duration, performs continuous discretization of the analyzed signal in nearly 22 ns time intervals, records it in a memory unit, and then slowly reads the information out to a computer for processing. The time-delay analyzer consists of four CAMAC-VECTOR systems of unit width. With its help one can separate comparatively short, small-amplitude rare signals against the background of quasistationary noise processes. 4 refs.; 3 figs

  9. Environmental technology foresight : New horizons for technology management

    NARCIS (Netherlands)

    Den Hond, Frank; Groenewegen, Peter

    1996-01-01

    Decision-making in corporate technology management and government technology policy is increasingly influenced by the environmental impact of technologies. Technology foresight (TF) and environmental impact assessment (EIA) are analyzed with regard to the roles they can play in developing long-term

  10. On-Demand Urine Analyzer, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer that can be integrated into International Space Station (ISS) toilets to measure key...

  11. Low Gravity Drug Stability Analyzer, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this proposed program (through Phase III) is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation....

  12. New high voltage parallel plate analyzer

    International Nuclear Information System (INIS)

    Hamada, Y.; Kawasumi, Y.; Masai, K.; Iguchi, H.; Fujisawa, A.; Abe, Y.

    1992-01-01

    A new modification of the parallel plate analyzer for 500 keV heavy ions, designed to eliminate the effect of intense UV and visible radiation, has been successfully implemented. Its principle and results are discussed. (author)

  13. Analyzing the economic impacts of transportation projects.

    Science.gov (United States)

    2013-09-01

    The main goal of the study is to explore methods, approaches and analytical software tools for analyzing the economic activity that results from large-scale transportation investments in Connecticut. The primary conclusion is that the transportation...

  14. Health technology

    International Nuclear Information System (INIS)

    Nicolas, Delphine; Dangleant, Caroline; Ganier, Aude; Kaczmarek, Delphine

    2008-01-01

    The CEA is an organization with a primarily technological focus, and one of the key areas in which it carries out research is Health Technology. This field of research was recognized and approved by the French Atomic Energy Committee on July 20, 2004. The expectations of both the public and health care professionals relate to demands for the highest standards of health care, at minimum risk. This implies a need to diagnose illness and disease as accurately and at as early a stage as possible, to target surgery precisely to deal only with damaged organs or tissues, to minimize the risk of side effects, allergies and hospital-acquired infections, to follow up and, as far as possible, tailor the health delivery system to each individual's needs and his or her lifestyle. The health care sector is subject to rapid changes and embraces a vast range of scientific fields. It now requires technological developments that will serve to gather increasing quantities of useful information, analyze and integrate it to obtain a full understanding of highly complex processes and to be able to treat the human body as non-invasively as possible. All the technologies developed require assessment, especially in the hospital environment. (authors)

  15. Digital dynamic amplitude-frequency spectra analyzer

    International Nuclear Information System (INIS)

    Kalinnikov, V.A.; )

    2006-01-01

    The spectrum analyzer is intended for dynamic spectral analysis of signals from physical installations and for noise filtering. The digital dynamic analyzer uses a recurrence Fourier transform algorithm, implemented on a fast-logic FPGA matrix and a dedicated ADSP signal microprocessor. The discretization frequency is 2 kHz-10 MHz. The number of calculated spectral coefficients is not less than 512. The functional response time is 20 ns.
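    The "recurrence Fourier transform algorithm" this record mentions is commonly realized as a sliding DFT, in which each new sample updates all N spectral bins in O(N) work instead of recomputing a full transform. A minimal software sketch of that textbook idea follows; the analyzer's actual FPGA/ADSP implementation is not described here, so this is an assumption about the algorithm family only.

```python
import cmath

# Sliding (recurrence) DFT sketch: each incoming sample x updates every
# bin as S_k <- (S_k + x - x_oldest) * exp(2j*pi*k/N), so the spectrum of
# the most recent N samples is maintained incrementally.

def sliding_dft(samples, N):
    """Yield the N-bin spectrum of the last N samples after each sample."""
    window = [0.0] * N          # circular buffer of the last N samples
    bins = [0j] * N             # current spectral coefficients
    twiddle = [cmath.exp(2j * cmath.pi * k / N) for k in range(N)]
    for i, x in enumerate(samples):
        oldest = window[i % N]
        window[i % N] = x
        for k in range(N):
            bins[k] = (bins[k] + x - oldest) * twiddle[k]
        yield list(bins)
```

    After N samples the bin magnitudes agree with those of a direct N-point DFT of the window; only the phases differ, by the usual sliding-window convention.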

  16. FST Based Morphological Analyzer for Hindi Language

    OpenAIRE

    Deepak Kumar; Manjeet Singh; Seema Shukla

    2012-01-01

    Hindi being a highly inflectional language, FST (Finite State Transducer) based approach is most efficient for developing a morphological analyzer for this language. The work presented in this paper uses the SFST (Stuttgart Finite State Transducer) tool for generating the FST. A lexicon of root words is created. Rules are then added for generating inflectional and derivational words from these root words. The Morph Analyzer developed was used in a Part Of Speech (POS) Tagger based on Stanford...
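    As a toy illustration of the root-lexicon-plus-rules idea behind such analyzers (this is not a finite state transducer and not the SFST tool; the transliterated Hindi suffixes and glosses below are simplified assumptions for demonstration only):

```python
# Toy suffix-rule analyzer over a small root lexicon, mimicking how
# inflection rules map surface forms back to roots plus features.
# Transliterated, simplified Hindi-like examples; not SFST.

LEXICON = {"ladk": "boy", "kitab": "book"}
SUFFIX_RULES = {                      # surface suffix -> feature bundle
    "a": {"num": "sg", "case": "dir"},
    "e": {"num": "pl", "case": "dir"},
    "on": {"num": "pl", "case": "obl"},
}

def analyze(surface):
    """Return (root, gloss, features) candidates for a surface form."""
    out = []
    for suf, feats in SUFFIX_RULES.items():
        if surface.endswith(suf):
            root = surface[: -len(suf)]
            if root in LEXICON:
                out.append((root, LEXICON[root], feats))
    return out
```

    A real FST composes such rules into a single transducer so that analysis and generation are the same machine run in opposite directions; the lookup loop above only imitates the analysis direction.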

  17. An Axiomatic, Unified Representation of Biosystems and Quantum Dynamics

    CERN Document Server

    Baianu, I

    2004-01-01

    An axiomatic representation of system dynamics is introduced in terms of categories, functors, organismal supercategories, limits and colimits of diagrams. Specific examples are considered in Complex Systems Biology, such as ribosome biogenesis and Hormonal Control in human subjects. "Fuzzy" Relational Structures are also proposed for flexible representations of biological system dynamics and organization.

  18. A biosystem for removal of metal ions from water

    Energy Technology Data Exchange (ETDEWEB)

    Kilbane, J.J. II.

    1990-01-01

    The presence of heavy metal ions in ground and surface waters constitutes a potential health risk and is an environmental concern. Moreover, processes for the recovery of valuable metal ions are of interest. Bioaccumulation or biosorption is not only a factor in assessing the environmental risk posed by metal ions; it can also be used as a means of decontamination. A biological system for the removal and recovery of metal ions from contaminated water is reported here. Exopolysaccharide-producing microorganisms, including a methanotrophic culture, are demonstrated to have superior metal binding ability, compared with other microbial cultures. This paper describes a biosorption process in which dried biomass obtained from exopolysaccharide-producing microorganisms is encapsulated in porous plastic beads and is used for metal ion binding and recovery. 22 refs., 13 figs.

  19. Reactors based on CANDU technology

    International Nuclear Information System (INIS)

    Bjegun, S.V.; Shirokov, S.V.

    2012-01-01

    The paper analyzes the use of CANDU technology in world nuclear energy. Advantages and disadvantages of implementing this technology are considered in terms of economic and technical aspects. Technological issues related to the use of CANDU reactors and nuclear safety issues are outlined. Risks from implementation of this reactor technology in the nuclear energy sector of Ukraine are determined.

  20. Cancer Technology - Cancer Currents Blog

    Science.gov (United States)

    Blog posts on technologies that affect cancer research and care—including new technologies for detecting cancer, testing treatments, storing/analyzing data, and improving patient care—from NCI Cancer Currents.

  1. A new automatic analyzer for uranium determination

    International Nuclear Information System (INIS)

    Xia Buyun; Zhu Yaokun; Wang Bin; Cong Peiyan; Zhang Lan

    1992-08-01

    An intelligent automatic analyzer for uranium based on the principle of flow injection analysis (FIA) has been developed. It can directly determine uranium in solution in the range of 0.02 to 500 mg/L without any pre-processing. A chromatographic column loaded with extractant, in which trace uranium is concentrated and separated, gives the analyzer a special ability to enrich uranium and is connected to its manifold. The analyzer is suited for trace uranium determination in various samples. 2-(5-bromo-2-pyridylazo)-5-diethylaminophenol (Br-PADAP) is used as the color reagent. Uranium is determined in aqueous solution by adding a cationic surfactant, cetylpyridinium bromide (CPB). The rate of analysis is 30 to 90 samples per hour. The relative standard deviation of determination is 1%-2%. The analyzer has been used in factories and laboratories, and the results are satisfactory. The determination range can easily be changed by using a multi-function auto-injection valve that changes the injection volume of the sample and the channels, so the analyzer can adopt various FIA operation modes to meet the needs of FIA determination of other substances. The analyzer has universal functions.

  2. Energy and technology review

    International Nuclear Information System (INIS)

    1981-10-01

    Research is described in three areas: high-technology design of unconventional, nonnuclear weapons; a model for analyzing special nuclear materials safeguards decisions; and a nuclear weapons accident exercise (NUWAX-81).

  3. Innovative Technologies in Transportation

    Science.gov (United States)

    2004-12-01

    An historical overview of the transportation infrastructure of the United States and Texas is provided. Data for trends in transportation is analyzed and projections for the future are postulated. A survey of current technologies in transportation is...

  4. Information Technology Industry 2004

    National Research Council Canada - National Science Library

    Altieri, Richard; Buccheit, Nathan; Burke, Kyle; Dillard, Norvel; Dolan, Patrick; Edwards, Gregory; Elins, Daniel; Gaines, Leonard; Goodwin, Steven; Lawrence, Michael

    2004-01-01

    .... This study will define the Information Technology Industry, give an overview of current domestic and international conditions, and then analyze the state of national network security and challenges faced by the U.S. government and U.S...

  5. Dashboard for Analyzing Ubiquitous Learning Log

    Science.gov (United States)

    Lkhagvasuren, Erdenesaikhan; Matsuura, Kenji; Mouri, Kousuke; Ogata, Hiroaki

    2016-01-01

    Mobile and ubiquitous technologies have been applied to a wide range of learning fields such as science, social science, history and language learning. Many researchers have been investigating the development of ubiquitous learning environments; nevertheless, to date, there have not been enough research works related to the reflection, analysis…

  6. Analyzing the mediated voice - a datasession

    DEFF Research Database (Denmark)

    Lawaetz, Anna

    Broadcast voices are technologically manipulated. In order to achieve a certain authenticity or sound of “reality”, the voices are paradoxically filtered and trained in order to reach the listeners. This “mise-en-scène” is important knowledge when it comes to the development of a consistent method of analysis of the mediated voice...

  7. Analyzing animal movement patterns using potential functions

    Science.gov (United States)

    H. K. Preisler; A. A. Ager; M. J. Wisdom

    2013-01-01

    The advent of GPS technology has made it possible to study human-wildlife interactions on large landscapes and quantify behavioral responses to recreation and other anthropogenic disturbances at increasingly fine scales. Of particular interest are the potential impacts on habitat use patterns, energetics, and cascading impacts on fecundity and other life history traits...

  8. Scintiscans data analyzer model AS-10

    International Nuclear Information System (INIS)

    Malesa, J.; Wierzbicki, W.

    1975-01-01

    The principle of operation and the construction of a device for scintiscan data analysis by ''square root scaling'' are presented. The device is equipped with a cassette tape recorder type MK-125, made in Poland, serving as a scintiscan data bank, and with three programs for scintiscan data analysis. Cassettes of two types, C-60 and C-90, are used, with recording times of 2 x 30 min and 2 x 45 min respectively. Results of scintiscan data analysis are printed by an electric typewriter as figures in the form of a digital scintigram. (author)
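    The ''square root scaling'' named above compresses a wide range of count rates into the small set of symbols a typewriter scintigram can print. A hedged sketch of that transform; the number of output levels is an assumed parameter, not taken from the device.

```python
import math

# Square-root compression of scan counts: bright spots do not saturate
# the print scale, while low-count regions keep some contrast.

def sqrt_scale(counts, levels=10):
    """Map raw counts to 0..levels-1 using square-root compression."""
    peak = max(counts) or 1      # guard against an all-zero scan line
    return [min(levels - 1, int(levels * math.sqrt(c / peak))) for c in counts]
```

    With 10 levels, counts of 0, 25 and 100 on a line whose peak is 100 print as 0, 5 and 9: the quarter-intensity point already reaches half scale.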

  9. Analyzing Engineered Nanoparticles using Photothermal Infrared Spectroscopy

    DEFF Research Database (Denmark)

    Yamada, Shoko

    To facilitate occupational safety and health there is a need to develop instruments to monitor and analyze nanoparticles in industry, research and urban environments. The aim of this Ph.D. project was to develop new sensors that can analyze engineered nanoparticles. Two sensors were studied: (i) a miniaturized toxicity sensor based on electrochemistry and (ii) a photothermal spectrometer based on tensile-stressed mechanical resonators (string resonators). Miniaturization of a toxicity sensor targeting engineered nanoparticles was explored. This concept was based on the results of the biodurability test...

  10. Analyzing Web Behavior in Indoor Retail Spaces

    OpenAIRE

    Ren, Yongli; Tomko, Martin; Salim, Flora; Ong, Kevin; Sanderson, Mark

    2015-01-01

    We analyze 18 million rows of Wi-Fi access logs collected over a one year period from over 120,000 anonymized users at an inner-city shopping mall. The anonymized dataset gathered from an opt-in system provides users' approximate physical location, as well as Web browsing and some search history. Such data provides a unique opportunity to analyze the interaction between people's behavior in physical retail spaces and their Web behavior, serving as a proxy to their information needs. We find: ...

  11. Analyzing Log Files using Data-Mining

    Directory of Open Access Journals (Sweden)

    Marius Mihut

    2008-01-01

    Full Text Available Information systems (i.e. servers, applications and communication devices) create a large amount of monitoring data that are saved as log files. For analyzing them, a data-mining approach is helpful. This article presents the steps which are necessary for creating an ‘analyzing instrument’, based on an open source software called Waikato Environment for Knowledge Analysis (Weka) [1]. For exemplification, a system log file created by a Windows-based operating system is used as the input file.
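    Before a tool like Weka can mine a log file, the raw lines are typically reduced to attributes. A minimal illustrative pre-processing step in Python; the timestamp/level line format assumed here is hypothetical, not the article's Windows log format.

```python
import re
from collections import Counter

# Turn raw "timestamp LEVEL message" log lines into per-level event
# counts, the kind of attribute vector a data-mining tool could ingest.
# The line format is an assumption for illustration.

LINE = re.compile(r"^(?P<ts>\S+ \S+)\s+(?P<level>INFO|WARN|ERROR)\s+(?P<msg>.*)$")

def summarize(lines):
    """Count events per severity level, skipping malformed lines."""
    levels = Counter()
    for line in lines:
        m = LINE.match(line)
        if m:
            levels[m.group("level")] += 1
    return dict(levels)
```

    The resulting counts (or richer per-window features) can then be exported, e.g. as an ARFF or CSV file, for mining in Weka.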

  12. X-ray fluorescence analyzer arrangement

    International Nuclear Information System (INIS)

    Vatai, Endre; Ando, Laszlo; Gal, Janos.

    1981-01-01

    An x-ray fluorescence analyzer for the quantitative determination of one or more elements of complex samples is reported. The novelties of the invention are the excitation of the samples by x-rays or γ-radiation, the application of a balanced filter pair as energy selector, and the measurement of the current or ion charge of ionization detectors used as sensors. Due to the increased sensitivity and accuracy, the novel design can extend the application fields of x-ray fluorescence analyzers. (A.L.)

  13. A Novel Architecture For Multichannel Analyzer

    International Nuclear Information System (INIS)

    Marcus, E.; Elhanani, I.; Nir, J.; Ellenbogen, M.; Kadmon, Y.; Tirosh, D.

    1999-01-01

    A novel digital approach to a real-time, high-throughput, low-cost Multichannel Analyzer (MCA) for radiation spectroscopy is presented. The MCA input is a shaped nuclear pulse sampled at a high rate, using an Analog-to-Digital Converter (ADC) chip. The digital samples are analyzed by a state-of-the-art Field Programmable Gate Array (FPGA). A customized algorithm is utilized to estimate the peak of the pulse, to reject pile-up and to eliminate processing dead time. The estimated peaks of valid pulses are transferred to a microcontroller system that creates the histogram and controls the Human Machine Interface (HMI).
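    The pipeline described (peak estimation, pile-up rejection, histogramming) can be sketched in software. The design above runs in an FPGA and a microcontroller; the simple maximum and the threshold-crossing pile-up test below are illustrative stand-ins for the paper's customized algorithm, and the thresholds are assumptions.

```python
# Illustrative MCA stages for digitized, shaped pulses (normalized 0..1).

def peak_estimate(samples):
    """Peak amplitude of one digitized pulse (simple maximum)."""
    return max(samples)

def is_pileup(samples, frac=0.5):
    """Flag pulses whose trace crosses frac*peak upward more than once:
    two distinct lobes suggest two piled-up events."""
    thresh = frac * max(samples)
    crossings, above = 0, False
    for s in samples:
        if not above and s > thresh:
            crossings += 1
            above = True
        elif above and s <= thresh:
            above = False
    return crossings > 1

def histogram(pulses, channels=1024, full_scale=1.0):
    """Bin accepted pulse peaks into an energy histogram."""
    hist = [0] * channels
    for p in pulses:
        if is_pileup(p):
            continue
        ch = min(channels - 1, int(peak_estimate(p) / full_scale * channels))
        hist[ch] += 1
    return hist
```

    A clean single-lobed pulse lands in one channel, while a double-lobed trace is discarded, which is the behavior a pile-up rejector must provide to keep the spectrum undistorted.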

  14. Analyzing the User Behavior toward Electronic Commerce Stimuli

    Science.gov (United States)

    Lorenzo-Romero, Carlota; Alarcón-del-Amo, María-del-Carmen; Gómez-Borja, Miguel-Ángel

    2016-01-01

    Based on the Stimulus-Organism-Response paradigm this research analyzes the main differences between the effects of two types of web technologies: Verbal web technology (i.e., navigational structure as utilitarian stimulus) versus non-verbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not been examined yet as separate variables and their impact on internal and behavioral responses seems unknown. Therefore, the objective of this research consists in analyzing the impact of these web technologies –which constitute the web atmosphere or webmosphere of a website– on shopping human behavior (i.e., users’ internal states -affective, cognitive, and satisfaction- and behavioral responses – approach responses, and real shopping outcomes-) within the retail online store created by computer, taking into account some mediator variables (i.e., involvement, atmospheric responsiveness, and perceived risk). A 2 (“free” versus “hierarchical” navigational structure) × 2 (“on” versus “off” music) × 2 (“moving” versus “static” images) between-subjects computer experimental design is used to test empirically this research. In addition, an integrated methodology was developed allowing the simulation, tracking and recording of virtual user behavior within an online shopping environment. As main conclusion, this study suggests that the positive responses of online consumers might increase when they are allowed to freely navigate the online stores and their experience is enriched by animated GIFs and background music. The effect caused by mediator variables modifies relatively the final shopping human behavior. PMID:27965549

  15. Analyzing Big Data with the Hybrid Interval Regression Methods

    Directory of Open Access Journals (Sweden)

    Chia-Hui Huang

    2014-01-01

    Full Text Available Big data is a new trend at present, forcing significant impacts on information technologies. In big data applications, one of the most concerning issues is dealing with large-scale data sets that often require computation resources provided by public cloud services. How to analyze big data efficiently becomes a big challenge. In this paper, we combine interval regression with the smooth support vector machine (SSVM) to analyze big data. Recently, the SSVM was proposed as an alternative to the standard SVM and has been proved more efficient than the traditional SVM in processing large-scale data. In addition, a soft margin method is proposed to modify the excursion of the separation margin and to remain effective in the gray zone, where the distribution of the data becomes hard to describe and the separation margin between classes is unclear.
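    The smoothing idea behind SSVM replaces the non-differentiable hinge term max(0, x) with p(x, a) = x + log(1 + e^(-ax))/a, which is smooth, so plain gradient methods apply. A one-dimensional toy sketch of that idea follows; it is not the paper's solver, and the learning rate, penalty and step count are assumptions.

```python
import math

def smooth_plus(x, a=5.0):
    """Smooth approximation of max(0, x): x + log(1 + exp(-a*x))/a."""
    return x + math.log1p(math.exp(-a * x)) / a

def train(points, labels, a=5.0, lam=0.01, lr=0.1, steps=2000):
    """1-D smooth-SVM toy: minimize mean smooth-hinge loss + L2 penalty
    by gradient descent. labels are +1/-1."""
    w, b = 0.0, 0.0
    n = len(points)
    for _ in range(steps):
        gw, gb = lam * w, 0.0
        for x, y in zip(points, labels):
            margin = 1 - y * (w * x + b)
            g = 1.0 / (1.0 + math.exp(-a * margin))  # p'(margin) = sigmoid
            gw += -g * y * x / n
            gb += -g * y / n
        w -= lr * gw
        b -= lr * gb
    return w, b
```

    The derivative of p is exactly the logistic sigmoid of a·x, which is why the gradient step above needs no subgradient machinery; as a grows, p converges to the hinge itself.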

  16. [Integrated Development of Full-automatic Fluorescence Analyzer].

    Science.gov (United States)

    Zhang, Mei; Lin, Zhibo; Yuan, Peng; Yao, Zhifeng; Hu, Yueming

    2015-10-01

    In view of the fact that medical inspection equipment sold on the domestic market is mainly imported from abroad and very expensive, we developed a full-automatic fluorescence analyzer in our center, presented in this paper. The paper introduces in detail the hardware architecture (an FPGA/DSP motion control card, a PC and an STM32 embedded microprocessor unit), the software system based on C# multi-threading, and the design and implementation of communication between the two units. By simplifying the hardware structure, selecting hardware appropriately and adopting object-oriented techniques in the control software, we have significantly improved the precision and speed of the control system. Finally, performance tests showed that the control system meets the needs of an automated fluorescence analyzer in functionality, performance and cost.

  17. ANALYZER OF QUANTITY AND QUALITY OF THE ELECTRIC POWER

    Directory of Open Access Journals (Sweden)

    A. I. Semilyak

    2013-01-01

    Full Text Available One of the activities of the research center for “Energy Saving Technologies and Smart Metering in Electrical Power Engineering” is research work on the use of electronic devices and systems of intelligent power distribution, produced by Analog Devices and equipped with accurate energy consumption measurement. The article focuses on the development of the analyzer of quantity and quality of electric energy. The main part of the analyzer is a metering IC by Analog Devices, the ADE7878, designed for use in commercial and industrial smart electricity meters. Such meters measure the amount of consumed or produced electric energy with high accuracy and have the means for remote meter reading.
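    At its core, a metering IC of this kind accumulates active energy as the discrete time integral of instantaneous power over sampled voltage and current. A schematic illustration of that accumulation; the real chip's sampling, filtering and calibration details differ.

```python
# Active-energy accumulation sketch: E = sum(v_n * i_n) * dt, i.e. the
# discrete time integral of instantaneous power p(t) = v(t) * i(t).

def active_energy(volts, amps, dt):
    """Accumulated energy in joules for sampled v(t), i(t) at step dt [s]."""
    return sum(v * i for v, i in zip(volts, amps)) * dt
```

    For a steady 10 V at 2 A, five one-second samples accumulate 100 J; a meter then scales such totals into kilowatt-hours for billing.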

  18. What Hold us Together? Analyzing Biotech Field Formation

    Directory of Open Access Journals (Sweden)

    Jackeline Amantino de Andrade

    2011-03-01

    Full Text Available This article proposes to analyze the formation of the biotechnology field through the lens of actor-network theory. Based on conclusions of studies developed by Walter Powell and colleagues, a study was conducted to analyze the diversity of institutional relations activated by hemophilia therapies; the principle of generalized symmetry adopted by actor-network theory is highlighted to identify how socio-technical associations are assembled. Besides the interorganizational relations, the research's findings indicate that scientific and technological contents have a significant mediating role in creating and sustaining those connections of knowledge. The need for a broader theoretical discussion is therefore emphasized, to enlarge explanations about the dynamics of organizational fields as well as innovation processes.

  19. Fluidization quality analyzer for fluidized beds

    Science.gov (United States)

    Daw, C.S.; Hawk, J.A.

    1995-07-25

    A control loop and fluidization quality analyzer for a fluidized bed utilizes time-varying pressure-drop measurements. A fast-response pressure transducer measures the pressure drop across the overall bed, or across some segment of the bed, and the pressure-drop signal is processed to produce an output voltage which changes with the degree of fluidization turbulence. 9 figs.
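    One simple way to turn a time-varying pressure-drop signal into a single fluidization-quality number is the RMS fluctuation about the mean. The patent's actual signal processing is not specified here, so this sketch and its synthetic signals are purely illustrative:

```python
import math
import random

def turbulence_index(pressure_drops):
    """Root-mean-square fluctuation of the bed pressure-drop signal.

    A rough proxy for fluidization quality: a smoothly bubbling bed gives
    small fluctuations, while a slugging/turbulent bed gives large ones.
    """
    n = len(pressure_drops)
    mean = sum(pressure_drops) / n
    return math.sqrt(sum((p - mean) ** 2 for p in pressure_drops) / n)

# Synthetic signals: a quiet bed vs. a strongly fluctuating one (units arbitrary).
random.seed(1)
quiet = [10.0 + random.gauss(0, 0.1) for _ in range(1000)]
slugging = [10.0 + random.gauss(0, 1.5) for _ in range(1000)]

assert turbulence_index(quiet) < turbulence_index(slugging)
```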

  20. SINDA, Systems Improved Numerical Differencing Analyzer

    Science.gov (United States)

    Fink, L. C.; Pan, H. M. Y.; Ishimoto, T.

    1972-01-01

    Computer program has been written to analyze group of 100-node areas and then provide for summation of any number of 100-node areas to obtain temperature profile. SINDA program options offer user variety of methods for solution of thermal analog modes presented in network format.

  1. Analyzing the Acoustic Beat with Mobile Devices

    Science.gov (United States)

    Kuhn, Jochen; Vogt, Patrik; Hirth, Michael

    2014-01-01

    In this column, we have previously presented various examples of how physical relationships can be examined by analyzing acoustic signals using smartphones or tablet PCs. In this example, we will be exploring the acoustic phenomenon of beats, which is produced by the overlapping of two tones with a small difference in frequency Δf. The…
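    The beat phenomenon follows from the identity sin a + sin b = 2 cos((a−b)/2) sin((a+b)/2): the sum of two nearby tones is a tone at the mean frequency whose amplitude pulses at |f1 − f2|. A minimal numeric check (the frequencies are chosen arbitrarily for illustration):

```python
import math

def beat_frequency(f1, f2):
    """Two tones at nearby frequencies f1, f2 are heard as one tone at the
    mean frequency whose loudness pulses at the beat frequency |f1 - f2|."""
    return abs(f1 - f2)

def superposition(f1, f2, t):
    return math.sin(2 * math.pi * f1 * t) + math.sin(2 * math.pi * f2 * t)

def envelope_product(f1, f2, t):
    # 2 cos(pi (f1 - f2) t) * sin(pi (f1 + f2) t): slow envelope times fast carrier
    return 2 * math.cos(math.pi * (f1 - f2) * t) * math.sin(math.pi * (f1 + f2) * t)

# 440 Hz and 442 Hz tuning forks beat twice per second...
assert beat_frequency(440.0, 442.0) == 2.0
# ...and the trigonometric identity holds at any instant:
for t in (0.0, 0.1, 0.25, 0.7):
    assert abs(superposition(440, 442, t) - envelope_product(440, 442, t)) < 1e-9
```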

  2. Automatic radioxenon analyzer for CTBT monitoring

    International Nuclear Information System (INIS)

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring compliance with a CTBT. Activity ratios permit discriminating radioxenon from a nuclear detonation from that produced by nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m³/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 μBq/m³ of air, about 100-fold better than reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed, and the calculated radioxenon concentrations and raw gamma-ray spectra are automatically transmitted to data centers
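    Decay-correcting measured activities back to collection time, and forming the activity ratios used for source discrimination, can be sketched as follows. The half-lives come from the abstract; the ratio interpretation in the comment is illustrative, not a threshold from the PNNL system:

```python
HALF_LIFE_H = {            # half-lives in hours, from the abstract
    "Xe-131m": 11.9 * 24,
    "Xe-133m": 2.19 * 24,
    "Xe-133":  5.24 * 24,
    "Xe-135":  9.10,
}

def decay_correct(activity, nuclide, hours_elapsed):
    """Back-correct a measured activity to collection time: A0 = A * 2^(t / T_half)."""
    return activity * 2 ** (hours_elapsed / HALF_LIFE_H[nuclide])

def xe135_xe133_ratio(a135, a133, hours_elapsed):
    """Short-lived 135Xe decays quickly, so a high 135Xe/133Xe activity ratio at
    collection time suggests a fresh release rather than steady reactor effluent."""
    return (decay_correct(a135, "Xe-135", hours_elapsed)
            / decay_correct(a133, "Xe-133", hours_elapsed))

# After 9.1 h (one 135Xe half-life), equal measured activities imply the
# 135Xe activity at collection was twice the measured value.
assert abs(decay_correct(100.0, "Xe-135", 9.10) - 200.0) < 1e-9
```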

  3. How to Analyze Company Using Social Network?

    Science.gov (United States)

    Palus, Sebastian; Bródka, Piotr; Kazienko, Przemysław

    Every company or institution wants to utilize its resources in the most efficient way. In order to do so, it has to have a good structure. This paper presents a new way to analyze company structure by utilizing the natural social network that exists within the company, together with an example of its usage on the Enron company.
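    A first step in such an analysis is computing centrality over the communication graph. A minimal degree-centrality sketch on a toy email graph (the names are hypothetical, not actual Enron data):

```python
from collections import defaultdict

def degree_centrality(edges):
    """Normalized degree centrality from an undirected edge list.

    People with many distinct communication partners surface as likely
    coordination hubs in a company's informal structure.
    """
    neighbors = defaultdict(set)
    for a, b in edges:
        neighbors[a].add(b)
        neighbors[b].add(a)
    n = len(neighbors)
    return {person: len(nb) / (n - 1) for person, nb in neighbors.items()}

# Toy email graph (hypothetical roles, for illustration only):
emails = [("ceo", "cfo"), ("ceo", "trader1"), ("ceo", "trader2"),
          ("cfo", "trader1"), ("trader2", "assistant")]
c = degree_centrality(emails)
assert max(c, key=c.get) == "ceo"   # the best-connected node
```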

  4. Environmental applications of the centrifugal fast analyzer

    International Nuclear Information System (INIS)

    Goldstein, G.; Strain, J.E.; Bowling, J.L.

    1975-12-01

    The centrifugal fast analyzer (GeMSAEC Fast Analyzer) was applied to the analysis of pollutants in air and water. Since data acquisition and processing are computer controlled, considerable effort went into devising appropriate software. A modified version of the standard FOCAL interpreter was developed which includes special machine language functions for data timing, acquisition, and storage, and also permits chaining together of programs stored on a disk. Programs were written and experimental procedures developed to implement spectrophotometric, turbidimetric, kinetic (including initial-rate, fixed-time, and variable-time techniques), and chemiluminescence methods of analysis. Analytical methods were developed for the following elements and compounds: SO₂, O₃, Ca, Cr, Cu, Fe, Mg, Se(IV), Zn, Cl⁻, I⁻, NO₂⁻, PO₄³⁻, S²⁻, and SO₄²⁻. In many cases, standard methods could be adapted to the centrifugal analyzer; in others new methods were employed. In general, analyses performed with the centrifugal fast analyzer were faster, more precise, and more accurate than with conventional instrumentation
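    The initial-rate kinetic technique mentioned above reduces to simple arithmetic on the early part of the signal: fit a straight line to absorbance vs. time and read the slope. A sketch, with invented data values for illustration:

```python
def initial_rate(times, absorbances):
    """Ordinary least-squares slope of absorbance vs. time: the initial rate."""
    n = len(times)
    mt = sum(times) / n
    ma = sum(absorbances) / n
    num = sum((t - mt) * (a - ma) for t, a in zip(times, absorbances))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

# In a kinetic assay the analyte concentration is proportional to the initial
# rate; the calibration constant would come from standards (not shown here).
times = [0, 1, 2, 3, 4]                      # s
absorb = [0.000, 0.021, 0.039, 0.062, 0.080]  # absorbance units (invented)
rate = initial_rate(times, absorb)            # ~0.020 AU/s
assert 0.015 < rate < 0.025
```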

  5. Analyzing Vessel Behavior Using Process Mining

    NARCIS (Netherlands)

    Maggi, F.M.; Mooij, A.J.; Aalst, W.M.P. van der

    2013-01-01

    In the maritime domain, electronic sensors such as AIS receivers and radars collect large amounts of data about the vessels in a certain geographical area. We investigate the use of process mining techniques for analyzing the behavior of the vessels based on these data. In the context of maritime

  6. Strengthening 4-H by Analyzing Enrollment Data

    Science.gov (United States)

    Hamilton, Stephen F.; Northern, Angela; Neff, Robert

    2014-01-01

    The study reported here used data from the ACCESS 4-H Enrollment System to gain insight into strengthening New York State's 4-H programming. Member enrollment lists from 2009 to 2012 were analyzed using Microsoft Excel to determine trends and dropout rates. The descriptive data indicate declining 4-H enrollment in recent years and peak enrollment…

  7. Analyzing Broadband Divide in the Farming Sector

    DEFF Research Database (Denmark)

    Jensen, Michael; Gutierrez Lopez, Jose Manuel; Pedersen, Jens Myrup

    2013-01-01

    The agriculture industry has been evolving for centuries. Currently, the technological development of Internet-oriented farming tools allows increasing the productivity and efficiency of this sector. Many of the already available tools and applications require high bandwidth in both directions, upstream and downstream. The main constraint is that farms are naturally located in rural areas where the required access broadband data rates are not available. This paper studies the broadband divide in relation to the Danish agricultural sector. Results show that there is an important difference between the broadband availability for farms and for the rest of the households/buildings in the country. This divide may be slowing down the potential technological development of the farming industry and its ability to keep its competitiveness in the market. Therefore, broadband development in rural areas...

  8. Technology Policy and Employment.

    Science.gov (United States)

    Williams, Bruce

    1983-01-01

    Current social and economic problems in the United Kingdom are placed in the context of long-term trends in labor economics and the impact of new technology. The relationship of technological change and economic recovery is analyzed. Policy implications and the university's role are discussed. (MSE)

  9. Analyzing Music Services Positioning Through Qualitative Research

    OpenAIRE

    Cuadrado, Manuel; Miquel, María José; Montoro, Juan D.

    2015-01-01

    Information technologies have produced new ways of distributing and consuming music, mainly by youth, in relation to both goods and services. In the case of goods, there has been a dramatic shift from traditional ways of buying and listening to music to new digital platforms. There has also been an evolution in relation to music services. In this sense, live music concerts have been losing their audiences over the past few years, as have music radio stations, in favor of streaming platforms. ...

  10. BLOCKBENCH: A Framework for Analyzing Private Blockchains

    OpenAIRE

    Dinh, Tien Tuan Anh; Wang, Ji; Chen, Gang; Liu, Rui; Ooi, Beng Chin; Tan, Kian-Lee

    2017-01-01

    Blockchain technologies are taking the world by storm. Public blockchains, such as Bitcoin and Ethereum, enable secure peer-to-peer applications like crypto-currency or smart contracts. Their security and performance are well studied. This paper concerns recent private blockchain systems designed with stronger security (trust) assumptions and performance requirements. These systems target, and aim to disrupt, applications which have so far been implemented on top of database systems, for example ...

  11. Generating and analyzing synthetic finger vein images

    OpenAIRE

    Hillerström, Fieke; Kumar, Ajay; Veldhuis, Raymond N.J.

    2014-01-01

    Abstract: The finger-vein biometric offers higher degree of security, personal privacy and strong anti-spoofing capabilities than most other biometric modalities employed today. Emerging privacy concerns with the database acquisition and lack of availability of large scale finger-vein database have posed challenges in exploring this technology for large scale applications. This paper details the first such attempt to synthesize finger-vein images and presents analysis of synthesized images fo...

  12. Methodology for analyzing risk at nuclear facilities

    International Nuclear Information System (INIS)

    Yoo, Hosik; Lee, Nayoung; Ham, Taekyu; Seo, Janghoon

    2015-01-01

    Highlights: • A new methodology for evaluating the risk at nuclear facilities was developed. • Five measures reflecting all factors that should be considered in assessing risk were developed. • Attributes covering NMAC and nuclear security culture are included in the analysis. • The newly developed methodology can be used to evaluate the risk of both existing facilities and future nuclear systems. - Abstract: A methodology for evaluating risks at nuclear facilities is developed in this work. A series of measures is drawn from the analysis of factors that determine risks. Five measures are created to evaluate risks at nuclear facilities: the legal and institutional framework, material control, physical protection system effectiveness, human resources, and consequences. Evaluation attributes are developed for each measure, and specific values are given in order to calculate the risk value quantitatively. Questionnaires are drawn up on whether or not a state has properly established a legal and regulatory framework (based on international standards). These questionnaires can be a useful measure for comparing the status of the physical protection regime between two countries. Analyzing an insider threat is not an easy task, and no methodology had previously been developed for this purpose. In this study, attributes that can quantitatively evaluate an insider threat, in the case of an unauthorized removal of nuclear materials, are developed by adopting the Nuclear Material Accounting & Control (NMAC) system. The effectiveness of a physical protection system, P(E), can be analyzed by calculating the probability of interruption, P(I), and the probability of neutralization, P(N). In this study, the Tool for Evaluating Security System (TESS) code developed by KINAC is used to calculate P(I) and P(N). Consequence is an important measure used to analyze risks at nuclear facilities. This measure comprises radiological, economic, and social damage. Social and

  13. Laser Technology.

    Science.gov (United States)

    Gauger, Robert

    1993-01-01

    Describes lasers and indicates that learning about laser technology and creating laser technology activities are among the teacher enhancement processes needed to strengthen technology education. (JOW)

  14. Real time speech formant analyzer and display

    Science.gov (United States)

    Holland, George E.; Struve, Walter S.; Homer, John F.

    1987-01-01

    A speech analyzer for interpretation of sound includes a sound input which converts the sound into a signal representing the sound. The signal is passed through a plurality of frequency pass filters to derive a plurality of frequency formants. These formants are converted to voltage signals by frequency-to-voltage converters and then are prepared for visual display in continuous real time. Parameters from the inputted sound are also derived and displayed. The display may then be interpreted by the user. The preferred embodiment includes a microprocessor which is interfaced with a television set for display of the sound formants. The microprocessor software enables the sound analyzer to present a variety of display modes for interpretive and therapeutic use by the user.

  15. Analyzing public health policy: three approaches.

    Science.gov (United States)

    Coveney, John

    2010-07-01

    Policy is an important feature of public and private organizations. Within the field of health as a policy arena, public health has emerged in which policy is vital to decision making and the deployment of resources. Public health practitioners and students need to be able to analyze public health policy, yet many feel daunted by the subject's complexity. This article discusses three approaches that simplify policy analysis: Bacchi's "What's the problem?" approach examines the way that policy represents problems. Colebatch's governmentality approach provides a way of analyzing the implementation of policy. Bridgman and Davis's policy cycle allows for an appraisal of public policy development. Each approach provides an analytical framework from which to rigorously study policy. Practitioners and students of public health gain much in engaging with the politicized nature of policy, and a simple approach to policy analysis can greatly assist one's understanding and involvement in policy work.

  16. Miniature multichannel analyzer for process monitoring

    International Nuclear Information System (INIS)

    Halbig, J.K.; Klosterbuer, S.F.; Russo, P.A.; Sprinkle, J.K. Jr.; Stephens, M.M.; Wiig, L.G.; Ianakiev, K.D.

    1993-01-01

    A new, 4,000-channel analyzer has been developed for gamma-ray spectroscopy applications. A design philosophy of hardware and software building blocks has been combined with design goals of simplicity, compactness, portability, and reliability. The result is a miniature, modular multichannel analyzer (MMMCA), which offers a solution to a variety of nondestructive assay (NDA) needs in many areas of general application, independent of computer platform or operating system. Detector-signal analog electronics, the bias supply, and batteries are included in the virtually pocket-size, low-power MMMCA unit. The MMMCA features digital setup and control, automated data reduction, and automated quality assurance. Areas of current NDA application include on-line continuous (process) monitoring, process material holdup measurements, and field inspections
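    The heart of any multichannel analyzer is histogramming digitized pulse heights into channels. A minimal software model of that step (the 4096-channel count and 10 V full scale below are assumptions for illustration, not MMMCA specifications):

```python
def bin_pulses(pulse_heights, n_channels=4096, full_scale=10.0):
    """Sort detector pulse heights (volts) into MCA channels.

    Each pulse increments exactly one channel; the accumulated histogram
    is the gamma-ray spectrum.
    """
    spectrum = [0] * n_channels
    for v in pulse_heights:
        ch = min(int(v / full_scale * n_channels), n_channels - 1)
        spectrum[ch] += 1
    return spectrum

# Nearly equal pulse heights land in the same channel; distinct energies separate.
spectrum = bin_pulses([1.0, 1.0005, 5.0, 9.999])
assert sum(spectrum) == 4
assert spectrum[409] == 2
```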

  17. Testing the Application for Analyzing Structured Entities

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2011-01-01

    Full Text Available The paper presents the testing process of the application for the analysis of structured text entities. The structured entities are presented. Quality characteristics of structured entities are identified and analyzed. The design and building processes are presented. Rules for building structured entities are described. The steps of building the application for the analysis of structured text entities are presented. The objective of the testing process is defined. Ways of testing the application on components and as a whole are established. A testing strategy for different objectives is proposed. The behavior of users during the testing period is analyzed. Statistical analysis regarding the behavior of users in processes of infinite resources access are realized.

  18. A new approach to analyzing vulnerability

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.; Carlson, R.L.; Riedeman, G.W.

    1986-01-01

    The Westinghouse Hanford Company (WHC) has recently completed construction of the Fuel Cycle Plant (FCP) at Richland, Washington. At start-up the facility will fabricate driver fuel for the Fast Flux Test Facility on the Secure Automated Fabrication line. After construction completion, but before facility certification, the Department of Energy (DOE) Richland Operations Office requested that a vulnerability analysis be performed which assumed multiple insiders as a threat to the security system. A unique method of analyzing facility vulnerabilities was developed at the Security Applications Center (SAC), which is managed by WHC for DOE. The method verifies a previous vulnerability assessment and introduces a modeling technique which analyzes security alarms in relation to delaying factors and possible insider activities. With this information it is possible to assess the relative strength or weakness of various possible routes to and from a target within a facility.
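    In vulnerability analyses of this kind, system effectiveness is commonly expressed as P(E) = P(I) × P(N): the response force must both interrupt the adversary in time and then neutralize them. A toy comparison of two hypothetical routes (the probabilities are invented for illustration):

```python
def system_effectiveness(p_interruption, p_neutralization):
    """P(E) = P(I) * P(N): both interruption and neutralization must succeed."""
    return p_interruption * p_neutralization

# Comparing two hypothetical routes to the same target:
route_a = system_effectiveness(0.90, 0.80)   # ~0.72
route_b = system_effectiveness(0.95, 0.60)   # ~0.57
# The lower-P(E) route is the weaker path, i.e. the one an adversary would favor.
assert route_a > route_b
```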

  19. Real-time airborne particle analyzer

    Science.gov (United States)

    Reilly, Peter T.A.

    2012-10-16

    An aerosol particle analyzer includes a laser ablation chamber, a gas-filled conduit, and a mass spectrometer. The laser ablation chamber can be operated at a low pressure, which can be from 0.1 mTorr to 30 mTorr. The ablated ions are transferred into a gas-filled conduit. The gas-filled conduit reduces the electrical charge and the speed of ablated ions as they collide and mix with buffer gases in the conduit. Preferably, the gas-filled conduit includes an electromagnetic multipole structure that collimates the nascent ions into a beam, which is guided into the mass spectrometer. Because the gas-filled conduit allows storage of vast quantities of the ions from the ablated particles, the ions from a single ablated particle can be analyzed multiple times and by a variety of techniques to supply statistically meaningful analysis of composition and isotope ratios.

  20. Development of a nuclear plant analyzer (NPA)

    International Nuclear Information System (INIS)

    De Vlaminck, M.; Mampaey, L.; Vanhoenacker, L.; Bastenaire, F.

    1990-01-01

    A Nuclear Plant Analyzer has been developed by TRACTEBEL. Three distinct functional units make up the Nuclear Plant Analyzer: a model builder, a run-time unit, and an analysis unit. The model builder is intended to build simulation models that describe, on the one hand, the geometric structure and initial conditions of a given plant and, on the other, its command-and-control logic and reactor protection systems. The run-time unit handles the dialog between the user and the thermal-hydraulic code. The analysis unit is aimed at in-depth analysis of the transient results. The model builder is being tested in the framework of International Standard Problem ISP-26, the simulation of a LOCA on the Japanese ROSA facility

  1. Computer-based radionuclide analyzer system

    International Nuclear Information System (INIS)

    Ohba, Kengo; Ishizuka, Akira; Kobayashi, Akira; Ohhashi, Hideaki; Tsuruoka, Kimitoshi.

    1978-01-01

    Radionuclide analysis in nuclear power plants, practiced for the purposes of monitoring primary-loop water quality, confirming the performance of the reactor cleanup system, and monitoring radioactive waste effluent, is an important job. Important as it is, it requires considerable expert labor, because the samples to be analyzed are multifarious and very numerous, and the job depends heavily on manual work. With a view to saving labor, simplifying and standardizing the work, reducing radiation exposure, and automating the analysis, a computerized analyzer system has been developed. Performance tests at an operating power plant have shown that the development fairly accomplished these objectives and that the system is quite useful. The developmental work was carried out in cooperation between The Tokyo Electric Power Co. and Toshiba over about 4 years, from 1974 to this year. (auth.)

  2. Neutral Particle Analyzer Diagnostic on NSTX

    International Nuclear Information System (INIS)

    Medley, S.S.; Roquemore, A.L.

    2004-01-01

    The Neutral Particle Analyzer (NPA) diagnostic on the National Spherical Torus Experiment (NSTX) utilizes a PPPL-designed E||B spectrometer that measures the energy spectra of minority hydrogen and bulk deuterium species simultaneously, with 39 energy channels per mass species and a time resolution of 1 ms. The calibrated energy range is E = 0.5-150 keV and the energy resolution varies from ΔE/E = 3-7% over the surface of the microchannel plate detector

  3. Analyzing Gender Stereotyping in Bollywood Movies

    OpenAIRE

    Madaan, Nishtha; Mehta, Sameep; Agrawaal, Taneea S; Malhotra, Vrinda; Aggarwal, Aditi; Saxena, Mayank

    2017-01-01

    The presence of gender stereotypes in many aspects of society is a well-known phenomenon. In this paper, we focus on studying such stereotypes and bias in Hindi movie industry (Bollywood). We analyze movie plots and posters for all movies released since 1970. The gender bias is detected by semantic modeling of plots at inter-sentence and intra-sentence level. Different features like occupation, introduction of cast in text, associated actions and descriptions are captured to show the pervasiv...

  4. Neutral Particle Analyzer Diagnostic on NSTX

    Energy Technology Data Exchange (ETDEWEB)

    S.S. Medley; A.L. Roquemore

    2004-03-16

    The Neutral Particle Analyzer (NPA) diagnostic on the National Spherical Torus Experiment (NSTX) utilizes a PPPL-designed E||B spectrometer that measures the energy spectra of minority hydrogen and bulk deuterium species simultaneously, with 39 energy channels per mass species and a time resolution of 1 ms. The calibrated energy range is E = 0.5-150 keV and the energy resolution varies from ΔE/E = 3-7% over the surface of the microchannel plate detector.

  5. A seal analyzer for testing container integrity

    International Nuclear Information System (INIS)

    McDaniel, P.; Jenkins, C.

    1988-01-01

    This paper reports on the development of a laboratory and production seal analyzer that offers a rapid, nondestructive method of assuring the seal integrity of virtually any type of single- or double-sealed container. The system can test a broad range of metal cans, drums and trays, membrane-lidded vessels, flexible pouches, aerosol containers, and glass or metal containers with twist-top lids that are used in the chemical/pesticide (hazardous materials/waste), beverage, food, medical, and pharmaceutical industries

  6. Information decomposition method to analyze symbolical sequences

    International Nuclear Information System (INIS)

    Korotkov, E.V.; Korotkova, M.A.; Kudryashov, N.A.

    2003-01-01

    The information decomposition (ID) method to analyze symbolical sequences is presented. This method allows us to reveal a latent periodicity of any symbolical sequence. The ID method is shown to have advantages in comparison with application of the Fourier transformation, the wavelet transform and the dynamic programming method to look for latent periodicity. Examples of the latent periods for poetic texts, DNA sequences and amino acids are presented. Possible origin of a latent periodicity for different symbolical sequences is discussed
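    A simplified stand-in for the information-decomposition idea is the mutual information between each symbol and its position modulo a trial period: a genuinely latent period scores high even when no exact repeat is visible. This is an illustrative reconstruction, not the authors' exact formula:

```python
import math
from collections import Counter

def phase_symbol_information(seq, period):
    """Mutual information (bits) between each symbol and its position modulo
    `period`. High values flag a latent periodicity at that trial period."""
    n = len(seq)
    joint = Counter((i % period, s) for i, s in enumerate(seq))
    phase = Counter(i % period for i in range(n))
    sym = Counter(seq)
    mi = 0.0
    for (ph, s), c in joint.items():
        p = c / n
        mi += p * math.log2(p / ((phase[ph] / n) * (sym[s] / n)))
    return mi

s = "abc" * 8   # a period-3 sequence of 24 symbols
# The true period carries full information (log2 3 bits); a wrong trial period carries none.
assert abs(phase_symbol_information(s, 3) - math.log2(3)) < 1e-9
assert abs(phase_symbol_information(s, 4)) < 1e-9
```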

  7. Analyzing the Existing Undergraduate Engineering Leadership Skills

    OpenAIRE

    Hamed M. Almalki; Luis Rabelo; Charles Davis; Hammad Usmani; Debra Hollister; Alfonso Sarmiento

    2016-01-01

    Purpose: Studying and analyzing undergraduate engineering students' leadership skills to discover their potential leadership strengths and weaknesses. This study will unveil potential ways to enhance how we teach engineering leadership. The research offers insights that might assist engineering programs in improving curricula for the purpose of better engineering preparation to meet industry's demands. Methodology and Findings: 441 undergraduate engineering students have been s...

  8. General methods for analyzing bounded proportion data

    OpenAIRE

    Hossain, Abu

    2017-01-01

    This thesis introduces two general classes of models for analyzing proportion response variable when the response variable Y can take values between zero and one, inclusive of zero and/or one. The models are inflated GAMLSS model and generalized Tobit GAMLSS model. The inflated GAMLSS model extends the flexibility of beta inflated models by allowing the distribution on (0,1) of the continuous component of the dependent variable to come from any explicit or transformed (i.e. logit or truncated...

  9. The analyzing of Dove marketing strategy

    Institute of Scientific and Technical Information of China (English)

    Guo; Yaohui

    2015-01-01

    1. Introduction In this report, I try to analyze the related information about DOVE chocolate. Firstly, I would like to introduce this product. Dove chocolate is one of a series of products launched by Mars, the U.S. multinational food company and the world's largest pet food and snack food manufacturer. It entered China in 1989 and has become China's leading brand of chocolate in

  10. Analyzing negative ties in social networks

    Directory of Open Access Journals (Sweden)

    Mankirat Kaur

    2016-03-01

    Full Text Available Online social networks are a source of sharing information and maintaining personal contacts with other people through social interactions, thus forming virtual communities online. Social networks are crowded with positive and negative relations. Positive relations are formed by support, endorsement and friendship, creating a network of well-connected users, whereas negative relations are a result of opposition, distrust and avoidance, creating disconnected networks. Due to the increase in illegal activities such as masquerading, conspiring and creating fake profiles on online social networks, exploring and analyzing these negative activities has become the need of the hour. Negative ties are usually treated in the same way as positive ties in many theories, such as balance theory and blockmodeling analysis, but the standard concepts of social network analysis do not yield the same results for each kind of tie. This paper presents a survey on analyzing negative ties in social networks through the various types of network analysis techniques used for examining ties, such as status, centrality and power measures. Due to the difference in the characteristics of flow in positive- and negative-tie networks, some of these measures are not applicable to negative ties. The paper also discusses new methods that have been developed specifically for analyzing negative ties, such as negative degree and the h∗ measure, along with measures based on a mixture of positive and negative ties. The different types of social network analysis approaches are reviewed and compared to determine the approach that can most appropriately identify negative ties in online networks. It is found that only a few measures, such as degree and PN centrality, are applicable for identifying outsiders in a network. For applicability in online networks, the performance of the PN measure needs to be verified, and further new measures should be developed based upon the negative-clique concept.
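    The negative-degree idea can be illustrated in a few lines. Note that the PN measure discussed in the text is more involved (it combines both tie signs into one score); this sketch, with hypothetical nodes, shows only the degree-based first step:

```python
from collections import defaultdict

def pn_degrees(edges):
    """Split degree counts for a signed network.

    edges: iterable of (u, v, sign) with sign +1 (trust/friendship)
    or -1 (distrust/opposition). A node accumulating many negative
    ties is a candidate 'outsider'.
    """
    pos = defaultdict(int)
    neg = defaultdict(int)
    for u, v, sign in edges:
        for node in (u, v):
            (pos if sign > 0 else neg)[node] += 1
    return pos, neg

# Toy signed network: a, b, c form a trusting triangle; d opposes all of them.
ties = [("a", "b", +1), ("a", "c", +1), ("b", "c", +1),
        ("d", "a", -1), ("d", "b", -1), ("d", "c", -1)]
pos, neg = pn_degrees(ties)
assert max(neg, key=neg.get) == "d"   # d stands out as the outsider
```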

  11. Testing the Application for Analyzing Structured Entities

    OpenAIRE

    Ion IVAN; Bogdan VINTILA

    2011-01-01

    The paper presents the testing process of the application for the analysis of structured text entities. The structured entities are presented. Quality characteristics of structured entities are identified and analyzed. The design and building processes are presented. Rules for building structured entities are described. The steps of building the application for the analysis of structured text entities are presented. The objective of the testing process is defined. Ways of testing the applicat...

  12. Semantic analyzability in children's understanding of idioms.

    Science.gov (United States)

    Gibbs, R W

    1991-06-01

    This study investigated the role of semantic analyzability in children's understanding of idioms. Kindergartners and first, third, and fourth graders listened to idiomatic expressions either alone or at the end of short story contexts. Their task was to explain verbally the intended meanings of these phrases and then to choose their correct idiomatic interpretations. The idioms presented to the children differed in their degree of analyzability. Some idioms were highly analyzable or decomposable, with the meanings of their parts contributing independently to their overall figurative meanings. Other idioms were nondecomposable because it was difficult to see any relation between a phrase's individual components and the idiom's figurative meaning. The results showed that younger children (kindergartners and first graders) understood decomposable idioms better than they did nondecomposable phrases. Older children (third and fourth graders) understood both kinds of idioms equally well in supporting contexts, but were better at interpreting decomposable idioms than they were at understanding nondecomposable idioms without contextual information. These findings demonstrate that young children better understand idiomatic phrases whose individual parts independently contribute to their overall figurative meanings.

  13. Handheld Fluorescence Microscopy based Flow Analyzer.

    Science.gov (United States)

    Saxena, Manish; Jayakumar, Nitin; Gorthi, Sai Siva

    2016-03-01

    Fluorescence microscopy has the intrinsic advantages of favourable contrast characteristics and a high degree of specificity. Consequently, it has been a mainstay in modern biological inquiry and clinical diagnostics. Despite its reliable nature, fluorescence-based clinical microscopy and diagnostics is a manual, labour-intensive and time-consuming procedure. The article outlines a cost-effective, high-throughput alternative to conventional fluorescence imaging techniques. With system-level integration of custom-designed microfluidics and optics, we demonstrate a fluorescence microscopy based imaging flow analyzer. Using this system we have imaged more than 2900 FITC-labeled fluorescent beads per minute, demonstrating its high-throughput characteristics in comparison to conventional fluorescence microscopy. The issue of motion blur at high flow rates limits the achievable throughput in image-based flow analyzers. Here we address the issue by computationally deblurring the images and show that this restores the morphological features otherwise affected by motion blur. By further optimizing the concentration of the sample solution and flow speeds, along with imaging multiple channels simultaneously, the system is capable of providing a throughput of about 480 beads per second.

  14. A Raman-Based Portable Fuel Analyzer

    Science.gov (United States)

    Farquharson, Stuart

    2010-08-01

    Fuel is the single most important supply during war. Consider that the US Military employs over 25,000 vehicles in Iraq and Afghanistan. Most fuel is obtained locally and must be characterized to ensure proper operation of these vehicles. Fuel properties are currently determined using a deployed chemical laboratory. Unfortunately, each sample requires in excess of 6 hours to characterize. To overcome this limitation, we have developed a portable fuel analyzer capable of determining 7 fuel properties that govern fuel usage. The analyzer uses Raman spectroscopy to measure the fuel samples, without preparation, in 2 minutes. The challenge, however, is that as distilled fractions of crude oil, all fuels are composed of hundreds of hydrocarbon components that boil at similar temperatures, and performance properties cannot be simply correlated to a single component, and certainly not to specific Raman peaks. To meet this challenge, we measured over 800 diesel and jet fuels from around the world and used chemometrics to correlate the Raman spectra to fuel properties. Critical to the success of this approach are laser excitation at 1064 nm to avoid fluorescence interference (many fuels fluoresce) and a rugged interferometer that provides 0.1 cm⁻¹ wavenumber (x-axis) accuracy to guarantee accurate correlations. Here we describe the portable fuel analyzer, the chemometric models, and the successful determination of these 7 fuel properties for over 100 unknown samples provided by the US Marine Corps, US Navy, and US Army.

  15. Method of stabilizing single channel analyzers

    International Nuclear Information System (INIS)

    Fasching, G.E.; Patton, G.H.

    1975-01-01

    A method and apparatus to reduce the drift of single channel analyzers are described. Essentially, this invention employs a time-sharing, or multiplexing, technique to ensure that the outputs from two single channel analyzers (SCAs) maintain the same count ratio regardless of variations in the threshold voltage source or other voltage changes. The multiplexing is accomplished by a flip-flop, actuated by a clock, that changes state to switch the outputs from the individual SCAs before these outputs are sent to a ratio counting scaler. In the particular embodiment disclosed to illustrate the invention, the sulfur content of coal is determined by subjecting the coal to radiation from a neutron-producing source. A photomultiplier and detector system converts the transmitted gamma radiation to an analog voltage signal and sends this signal, after amplification, to an SCA system containing the invention. Therein, at least two single channel analyzers scan the analog signal over different parts of a spectral region. The two outputs may then be sent to a digital multiplexer, so that the multiplexer output contains counts falling within two distinct segments of the region. By dividing these counts by each other, the percentage of sulfur in the coal sample under observation may be determined. (U.S.)
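The count-ratio idea can be illustrated with a toy simulation: one SCA window sits over a "sulfur" gamma line, a second same-width window over pure background, and the ratio of window counts is the concentration-sensitive quantity (all energies and counts are synthetic, not from the patent):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated detector pulses: a "sulfur" line at 5.4 (arb. units)
# on top of a broad background, as seen after amplification.
background = rng.uniform(0.0, 10.0, size=50_000)
sulfur_line = rng.normal(5.4, 0.15, size=5_000)
pulses = np.concatenate([background, sulfur_line])

def sca_counts(amplitudes, lower, upper):
    """A single channel analyzer counts pulses whose height falls in its window."""
    return int(np.sum((amplitudes >= lower) & (amplitudes < upper)))

peak = sca_counts(pulses, 5.0, 5.8)        # window over the sulfur line
reference = sca_counts(pulses, 7.0, 7.8)   # same-width window on pure background
ratio = peak / reference
```

Because a common gain or threshold drift shifts both windows together, the ratio is the quantity the patented multiplexing scheme works to keep stable.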

  16. A incorporação de novas tecnologias nos serviços de saúde: o desafio da análise dos fatores em jogo Adoption of new technologies by health services: the challenge of analyzing relevant factors

    Directory of Open Access Journals (Sweden)

    Evelinda Trindade

    2008-05-01

    Full Text Available The exponential pace at which health technologies are incorporated has been considered one of the reasons for the sector's growing expenditures. These decisions involve multiple levels and stakeholders. Decentralization has multiplied the decision-making levels, with numerous difficult choices and limited resources. The interrelationship between stakeholders is complex, in creative systems with multiple determinants and confounders. This review discusses the interaction between the factors influencing the decisions to incorporate technologies in health services, and proposes a framework for their analysis. The application and intensity of these factors in decision-making on the incorporation of products and programs by health services shape the installed capacity of local and regional networks and modify the health system. Empirical observation of technology-incorporation decision processes in Brazilian health services constitutes an important challenge. Structured recognition and measurement of these variables can help improve proactive planning by health services.

  17. Analyzing critical material demand: A revised approach.

    Science.gov (United States)

    Nguyen, Ruby Thuy; Fishman, Tomer; Zhao, Fu; Imholte, D D; Graedel, T E

    2018-07-15

    Apparent consumption has been widely used as a metric to estimate material demand. However, with advancing technology and the increasing complexity of material use, this metric has become less useful for tracking material flows, estimating recycling feedstocks, and conducting life cycle assessment of critical materials. We call for future research efforts to focus on building a multi-tiered consumption database for the global trade network of critical materials. This approach will help track how raw materials are processed into major components (e.g., motor assemblies) and eventually incorporated into complete pieces of equipment (e.g., wind turbines). Foreseeable challenges involve: 1) the difficulty of obtaining a comprehensive picture of trade partners due to business-sensitive information, 2) the complexity of the materials going into the components of a machine, and 3) the difficulty of maintaining such a database. We propose ways to address these challenges, such as making use of digital design, learning from the experience of building similar databases, and developing a strategy for financial sustainability. We recommend that, with the advancement of information technology, small steps toward building such a database will contribute significantly to our understanding of material flows in society and the associated human impacts on the environment. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Artificial intelligence for analyzing orthopedic trauma radiographs.

    Science.gov (United States)

    Olczak, Jakub; Fahlberg, Niklas; Maki, Atsuto; Razavian, Ali Sharif; Jilert, Anthony; Stark, André; Sköldenberg, Olof; Gordon, Max

    2017-12-01

    Background and purpose - Recent advances in artificial intelligence (deep learning) have shown remarkable performance in classifying non-medical images, and the technology is believed to be the next technological revolution. So far it has never been applied in an orthopedic setting, and in this study we sought to determine the feasibility of using deep learning on skeletal radiographs. Methods - We extracted 256,000 wrist, hand, and ankle radiographs from Danderyd's Hospital and identified 4 classes: fracture, laterality, body part, and exam view. We then selected 5 openly available deep learning networks that were adapted for these images. The most accurate network was benchmarked against a gold standard for fractures. We furthermore compared the network's performance with that of 2 senior orthopedic surgeons who reviewed the images at the same resolution as the network. Results - All networks exhibited an accuracy of at least 90% when identifying laterality, body part, and exam view. The final accuracy for fractures was estimated at 83% for the best performing network. The network performed similarly to the senior orthopedic surgeons when presented with images at the same resolution as the network; Cohen's kappa between the 2 reviewers under these conditions was 0.76. Interpretation - This study supports the use of artificial intelligence for orthopedic radiographs, where it can perform at a human level. While the current implementation lacks important features that surgeons require, e.g. risk of dislocation, classifications, measurements, and combining multiple exam views, these problems have technical solutions that are waiting to be implemented for orthopedics.

  19. Technology-Use Mediation

    DEFF Research Database (Denmark)

    Bansler, Jørgen P.; Havn, Erling C.

    2003-01-01

    This study analyzes how a group of ‘mediators’ in a large, multinational company adapted a computer-mediated communication technology (a ‘virtual workspace’) to the organizational context (and vice versa) by modifying features of the technology, providing ongoing support for users, and promoting appropriate conventions of use. Our findings corroborate earlier research on technology-use mediation, which suggests that such mediators can exert considerable influence on how a particular technology will be established and used in an organization. However, this study also indicates that the process of technology-use mediation is more complex and indeterminate than the earlier literature suggests. In particular, we want to draw attention to the fact that advanced computer-mediated communication technologies are equivocal and that technology-use mediation consequently requires ongoing sensemaking (Weick 1995).

  20. Mass spectrometer calibration of Cosmic Dust Analyzer

    Science.gov (United States)

    Ahrens, Thomas J.; Gupta, Satish C.; Jyoti, G.; Beauchamp, J. L.

    2003-02-01

    The time-of-flight (TOF) mass spectrometer (MS) of the Cosmic Dust Analyzer (CDA) instrument aboard the Cassini spacecraft is expected to be placed in orbit about Saturn to sample submicrometer-diameter ring particles and impact ejecta from Saturn's satellites. The CDA measures a mass spectrum of each particle that impacts the chemical analyzer sector of the instrument. Particles impact a Rh target plate at velocities of 1-100 km/s and produce some 10^-8 to 10^-5 times the particle mass of positive-valence, singly charged ions. These are analyzed via a TOF MS. Initial tests employed a pulsed N2 laser acting on samples of kamacite, pyrrhotite, serpentine, olivine, and Murchison meteorite; the laser induced bursts of ions which were detected with a microchannel plate and a charge sensitive amplifier (CSA). Pulses from the N2 laser (10^11 W/cm^2) are assumed to simulate particle impact. Using aluminum alloy as a test sample, each pulse produces a charge of ~4.6 pC (mostly Al+1), whereas irradiation of a stainless steel target produces a ~2.8 pC (Fe+1) charge. Thus the present system yields ~10^-5% of the laser energy in resulting ions. A CSA signal indicates that, at the position of the microchannel plate, the ion detector geometry is such that some 5% of the laser-induced ions are collected in the CDA geometry. Employing a multichannel plate detector in this MS yields, for Al-Mg-Cu alloy and kamacite targets, well-defined peaks at 24 (Mg+1), 27 (Al+1), and 64 (Cu+1), and at 56 (Fe+1), 58 (Ni+1), and 60 (Ni+1) dalton, respectively.
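The underlying TOF relation, t = L * sqrt(m / (2qU)), reproduces the mass ordering of the reported peaks. A sketch with illustrative drift length and accelerating potential (not the CDA's actual values):

```python
import math

# Time-of-flight of a singly charged ion accelerated through potential U
# and drifting over length L: t = L * sqrt(m / (2 q U)).
AMU = 1.66053906660e-27   # kg per atomic mass unit
Q = 1.602176634e-19       # C, elementary charge
L = 0.23                  # m, drift length (illustrative value)
U = 1000.0                # V, accelerating potential (illustrative value)

def tof(mass_amu, charge=1):
    m = mass_amu * AMU
    return L * math.sqrt(m / (2 * charge * Q * U))

# Arrival times for the Mg+1 (24), Al+1 (27) and Fe+1 (56) peaks.
t_mg, t_al, t_fe = tof(24), tof(27), tof(56)
```

Heavier ions arrive later, and arrival time scales with the square root of mass per charge, which is how the spectrum axis is converted to daltons.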

  1. IRISpy: Analyzing IRIS Data in Python

    Science.gov (United States)

    Ryan, Daniel; Christe, Steven; Mumford, Stuart; Baruah, Ankit; Timothy, Shelbe; Pereira, Tiago; De Pontieu, Bart

    2017-08-01

    IRISpy is a new community-developed open-source software library for analyzing IRIS level 2 data. It is written in Python, a free, cross-platform, general-purpose, high-level programming language. A wide array of scientific computing software packages has already been developed in Python, from numerical computation (NumPy, SciPy, etc.), to visualization and plotting (matplotlib), to solar-physics-specific data analysis (SunPy). IRISpy is currently under development as a SunPy-affiliated package, which means it depends on the SunPy library, follows similar standards and conventions, and is developed with the support of the SunPy development team. IRISpy has two primary data objects, one for analyzing slit-jaw imager data and another for analyzing spectrograph data. Both objects contain basic slicing, indexing, plotting, and animating functionality to allow users to easily inspect, reduce and analyze the data. As part of this functionality the objects can output SunPy Maps, TimeSeries, Spectra, etc. of relevant data slices for easier inspection and analysis. Work is also ongoing to provide additional data analysis functionality, including derivation of systematic measurement errors (e.g. readout noise), exposure time correction, residual wavelength calibration, radiometric calibration, and fine-scale pointing corrections. IRISpy's code base is publicly available through github.com and can be contributed to by anyone. In this poster we demonstrate IRISpy's functionality and the future goals of the project. We also encourage interested users to become involved in further developing IRISpy.

  2. A low power Multi-Channel Analyzer

    International Nuclear Information System (INIS)

    Anderson, G.A.; Brackenbush, L.W.

    1993-06-01

    The instrumentation used in nuclear spectroscopy is generally large, is not portable, and requires a lot of power. Key components of these counting systems are the computer and the Multi-Channel Analyzer (MCA). To assist in performing measurements requiring portable systems, a small, very low power MCA has been developed at Pacific Northwest Laboratory (PNL). This MCA is interfaced with a Hewlett-Packard palmtop computer for portable applications. The MCA can also be connected to an IBM/PC for data storage and analysis. In addition, a real-time display mode allows the user to view spectra as they are collected.
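The core function of an MCA, sorting digitized pulse heights into channels to build an energy spectrum, can be sketched in a few lines (synthetic pulse data and an illustrative 1024-channel, 0-10 V range):

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated digitized pulse heights: two photopeaks on a flat continuum.
pulse_heights = np.concatenate([
    rng.normal(3.0, 0.05, 2_000),   # photopeak 1
    rng.normal(6.5, 0.08, 1_000),   # photopeak 2
    rng.uniform(0.0, 10.0, 5_000),  # continuum
])

# The MCA assigns each pulse to one of n_channels equal-width bins.
n_channels = 1024
spectrum, edges = np.histogram(pulse_heights, bins=n_channels, range=(0.0, 10.0))
peak_channel = int(np.argmax(spectrum))
```

Every pulse lands in exactly one channel, so the spectrum integral equals the total count; the tallest channel sits at the strongest photopeak (3.0 V maps to roughly channel 307 here).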

  3. The SPAR thermal analyzer: Present and future

    Science.gov (United States)

    Marlowe, M. B.; Whetstone, W. D.; Robinson, J. C.

    The SPAR thermal analyzer, a system of finite-element processors for performing steady-state and transient thermal analyses, is described. The processors communicate with each other through the SPAR random access data base. As each processor is executed, all pertinent source data is extracted from the data base and results are stored in the data base. Steady state temperature distributions are determined by a direct solution method for linear problems and a modified Newton-Raphson method for nonlinear problems. An explicit and several implicit methods are available for the solution of transient heat transfer problems. Finite element plotting capability is available for model checkout and verification.
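The modified Newton-Raphson iteration mentioned for nonlinear steady-state problems can be illustrated on a single-node conduction-plus-radiation balance (all coefficients illustrative; SPAR itself operates on full finite-element systems):

```python
# Single-node steady-state energy balance: conduction to a sink at Ts plus
# T^4 radiation, with heat input Q. Solve f(T) = 0 by Newton-Raphson.
k, Ts, s, Q = 2.0, 300.0, 5.67e-8, 500.0   # illustrative coefficients

def f(T):
    return k * (T - Ts) + s * T**4 - Q

def fprime(T):
    return k + 4 * s * T**3

T = 400.0                       # initial temperature guess (K)
for _ in range(50):
    step = f(T) / fprime(T)     # Newton update: T <- T - f(T)/f'(T)
    T -= step
    if abs(step) < 1e-10:
        break
```

The radiation term makes the residual nonlinear in T, which is exactly why an iterative scheme is needed where a linear problem would admit a direct solve.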

  4. Light-weight analyzer for odor recognition

    Energy Technology Data Exchange (ETDEWEB)

    Vass, Arpad A; Wise, Marcus B

    2014-05-20

    The invention provides a lightweight analyzer, e.g., detector, capable of locating clandestine graves. The detector utilizes the very specific and unique chemicals identified in the database of human decompositional odor. This detector, based on specific chemical compounds found relevant to human decomposition, is the next step forward in clandestine grave detection and will take the guesswork out of current methods using canines and ground-penetrating radar, which have historically been unreliable. The detector is self-contained, portable and built for field use. Both visual and auditory cues are provided to the operator.

  5. Analyzing water/wastewater infrastructure interdependencies

    International Nuclear Information System (INIS)

    Gillette, J. L.; Fisher, R. E.; Peerenboom, J. P.; Whitfield, R. G.

    2002-01-01

    This paper describes four general categories of infrastructure interdependencies (physical, cyber, geographic, and logical) as they apply to the water/wastewater infrastructure, and provides an overview of one of the analytic approaches and tools used by Argonne National Laboratory to evaluate interdependencies. Also discussed are the dimensions of infrastructure interdependency that create spatial, temporal, and system representation complexities that make analyzing the water/wastewater infrastructure particularly challenging. An analytical model developed to incorporate the impacts of interdependencies on infrastructure repair times is briefly addressed

  6. Analyzing Argumentation In Rich, Natural Contexts

    Directory of Open Access Journals (Sweden)

    Anita Reznitskaya

    2008-02-01

    Full Text Available The paper presents the theoretical and methodological aspects of research on the development of argumentation in elementary school children. It presents a theoretical framework detailing the psychological mechanisms responsible for the acquisition and transfer of argumentative discourse, and demonstrates several applications of the framework, described in sufficient detail to guide future empirical investigations of oral, written, individual, or group argumentation performance. Software programs capable of facilitating data analysis are identified and their uses illustrated. The analytic schemes can be used to analyze large amounts of verbal data with reasonable precision and efficiency. The conclusion addresses more generally the challenges for and possibilities of empirical study of the development of argumentation.

  7. Nonlinear single-spin spectrum analyzer.

    Science.gov (United States)

    Kotler, Shlomi; Akerman, Nitzan; Glickman, Yinnon; Ozeri, Roee

    2013-03-15

    Qubits have been used as linear spectrum analyzers of their environments. Here we solve the problem of nonlinear spectral analysis, required for discrete noise induced by a strongly coupled environment. Our nonperturbative analytical model shows a nonlinear signal dependence on noise power, resulting in a spectral resolution beyond the Fourier limit as well as frequency mixing. We develop a noise characterization scheme adapted to this nonlinearity. We then apply it using a single trapped ion as a sensitive probe of strong, non-Gaussian, discrete magnetic field noise. Finally, we experimentally compared the performance of equidistant vs Uhrig modulation schemes for spectral analysis.

  8. Thermo Scientific Ozone Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The primary measurement output from the Thermo Scientific Ozone Analyzer is the concentration of the analyte (O3) reported at 1-s resolution in units of ppbv in ambient air. Note that because of internal pneumatic switching limitations the instrument only makes an independent measurement every 4 seconds. Thus, the same concentration number is repeated roughly 4 times at the uniform, monotonic 1-s time base used in the AOS systems. Accompanying instrument outputs include sample temperatures, flows, chamber pressure, lamp intensities and a multiplicity of housekeeping information. There is also a field for operator comments made at any time while data is being collected.
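Given the 4-s update cycle described above, recovering the independent time base from the repeated 1-s rows is a simple block operation (concentration values illustrative):

```python
import numpy as np

# The analyzer reports at 1 s but only updates every 4 s, so each independent
# reading appears ~4 times. Averaging non-overlapping 4-s blocks of the
# 1-s record recovers one value per independent measurement.
reported = np.repeat([31.2, 30.8, 31.5, 29.9], 4)   # ppbv, 16 one-second rows
independent = reported.reshape(-1, 4).mean(axis=1)  # one value per 4-s block
```

This also matters for uncertainty estimates: treating the 1-s rows as independent would overstate the effective sample size by a factor of about four.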

  9. ASDA - Advanced Suit Design Analyzer computer program

    Science.gov (United States)

    Bue, Grant C.; Conger, Bruce C.; Iovine, John V.; Chang, Chi-Min

    1992-01-01

    An ASDA model developed to evaluate the heat and mass transfer characteristics of advanced pressurized suit design concepts for low pressure or vacuum planetary applications is presented. The model is based on a generalized 3-layer suit that uses the Systems Integrated Numerical Differencing Analyzer '85 in conjunction with a 41-node FORTRAN routine. The latter simulates the transient heat transfer and respiratory processes of a human body in a suited environment. The user options for the suit encompass a liquid cooled garment, a removable jacket, a CO2/H2O permeable layer, and a phase change layer.

  10. Development of a Portable Water Quality Analyzer

    Directory of Open Access Journals (Sweden)

    Germán COMINA

    2010-08-01

    Full Text Available A portable water analyzer based on a voltammetric electronic tongue has been developed. The system uses an electrochemical cell with two working electrodes as sensors, a computer-controlled potentiostat, and software based on multivariate data analysis for pattern recognition. The system is suitable for differentiating laboratory-made and real in-situ river water samples contaminated with different amounts of Escherichia coli. This bacterium is not only one of the main indicators of water quality, but also a major public health concern, especially affecting people living in high-burden, resource-limited settings.

  11. CRISP90 - SOFTWARE DESIGN ANALYZER SYSTEM

    Science.gov (United States)

    Tausworthe, R. C.

    1994-01-01

    The CRISP90 Software Design Analyzer System, an update of CRISP-80, is a set of programs forming a software design and documentation tool which supports top-down, hierarchic, modular, structured design and programming methodologies. The quality of a computer program can often be significantly influenced by the design medium in which the program is developed. The medium must foster the expression of the programmer's ideas easily and quickly, and it must permit flexible and facile alterations, additions, and deletions to these ideas as the design evolves. The CRISP90 software design analyzer system was developed to provide the PDL (Programmer Design Language) programmer with such a design medium. A program design using CRISP90 consists of short, English-like textual descriptions of data, interfaces, and procedures that are embedded in a simple, structured, modular syntax. The display is formatted into two-dimensional, flowchart-like segments for a graphic presentation of the design. Together with a good interactive full-screen editor or word processor, the CRISP90 design analyzer becomes a powerful tool for the programmer. In addition to being a text formatter, the CRISP90 system prepares material that would be tedious and error prone to extract manually, such as a table of contents, module directory, structure (tier) chart, cross-references, and a statistics report on the characteristics of the design. Referenced modules are marked by schematic logic symbols to show conditional, iterative, and/or concurrent invocation in the program. A keyword usage profile can be generated automatically and glossary definitions inserted into the output documentation. Another feature is the capability to detect changes that were made between versions. Thus, "change-bars" can be placed in the output document along with a list of changed pages and a version history report. Also, items may be marked as "to be determined" and each will appear on a special table until the item is

  12. Improving respiration measurements with gas exchange analyzers.

    Science.gov (United States)

    Montero, R; Ribas-Carbó, M; Del Saz, N F; El Aou-Ouad, H; Berry, J A; Flexas, J; Bota, J

    2016-12-01

    Dark respiration measurements with open-flow gas exchange analyzers are often questioned for their low accuracy, as the low values often approach the precision limit of the instrument. Respiration was measured in five species: two hypostomatous (Vitis vinifera L. and Acanthus mollis) and three amphistomatous, one with a similar number of stomata on both sides (Eucalyptus citriodora) and two with different stomatal densities (Brassica oleracea and Vicia faba). The CO2 differential (ΔCO2) increased two-fold, with no change in apparent Rd, when the two leaves with higher stomatal density faced outside. These results show a clear effect of the position of stomata on ΔCO2. Therefore, leaf position is important to guarantee improved respiration measurements, increasing ΔCO2 without affecting the respiration results per leaf or mass unit. This method will help to increase the accuracy of leaf respiration measurements using gas exchange analyzers. Copyright © 2016 Elsevier GmbH. All rights reserved.
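ΔCO2 enters the respiration estimate through the standard open-flow mass balance, R_d = u·ΔCO2/A, which is why increasing ΔCO2 at fixed flow improves the signal relative to instrument precision. A sketch with illustrative values (not the paper's data):

```python
# Dark respiration from an open-flow gas exchange system:
#   R_d = u * dCO2 / A
# u    : molar air flow through the cuvette (mol s^-1)
# dCO2 : CO2 mole-fraction difference, outlet minus inlet (umol mol^-1)
# A    : enclosed leaf area (m^2)
flow = 300e-6                     # mol s^-1 (300 umol s^-1), illustrative
leaf_area = 6e-4                  # m^2 (6 cm^2 cuvette), illustrative
co2_out, co2_in = 401.2, 400.0    # umol mol^-1, illustrative readings
delta_co2 = co2_out - co2_in
rd = flow * delta_co2 / leaf_area # umol CO2 m^-2 s^-1
```

A ΔCO2 of ~1 µmol mol⁻¹ is close to the precision limit of an infrared gas analyzer, which is the measurement problem the leaf-orientation method addresses.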

  13. Solar Probe ANalyzer for Ions - Laboratory Performance

    Science.gov (United States)

    Livi, R.; Larson, D. E.; Kasper, J. C.; Korreck, K. E.; Whittlesey, P. L.

    2017-12-01

    The Parker Solar Probe (PSP) mission is a heliospheric satellite that will orbit the Sun closer than any prior mission to date, with a first perihelion of 35 solar radii (RS) that will shrink to about 10 RS over the course of the mission. PSP includes the Solar Wind Electrons Alphas and Protons (SWEAP) instrument suite, which in turn consists of four instruments: the Solar Probe Cup (SPC) and three Solar Probe ANalyzers (SPAN) for ions and electrons. Together, this suite will take local measurements of particles and electromagnetic fields within the Sun's corona. SPAN-Ai has completed flight calibration and spacecraft integration and is set to be launched in July of 2018. Its main mode of operation consists of an electrostatic analyzer (ESA) at its aperture followed by a time-of-flight section to measure the energy and mass per charge (m/q) of the ambient ions. SPAN-Ai's main objective is to measure solar wind ions within an energy range of 5 eV - 20 keV, a mass/q between 1-60 amu/q, and a field of view of 240° x 120°. Here we will show flight calibration results and performance.

  14. Analyzing delay causes in Egyptian construction projects

    Directory of Open Access Journals (Sweden)

    Mohamed M. Marzouk

    2014-01-01

    Full Text Available Construction delays are common problems in civil engineering projects in Egypt. These problems occur frequently during the project lifetime, leading to disputes and litigation. Therefore, it is essential to study and analyze the causes of construction delays. This research presents a list of construction delay causes retrieved from the literature. The feedback of construction experts was obtained through interviews, and a questionnaire survey was subsequently prepared. The questionnaire was distributed to thirty-three construction experts representing owners, consultants, and contractors' organizations. A Frequency Index, Severity Index, and Importance Index are calculated, and according to their highest values the top ten delay causes of construction projects in Egypt are determined. A case study is analyzed and compared against the most important delay causes identified in the research. Statistical analysis of the delay causes obtained from the survey is carried out using the analysis of variance (ANOVA) method. The test results reveal good correlation between groups, although there is a significant difference between them for some delay causes; finally, a roadmap for prioritizing delay cause groups is presented.
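The abstract does not print its index formulas; delay-cause surveys commonly normalize the summed ratings to 100 and take the Importance Index as the product of the frequency and severity indices. A sketch with illustrative responses, under that common definition:

```python
# Common survey-index definitions in delay-cause studies (assumed here,
# since the paper's exact formulas are not given in the abstract):
#   FI = 100 * sum(ratings) / (max_rating * N), similarly SI,
#   Importance Index II = FI * SI / 100.
def index(ratings, max_rating=4):
    """Normalized index (0-100) from ratings on a 0..max_rating scale."""
    return 100.0 * sum(ratings) / (max_rating * len(ratings))

freq_ratings = [4, 3, 4, 2, 3]   # illustrative responses for one delay cause
sev_ratings = [3, 3, 4, 4, 2]
fi = index(freq_ratings)
si = index(sev_ratings)
ii = fi * si / 100.0
```

Ranking causes by II then yields the "top ten" list the study reports.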

  15. Analyzing rare diseases terms in biomedical terminologies

    Directory of Open Access Journals (Sweden)

    Erika Pasceri

    2012-03-01

    Full Text Available Rare disease patients too often face common problems, including the lack of access to a correct diagnosis, lack of quality information on the disease, lack of scientific knowledge of the disease, and inequities and difficulties in access to treatment and care. These things could be changed by implementing a comprehensive approach to rare diseases: increasing international cooperation in scientific research, gaining and sharing scientific knowledge, and developing tools for extracting and sharing knowledge. A significant aspect to analyze is the organization of knowledge in the biomedical field for the proper management and retrieval of health information. For these purposes, the sources needed were acquired from the Office of Rare Diseases Research, the National Organization for Rare Disorders and Orphanet, organizations that provide information to patients and physicians and facilitate the exchange of information among the different actors involved in this field. The present paper shows the representation of rare disease terms in biomedical terminologies such as MeSH, ICD-10, SNOMED CT and OMIM, leveraging the fact that these terminologies are integrated in the UMLS. At the first level, the overlap among sources was analyzed; at the second level, the presence of rare disease terms in target sources included in the UMLS, working at the term and concept level. We found that MeSH has the best representation of rare disease terms.
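The first-level overlap analysis reduces to set operations on each terminology's term set. A toy sketch (the disease names are illustrative placeholders, not the actual vocabulary contents):

```python
# Toy term sets standing in for the rare-disease terms of each terminology.
mesh = {"gaucher disease", "fabry disease", "pompe disease"}
icd10 = {"gaucher disease", "fabry disease", "marfan syndrome"}
snomed = {"gaucher disease", "pompe disease", "marfan syndrome"}

# Terms present in every source, and each source's coverage of the
# union of all known terms.
in_all = mesh & icd10 & snomed
all_terms = mesh | icd10 | snomed
coverage = {name: len(terms & all_terms) / len(all_terms)
            for name, terms in [("MeSH", mesh), ("ICD-10", icd10),
                                ("SNOMED CT", snomed)]}
```

The same intersections and coverage fractions, computed over the real UMLS-integrated vocabularies, underlie the paper's finding that MeSH covers rare disease terms best.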

  16. Analyzing Virtual Physics Simulations with Tracker

    Science.gov (United States)

    Claessens, Tom

    2017-12-01

    In the physics teaching community, Tracker is well known as a user-friendly open source video analysis software, authored by Douglas Brown. With this tool, the user can trace markers indicated on a video or on stroboscopic photos and perform kinematic analyses. Tracker also includes a data modeling tool that allows one to fit theoretical equations of motion onto experimentally obtained data. In the field of particle mechanics, Tracker has been effectively used for learning and teaching about projectile motion, "toss up" and free-fall vertical motion, and to explain the principle of mechanical energy conservation. Tracker has also been successfully used in rigid body mechanics to interpret the results of experiments with rolling/slipping cylinders and moving rods. In this work, I propose an original method in which Tracker is used to analyze virtual computer simulations created with a physics-based motion solver, instead of analyzing video recordings or stroboscopic photos. This could be an interesting approach for studying kinematics and dynamics problems in physics education, in particular when there is no or limited access to physical labs. I demonstrate the working method with a typical (but quite challenging) problem in classical mechanics: a slipping/rolling cylinder on a rough surface.

  17. Automatic analyzing device for chlorine ion

    International Nuclear Information System (INIS)

    Sugibayashi, Shinji; Morikawa, Yoshitake; Fukase, Kazuo; Kashima, Hiromasa.

    1997-01-01

    The present invention provides a device for automatically analyzing a trace amount of chlorine ions contained in the feedwater, condensate and reactor water of a BWR type power plant. Zero adjustment or span calibration in this device is conducted as follows. (1) A standard chlorine ion liquid is supplied from a tank to a mixer by a constant volume pump, where it is diluted and mixed with purified water to form a standard liquid. (2) The pH of the standard liquid is adjusted by a pH adjuster. (3) The standard liquid is supplied to an electrode cell to conduct zero adjustment or span calibration. Chlorine ions in a specimen are measured by the device of the present invention as follows. (1) The specimen is supplied to a head tank through a line filter. (2) The pH of the specimen is adjusted by a pH adjuster. (3) The specimen is supplied to an electrode cell to electrically measure the concentration of chlorine ions in the specimen. The device of the present invention can automatically analyze trace amounts of chlorine ions with high accuracy, thereby improving sensitivity and reducing the operator's burden and radiation exposure. (I.S.)

  18. Plutonium solution analyzer. Revised February 1995

    International Nuclear Information System (INIS)

    Burns, D.A.

    1995-02-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%--0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40--240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4--4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 ml of each sample and standard, and generates waste at the rate of about 1.5 ml per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated); samples are instead handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly, and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).
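The quoted precision figure is a relative standard deviation; for replicate concentration measurements it is simply the sample standard deviation expressed as a percentage of the mean (replicate values below are illustrative, not the analyzer's data):

```python
import statistics

# Relative standard deviation: RSD = 100 * s / mean, the precision
# figure of merit quoted for the analyzer (0.5%--0.05% RSD).
replicates = [120.1, 120.3, 119.9, 120.2, 120.0]   # g/L, illustrative
mean = statistics.mean(replicates)
rsd = 100.0 * statistics.stdev(replicates) / mean
```

An RSD in the 0.1% range, as here, sits inside the 0.5%--0.05% band the abstract claims.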

  19. Optoacoustic 13C-breath test analyzer

    Science.gov (United States)

    Harde, Hermann; Helmrich, Günther; Wolff, Marcus

    2010-02-01

    The composition and concentration of exhaled volatile gases reflect the physical condition of a patient. Therefore, breath analysis makes it possible to recognize an infectious disease in an organ, or even to identify a tumor. One of the most prominent breath tests is the 13C-urea breath test, applied to ascertain the presence of the bacterium Helicobacter pylori in the stomach wall as an indication of a gastric ulcer. In this contribution we present a new optical analyzer that employs a compact and simple set-up based on photoacoustic spectroscopy. It consists of two identical photoacoustic cells containing two breath samples, one taken before and one after administering an isotope-marked substrate, in which the most common isotope 12C is replaced to a large extent by 13C. The analyzer measures simultaneously the relative CO2 isotopologue concentrations in both samples by exciting the molecules on specially selected absorption lines with a semiconductor laser operating at a wavelength of 2.744 μm. For a reliable diagnosis, changes of the 13CO2 concentration of 1% in the exhaled breath have to be detected at a concentration level of this isotopologue in the breath of about 500 ppm.
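The before/after comparison of the two cells is conventionally expressed as a delta-over-baseline (DOB): each sample's 13C/12C ratio is referenced to the VPDB standard and the pre-substrate delta is subtracted. A sketch with illustrative ratios (the VPDB value is the commonly cited one; the paper's own data processing is not described in the abstract):

```python
# Delta-over-baseline arithmetic of a 13C breath test:
#   delta13C = (R_sample / R_VPDB - 1) * 1000  [per mil]
#   DOB = delta13C(post) - delta13C(pre)
R_VPDB = 0.011180   # 13C/12C of the VPDB reference (commonly cited value)

def delta13C(r_sample):
    return (r_sample / R_VPDB - 1.0) * 1000.0

# Illustrative 13C/12C ratios measured before and after the substrate.
r_pre, r_post = 0.0108900, 0.0109500
dob = delta13C(r_post) - delta13C(r_pre)   # per mil
```

A DOB of a few per mil corresponds to roughly the 1% change in 13CO2 concentration that the abstract says must be resolved.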

  20. Performance evaluation of Samsung LABGEO(HC10) Hematology Analyzer.

    Science.gov (United States)

    Park, Il Joong; Ahn, Sunhyun; Kim, Young In; Kang, Seon Joo; Cho, Sung Ran

    2014-08-01

    The Samsung LABGEO(HC10) Hematology Analyzer (LABGEO(HC10)) is a recently developed automated hematology analyzer that uses impedance technologies. The analyzer provides 18 parameters, including a 3-part differential, at a maximum rate of 80 samples per hour. The aim of this study was to evaluate the performance of the LABGEO(HC10). We evaluated precision, linearity, carryover, and the relationship for complete blood cell count parameters between the LABGEO(HC10) and the LH780 (Beckman Coulter Inc) in a university hospital in Korea according to the Clinical and Laboratory Standards Institute guidelines. Sample stability and differences due to the anticoagulant used (K₂EDTA versus K₃EDTA) were also evaluated. The LABGEO(HC10) showed linearity over a wide range and minimal carryover; correlations between the two analyzers were high (r > 0.92) except for mean corpuscular hemoglobin concentration. The estimated bias was acceptable for all parameters investigated except for monocyte count. Most parameters were stable for 24 hours both at room temperature and at 4°C. The difference by anticoagulant type was statistically insignificant for all parameters except for a few red cell parameters. The accuracy achievable and the simplicity of operation make the unit recommendable for small to medium-sized laboratories.
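    Carryover, one of the quantities evaluated above, is commonly estimated with a CLSI-style high/low replicate scheme. A minimal sketch of that calculation; the scheme is standard, but the WBC counts below are hypothetical, not this study's data:

```python
def carryover_percent(high_runs, low_runs):
    """CLSI-style carryover estimate: a high sample measured in triplicate,
    followed by a low sample in triplicate;
    carryover % = (L1 - L3) / (H3 - L3) * 100."""
    h3 = high_runs[2]
    l1, l3 = low_runs[0], low_runs[2]
    return 100.0 * (l1 - l3) / (h3 - l3)

# Hypothetical WBC counts (10^3 cells/uL)
co = carryover_percent(high_runs=[95.1, 94.8, 95.0],
                       low_runs=[1.02, 1.00, 1.00])
print(f"carryover = {co:.3f}%")
```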

  1. Analyzing Solutions High in Total Dissolved Solids for Rare Earth Elements (REEs) Using Cation Exchange and Online Pre-Concentration with the seaFAST2 Unit; NETL-TRS-7-2017; NETL Technical Report Series; U.S. Department of Energy, National Energy Technology Laboratory: Albany, OR, 2017; p 32

    Energy Technology Data Exchange (ETDEWEB)

    Yang, J. [National Energy Technology Lab. (NETL), Albany, OR (United States); Oregon State Univ., Corvallis, OR (United States). College of Earth, Ocean, and Atmospheric Science; Torres, M. [Oregon State Univ., Corvallis, OR (United States). College of Earth, Ocean, and Atmospheric Science; Verba, C. [National Energy Technology Lab. (NETL), Albany, OR (United States); Oregon State Univ., Corvallis, OR (United States); Hakala, A. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States)

    2017-08-01

    The accurate quantification of dissolved rare earth element (REE) concentrations in natural waters is often inhibited by their low abundances relative to other dissolved constituents such as alkali and alkaline earth elements and dissolved solids. The high abundance of these constituents can suppress the overall analytical signal as well as create isobaric interferences on the REEs during analysis. Waters associated with natural gas operations on black shale plays are characterized by high salinities and high total dissolved solids (TDS) contents >150,000 mg/L. Methods used to isolate and quantify dissolved REEs in seawater were adapted in order to develop the capability of analyzing REEs in waters that are high in TDS. First, a synthetic fluid with a TDS loading of 153,000 mg/L was created, based on geochemical modelling of natural brine formation fluids within the Marcellus black shale. To this solution, 1,000 ng/mL of REE standards was added, based on preliminary analyses of experimental fluids reacted at high pressure and temperature with Marcellus black shale. These synthetic fluids were then run at three dilution levels (10-, 100-, and 1,000-fold) through cation exchange columns using AG50-X8 exchange resin from Eichrom Industries. The eluent from the cation columns was then sent through a seaFAST2 unit directly connected to an inductively coupled plasma mass spectrometer (ICP-MS) to analyze the REEs. Percent recoveries of the REEs ranged from 80--110% and fell within error for the external reference standard used, and no signal suppression or isobaric interferences on the REEs were observed. These results demonstrate that the combined use of cation exchange columns and seaFAST2 instrumentation is effective in accurately quantifying the dissolved REEs in fluids that are >150,000 mg/L in TDS and have Ba:Eu ratios in excess of 380,000.
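    The 80--110% recovery window quoted above is a simple ratio of measured to spiked concentration. A minimal sketch of that check; the element names and measured values below are invented for illustration:

```python
def percent_recovery(measured, spiked):
    """Recovery of a spiked standard, in percent."""
    return 100.0 * measured / spiked

# Hypothetical ICP-MS results for REEs spiked at 1,000 ng/mL
spiked = 1000.0
measured = {"La": 985.0, "Eu": 1042.0, "Yb": 898.0}
for element, value in measured.items():
    r = percent_recovery(value, spiked)
    ok = 80.0 <= r <= 110.0  # the acceptance window quoted above
    print(f"{element}: {r:.1f}% {'within window' if ok else 'out of window'}")
```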

  2. Automated Root Tracking with "Root System Analyzer"

    Science.gov (United States)

    Schnepf, Andrea; Jin, Meina; Ockert, Charlotte; Bol, Roland; Leitner, Daniel

    2015-04-01

    Crucial factors for plant development are water and nutrient availability in soils. Thus, root architecture is a main aspect of plant productivity and needs to be accurately considered when describing root processes. Images of root architecture contain a huge amount of information, and image analysis helps to recover parameters describing certain root architectural and morphological traits. The majority of imaging systems for root systems are designed for two-dimensional images, such as RootReader2, GiA Roots, SmartRoot, EZ-Rhizo, and Growscreen, but most of them are semi-automated and require the user to click on each root. "Root System Analyzer" is a new, fully automated approach for recovering root architectural parameters from two-dimensional images of root systems. Individual roots can still be corrected manually in a user interface if required. The algorithm starts with a sequence of segmented two-dimensional images showing the dynamic development of a root system. For each image, morphological operators are used for skeletonization. Based on this, a graph representation of the root system is created. A dynamic root architecture model helps to determine which edges of the graph belong to an individual root. The algorithm elongates each root at the root tip and simulates growth confined within the already existing graph representation. The increment of root elongation is calculated assuming constant growth. For each root, the algorithm finds all possible paths and elongates the root in the direction of the optimal path. In this way, each edge of the graph is assigned to one or more coherent roots. Image sequences of root systems are handled in such a way that the previous image is used as a starting point for the current image. The algorithm is implemented in a set of Matlab m-files. Output of Root System Analyzer is a data structure that includes for each root an identification number, the branching order, the time of emergence, the parent

  3. Toward Sustainable Anticipatory Governance: Analyzing and Assessing Nanotechnology Innovation Processes

    Science.gov (United States)

    Foley, Rider Williams

    Cities around the globe struggle with socio-economic disparities, resource inefficiency, environmental contamination, and quality-of-life challenges. Technological innovation, as one prominent approach to problem solving, promises to address these challenges; yet, introducing new technologies, such as nanotechnology, into society and cities has often resulted in negative consequences. Recent research has conceptually linked anticipatory governance and sustainability science: to understand the role of technology in complex problems our societies face; to anticipate negative consequences of technological innovation; and to promote long-term oriented and responsible governance of technologies. This dissertation advances this link conceptually and empirically, focusing on nanotechnology and urban sustainability challenges. The guiding question for this dissertation research is: How can nanotechnology be innovated and governed in responsible ways and with sustainable outcomes? The dissertation: analyzes the nanotechnology innovation process from an actor- and activities-oriented perspective (Chapter 2); assesses this innovation process from a comprehensive perspective on sustainable governance (Chapter 3); constructs a small set of future scenarios to consider future implications of different nanotechnology governance models (Chapter 4); and appraises the amenability of sustainability problems to nanotechnological interventions (Chapter 5). The four studies are based on data collected through literature review, document analysis, participant observation, interviews, workshops, and walking audits, as part of process analysis, scenario construction, and technology assessment. Research was conducted in collaboration with representatives from industry, government agencies, and civic organizations. The empirical parts of the four studies focus on Metropolitan Phoenix. 
Findings suggest that: predefined mandates and economic goals dominate the nanotechnology innovation process

  4. Buccal microbiology analyzed by infrared spectroscopy

    Science.gov (United States)

    de Abreu, Geraldo Magno Alves; da Silva, Gislene Rodrigues; Khouri, Sônia; Favero, Priscila Pereira; Raniero, Leandro; Martin, Airton Abrahão

    2012-01-01

    Rapid microbiological identification and characterization are very important in dentistry and medicine. In addition to dental diseases, pathogens are directly linked to cases of endocarditis, premature delivery, low birth weight, and loss of organ transplants. Fourier transform infrared spectroscopy (FTIR) was used to analyze the oral pathogens Aggregatibacter actinomycetemcomitans ATCC 29523, Aggregatibacter actinomycetemcomitans JP2, and an Aggregatibacter actinomycetemcomitans clinical isolate from human blood (CI). Significant spectral differences were found among the organisms, allowing the identification and characterization of each bacterial species. Vibrational modes in the regions of 3500-2800 cm-1, 1484-1420 cm-1, and 1000-750 cm-1 were used in this differentiation. The identification and classification of each strain were performed by cluster analysis, achieving 100% separation of strains. This study demonstrated that FTIR can be used to shorten the identification time, compared with traditional methods, for fastidious buccal microorganisms associated with the etiology of periodontitis.

  5. Sensors and Automated Analyzers for Radionuclides

    International Nuclear Information System (INIS)

    Grate, Jay W.; Egorov, Oleg B.

    2003-01-01

    The production of nuclear weapons materials has generated large quantities of nuclear waste and significant environmental contamination. We have developed new, rapid, automated methods for the determination of radionuclides, using sequential injection methodologies to automate extraction chromatographic separations, with on-line flow-through scintillation counting for real-time detection. This work has progressed in two main areas: radionuclide sensors for water monitoring and automated radiochemical analyzers for monitoring nuclear waste processing operations. Radionuclide sensors have been developed that collect and concentrate radionuclides in preconcentrating minicolumns with dual functionality: chemical selectivity for radionuclide capture and scintillation for signal output. These sensors can detect pertechnetate below regulatory levels and have been engineered into a prototype for field testing. A fully automated process monitor has been developed for total technetium in nuclear waste streams. This instrument performs sample acidification, speciation adjustment, separation, and detection in fifteen minutes or less.

  6. Nuclear Plant Analyzer: Installation manual. Volume 1

    International Nuclear Information System (INIS)

    Snider, D.M.; Wagner, K.L.; Grush, W.H.; Jones, K.R.

    1995-01-01

    This report contains the installation instructions for the Nuclear Plant Analyzer (NPA) System. The NPA System consists of the Computer Visual System (CVS) program, the NPA libraries, and the associated utility programs. The NPA was developed at the Idaho National Engineering Laboratory under the sponsorship of the US Nuclear Regulatory Commission to provide a highly flexible graphical user interface for displaying the results of these analysis codes. The NPA also provides the user with a convenient means of interactively controlling the host program through user-defined pop-up menus. The NPA was designed to serve primarily as an analysis tool. After a brief introduction to the Computer Visual System and the NPA, an analyst can quickly create a simple picture or set of pictures to aid in the study of a particular phenomenon. These pictures can range from simple collections of square boxes and straight lines to complex representations of emergency response information displays.

  7. Method and apparatus for analyzing ionizable materials

    International Nuclear Information System (INIS)

    Ehrlich, B.J.; Hall, R.C.; Thiede, P.W.

    1979-01-01

    An apparatus and method are described for analyzing a solution of ionizable compounds in a liquid. The solution is irradiated with electromagnetic radiation to ionize the compounds and the electrical conductivity of the solution is measured. The radiation may be X-rays, ultra-violet, infra-red or microwaves. The solution may be split into two streams, only one of which is irradiated, the other being used as a reference by comparing conductivities of the two streams. The liquid must be nonionizable and is preferably a polar solvent. The invention provides an analysis technique useful in liquid chromatography and in gas chromatography after dissolving the eluted gases in a suitable solvent. Electrical conductivity measurements performed on the irradiated eluent provide a quantitative indication of the ionizable materials existing within the eluent stream and a qualitative indication of the purity of the eluent stream. (author)

  8. Analyzing Options for Airborne Emergency Wireless Communications

    Energy Technology Data Exchange (ETDEWEB)

    Michael Schmitt; Juan Deaton; Curt Papke; Shane Cherry

    2008-03-01

    In the event of large-scale natural or manmade catastrophic events, access to reliable and enduring commercial communication systems is critical. Hurricane Katrina provided a recent example of the need to ensure communications during a national emergency. To ensure that communication demands are met during these critical times, Idaho National Laboratory (INL), under the guidance of United States Strategic Command, has studied infrastructure issues, concerns, and vulnerabilities associated with an airborne wireless communications capability. Such a capability could provide emergency wireless communications until public/commercial nodes can be systematically restored. This report focuses on the airborne cellular restoration concept: it analyzes basic infrastructure requirements; identifies related infrastructure issues, concerns, and vulnerabilities; and offers recommended solutions.

  9. Analyzing and forecasting the European social climate

    Directory of Open Access Journals (Sweden)

    Liliana DUGULEANĂ

    2015-06-01

    The paper uses the results of the Eurobarometer sample survey, which has been requested by the European Commission. The social climate index is used to measure the level of perception of the population, taking into account their personal situation and their perspective at the national level. The paper analyzes the evolution of the social climate indices for the countries of the European Union and offers information about the expectations of the populations of the analyzed countries. The obtained results can be compared with the Eurobarometer forecasts, on a short term of one year and a medium term of five years. Modelling the social climate index and its influence factors offers useful information about the efficiency of social protection and inclusion policies.

  10. Analyzing Demand: Hegemonic Masculinity and Feminine Prostitution

    Directory of Open Access Journals (Sweden)

    Beatriz Ranea Triviño

    2016-12-01

    In this article we present an exploratory study analyzing the relationship between the construction of hegemonic masculinity and the consumption of female prostitution. We focus on the experiences, attitudes and perceptions of young heterosexual men who have ever paid for sex. Using a qualitative method of analysis, we conducted six semi-structured interviews with men between 18 and 35 years old. The analysis of the interviews reveals differences in demographic characteristics, frequency of payment for sexual services, motivations, spaces where prostitutes are sought, and opinions on prostitution and prostitutes. The main conclusions of this study are that the discourses of the interviewees reproduce gender stereotypes and gendered sexual roles, and it is suggested that prostitution can be interpreted as a scenario where these men perform their hegemonic masculinity.

  11. Using wavelet features for analyzing gamma lines

    International Nuclear Information System (INIS)

    Medhat, M.E.; Abdel-hafiez, A.; Hassan, M.F.; Ali, M.A.; Uzhinskii, V.V.

    2004-01-01

    Data processing methods for analyzing gamma-ray spectra with symmetric bell-shaped peaks are considered. In many cases the peak shape is a symmetric bell curve; in particular, the Gaussian form is used most often, for sound physical reasons. The problem is how to evaluate the parameters of such peaks, i.e. their positions, amplitudes and half-widths, both for single peaks and for overlapping peaks. Using the Marr wavelet (Mexican hat) as a correlation kernel, it is possible to estimate the optimal wavelet parameters and to locate peaks in the spectrum. A comparison of the proposed method with others shows the better quality of the wavelet-transform method.
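    The Mexican-hat correlation described above can be sketched on a synthetic two-peak spectrum; this is a minimal illustration, not the authors' implementation. The kernel follows the standard Ricker normalization, and the peak positions, widths, and threshold are invented:

```python
import numpy as np

def ricker(points, a):
    """Marr ('Mexican hat') wavelet with the standard Ricker normalization."""
    t = np.arange(points) - (points - 1) / 2.0
    amp = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return amp * (1.0 - (t / a) ** 2) * np.exp(-(t ** 2) / (2.0 * a ** 2))

def wavelet_peaks(spectrum, width, threshold):
    """Correlate with a Mexican-hat kernel (symmetric, so plain convolution
    equals correlation) and return local maxima of the response above a
    threshold; the zero-mean kernel suppresses the flat background."""
    kernel = ricker(points=10 * width + 1, a=width)
    response = np.convolve(spectrum, kernel, mode="same")
    return [i for i in range(1, len(response) - 1)
            if response[i - 1] < response[i] >= response[i + 1]
            and response[i] > threshold]

# Synthetic gamma spectrum: two Gaussian peaks on a flat background + noise
rng = np.random.default_rng(0)
x = np.arange(512)
spectrum = (50 * np.exp(-((x - 150) ** 2) / (2 * 4.0 ** 2))
            + 30 * np.exp(-((x - 300) ** 2) / (2 * 5.0 ** 2))
            + 10 + rng.normal(0, 1, x.size))
peaks = wavelet_peaks(spectrum, width=4, threshold=20)
print(peaks)  # indices close to the true peak positions 150 and 300
```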

  12. Analyzer of neutron flux in real time

    International Nuclear Information System (INIS)

    Rojas S, A.S.; Carrillo M, R.A.; Balderas, E.G.

    1999-01-01

    Based on the study of real neutron-flux signals from instability events that occurred at the Laguna Verde nuclear power plant, where the core oscillation phenomena of the reactor lie in the 0 to 2.5 Hz range, the possibility has been identified of developing surveillance and diagnostic equipment capable of analyzing the behavior of the core in this frequency range in real time. An important method for monitoring the stability of the reactor core is the power spectral density, which allows the frequencies and amplitudes contained in the signals to be determined. The instrument is implemented in LabVIEW graphical programming with a 16-channel data acquisition card and runs under the Windows 95/98 environment. (Author)
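    The power-spectral-density surveillance idea above can be sketched with Welch's method on a synthetic signal; the 1.2 Hz oscillation frequency, sampling rate, and amplitudes below are hypothetical, chosen only to fall inside the 0--2.5 Hz band of interest:

```python
import numpy as np
from scipy.signal import welch

# Synthetic "neutron flux" signal: a 1.2 Hz core oscillation plus noise,
# sampled at 25 Hz for 120 s (all values hypothetical)
fs = 25.0
t = np.arange(0, 120, 1 / fs)
rng = np.random.default_rng(1)
signal = 0.8 * np.sin(2 * np.pi * 1.2 * t) + rng.normal(0, 0.5, t.size)

# Power spectral density via Welch's method of averaged periodograms
f, psd = welch(signal, fs=fs, nperseg=1024)

# Dominant frequency within the 0--2.5 Hz surveillance band
band = (f > 0) & (f <= 2.5)
dominant = f[band][np.argmax(psd[band])]
print(f"dominant oscillation: {dominant:.2f} Hz")
```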

  13. Nuclear plant analyzer development and analysis applications

    International Nuclear Information System (INIS)

    Laats, E.T.

    1984-10-01

    The Nuclear Plant Analyzer (NPA) is being developed as the US Nuclear Regulatory Commission's (NRC's) state-of-the-art safety analysis and engineering tool to address key nuclear plant safety issues. This paper describes four applications of the NPA in assisting reactor safety analyses. Two analyses evaluated reactor operating procedures during off-normal operation for a pressurized water reactor (PWR) and a boiling water reactor (BWR), respectively. The third analysis was performed in support of a reactor safety experiment conducted in the Semiscale facility. The final application demonstrated the usefulness of atmospheric dispersion computer codes for site emergency planning purposes. An overview of the NPA and how it supported these analyses are the topics of this paper.

  14. Analyzing Strategic Business Rules through Simulation Modeling

    Science.gov (United States)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility, since it allows business processes to change to meet new customer demands or market needs without causing a cascade of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT collaborate. In this paper, we propose the use of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal aims to help find a good configuration for strategic business objectives and IT parameters. The paper includes a case study where a simulation model is built to support business decision-making in a context where finding a good configuration for different business parameters and performance is too complex to analyze by trial and error.

  15. Nuclear plant analyzer development at INEL

    International Nuclear Information System (INIS)

    Laats, E.T.; Russell, K.D.; Stewart, H.D.

    1983-01-01

    The Office of Nuclear Regulatory Research of the US Nuclear Regulatory Commission (NRC) has sponsored development of a software-hardware system called the Nuclear Plant Analyzer (NPA). This paper describes the status of the NPA project at the INEL after one year of development. When completed, the NPA will be an integrated network of analytical tools for performing reactor plant analyses. Development of the NPA in FY-1983 progressed along two parallel pathways: conceptual planning and software development. Regarding NPA planning, an extensive effort was conducted to define the functional requirements of the NPA, the conceptual design, and the hardware needs. Regarding the software development conducted in FY-1983, all development was aimed at demonstrating the basic concept and feasibility of the NPA. Nearly all software was developed and resides on the INEL twin Control Data Corporation 176 mainframe computers.

  16. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    as the software development and distribution by a set of actors dependent on each other and the ecosystem. We commence on the hypothesis that the establishment of a software ecosystem on the telemedicine services of Denmark would address these issues and investigate how a software ecosystem can foster the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) strong social network of actors, (iv) robust business...

  17. Structural factoring approach for analyzing stochastic networks

    Science.gov (United States)

    Hayhurst, Kelly J.; Shier, Douglas R.

    1991-01-01

    The problem of finding the distribution of the shortest path length through a stochastic network is investigated. A general algorithm for determining the exact distribution of the shortest path length is developed based on the concept of conditional factoring, in which a directed, stochastic network is decomposed into an equivalent set of smaller, generally less complex subnetworks. Several network constructs are identified and exploited to significantly reduce the computational effort required to solve a network problem relative to complete enumeration. This algorithm can be applied to two important classes of stochastic path problems: determining the critical path distribution for acyclic networks and the exact two-terminal reliability for probabilistic networks. Computational experience with the algorithm was encouraging and allowed the exact solution of networks that had previously been analyzed only by approximation techniques.
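    Complete enumeration, the baseline against which the factoring algorithm is compared above, can be sketched on a toy stochastic network; the network, the edge-length distributions, and the path list below are invented for illustration:

```python
import itertools
from collections import Counter
from fractions import Fraction

# Toy stochastic network: each edge has a discrete length distribution,
# written as {length: probability}; all numbers are invented.
edges = {
    ("s", "a"): {1: Fraction(1, 2), 3: Fraction(1, 2)},
    ("s", "b"): {2: Fraction(1, 1)},
    ("a", "t"): {2: Fraction(1, 2), 4: Fraction(1, 2)},
    ("b", "t"): {1: Fraction(1, 2), 5: Fraction(1, 2)},
}
paths = [(("s", "a"), ("a", "t")),  # the two simple s-t paths
         (("s", "b"), ("b", "t"))]

names = list(edges)
dist = Counter()
# Complete enumeration: one term per joint realization of all edge lengths
for outcome in itertools.product(*(edges[e].items() for e in names)):
    state = dict(zip(names, outcome))  # edge -> (length, probability)
    prob = 1
    for length, p in outcome:
        prob *= p
    shortest = min(sum(state[e][0] for e in path) for path in paths)
    dist[shortest] += prob

for length in sorted(dist):
    print(length, dist[length])  # P(3)=5/8, P(5)=1/4, P(7)=1/8
```

The factoring algorithm described in the abstract avoids this exponential sweep by conditioning on one edge at a time and simplifying the resulting subnetworks, but the enumerated distribution is the exact answer both approaches must reproduce.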

  18. RELAP5 nuclear plant analyzer capabilities

    International Nuclear Information System (INIS)

    Wagner, R.J.; Ransom, V.H.

    1982-01-01

    An interactive execution capability has been developed for the RELAP5 code which permits it to be used as a Nuclear Plant Analyzer. This capability has been demonstrated using a simplified primary- and secondary-loop model of a PWR. A variety of loss-of-feedwater accidents have been simulated using this model. The computer execution time on a CDC Cyber 176 is one half of the transient simulation time, so the results can be displayed in real time. The results of the demonstration problems are displayed in digital form on a color schematic of the plant model using a Tektronix 4027 CRT terminal. The interactive feature allows the user to enter commands in much the same manner as a reactor operator.

  19. Statistical network analysis for analyzing policy networks

    DEFF Research Database (Denmark)

    Robins, Garry; Lewis, Jenny; Wang, Peng

    2012-01-01

    To analyze social network data using standard statistical approaches is to risk incorrect inference. The dependencies among observations implied in a network conceptualization undermine standard assumptions of the usual general linear models. One of the most quickly expanding areas of social and policy network methodology is the development of statistical modeling approaches that can accommodate such dependent data. In this article, we review three network statistical methods commonly used in the current literature: quadratic assignment procedures, exponential random graph models (ERGMs), and stochastic actor-oriented models. We focus most attention on ERGMs by providing an illustrative example of a model for a strategic information network within a local government. We draw inferences about the structural role played by individuals recognized as key innovators and conclude that such an approach...

  20. Diffractive interference optical analyzer (DiOPTER)

    Science.gov (United States)

    Sasikumar, Harish; Prasad, Vishnu; Pal, Parama; Varma, Manoj M.

    2016-03-01

    This report demonstrates a method for high-resolution refractometric measurements using what we have termed a Diffractive Interference Optical Analyzer (DiOpter). The setup consists of a laser, a polarizer, a transparent diffraction grating, and Si photodetectors. The sensor is based on the differential response of diffracted orders to bulk refractive index changes. In these setups, the differential read-out of the diffracted orders suppresses signal drifts and enables time-resolved determination of refractive index changes in the sample cell. A remarkable feature of this device is that under appropriate conditions, the measurement sensitivity of the sensor can be enhanced by more than two orders of magnitude due to interference between multiply reflected diffracted orders. A noise-equivalent limit of detection (LoD) of 6x10^-7 RIU was achieved in glass. This work focuses on devices with an integrated sample well, made on low-cost PDMS. As the detection methodology is experimentally straightforward, it can be used across a wide array of applications, ranging from detecting changes in surface adsorbates via binding reactions to estimating refractive index (and hence concentration) variations in bulk samples. An exciting prospect of this technique is the potential integration of this device with smartphones using a simple interface based on a transmission-mode configuration. In a transmission configuration, we were able to achieve an LoD of 4x10^-4 RIU, which is sufficient for several applications in food quality testing and related fields. We envision the future of this platform as a personal handheld optical analyzer for applications ranging from environmental sensing to healthcare and quality testing of food products.

  1. Fuel Cell and Hydrogen Technologies Program | Hydrogen and Fuel Cells |

    Science.gov (United States)

    Through its Fuel Cell and Hydrogen Technologies Program, NREL researches, develops, analyzes, and validates fuel cell and hydrogen production, delivery, and storage technologies for transportation.

  2. Educational and Scientific Applications of Climate Model Diagnostic Analyzer

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Zhang, J.; Bao, Q.

    2016-12-01

    Climate Model Diagnostic Analyzer (CMDA) is a web-based information system designed for the climate modeling and model analysis community to analyze climate data from models and observations. CMDA provides tools to diagnostically analyze climate data for model validation and improvement, and to systematically manage analysis provenance for sharing results with other investigators. CMDA utilizes cloud computing resources, multi-threading computing, machine-learning algorithms, web service technologies, and provenance-supporting technologies to address technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. As CMDA infrastructure and technology have matured, we have developed educational and scientific applications of CMDA. Educationally, CMDA has supported the summer school of the JPL Center for Climate Sciences for three years, beginning in 2014. In the summer school, the students work on group research projects for which CMDA provides datasets and analysis tools. Each student is assigned a virtual machine with CMDA installed in Amazon Web Services. A provenance management system for CMDA has been developed to keep track of students' usage of CMDA, and to recommend datasets and analysis tools for their research topics. The provenance system also allows students to revisit their analysis results and share them with their group. Scientifically, we have developed several science use cases of CMDA covering various topics, datasets, and analysis types. Each use case developed is described and listed in terms of a scientific goal, the datasets used, the analysis tools used, the scientific results discovered from the use case, an analysis result such as output plots and data files, and a link to the exact analysis service call with all the input arguments filled. For example, one science use case is the evaluation of the NCAR CAM5 model with MODIS total cloud fraction. 
The analysis service used is Difference Plot Service of

  3. Relativistic effects in the calibration of electrostatic electron analyzers. I. Toroidal analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Keski Rahkonen, O [Helsinki University of Technology, Espoo (Finland). Laboratory of Physics]; Krause, M O [Oak Ridge National Lab., Tenn. (USA)]

    1978-02-01

    Relativistic correction terms up to the second order are derived for the kinetic energy of an electron travelling along the circular central trajectory of a toroidal analyzer. Furthermore, a practical energy calibration equation for the spherical sector plate analyzer is written for the variable-plate-voltage recording mode. Accurate measurements with a spherical analyzer, performed at kinetic energies from 600 to 2100 eV, are in good agreement with this theory, showing that our approximation (neglect of fringing fields, and of source and detector geometry) is realistic enough for actual calibration purposes.

  4. Plasma diagnostics with a retarding potential analyzer

    International Nuclear Information System (INIS)

    Jack, T.M.

    1996-01-01

    The plasma rocket is located at NASA Johnson Space Center. To produce thrust in space, an inert gas is ionized into a plasma and heated in the linear section of a tokamak fusion device. The magnetic field used to contain the plasma has a magnitude of 2--10 kGauss. The plasma plume has a variable thrust and specific impulse. A high-temperature retarding potential analyzer (RPA) is being developed to characterize the plasma in the plume and at the edge of the magnetically contained plasma. The RPA measures the energy and density of ions or electrons entering its solid angle of collection. An oscilloscope displays the ion flux versus the collected current. All measurements are made relative to the facility ground. Testing of this device involves determining its output parameters, sensitivity, and responses over a wide range of energies and densities. Each grid will be tested individually by changing only its voltage and observing the output from the RPA. To verify that the RPA provides proper output, it is compared to the output from a Langmuir or Faraday probe.
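    The energy measurement described above rests on a standard RPA relation: the negative derivative of the collector current with respect to retarding voltage is proportional to the ion energy distribution. A minimal sketch with a synthetic Gaussian ion beam; the 40 eV beam energy, 5 eV spread, and sweep range are all hypothetical:

```python
import numpy as np

# Retarding-voltage sweep and a hypothetical Gaussian ion energy
# distribution centered at 40 eV with a 5 eV spread (singly charged ions)
v = np.linspace(0.0, 100.0, 1001)
f_true = np.exp(-((v - 40.0) ** 2) / (2 * 5.0 ** 2))

# Collector current: only ions with energy above the retarding voltage
# arrive, so I(V) is the tail integral of the energy distribution
dv = v[1] - v[0]
current = np.cumsum(f_true[::-1])[::-1] * dv

# Recover the energy distribution as -dI/dV and locate the beam energy
recovered = -np.gradient(current, v)
peak_ev = v[np.argmax(recovered)]
print(f"peak ion energy ~ {peak_ev:.1f} eV")
```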

  5. Analyzing Spatiotemporal Anomalies through Interactive Visualization

    Directory of Open Access Journals (Sweden)

    Tao Zhang

    2014-06-01

    Full Text Available As we move into the big data era, data grows not just in size but also in complexity, containing a rich set of attributes, including location and time information; examples include data from mobile devices (e.g., smart phones), natural disasters (e.g., earthquakes and hurricanes), epidemic spread, etc. Motivated by this rising challenge, we built a visualization tool for exploring generic spatiotemporal data, i.e., records containing time and location information and numeric attribute values. Since the values often evolve over time and across geographic regions, we are particularly interested in detecting and analyzing anomalous changes over time and space. Our analytic tool is based on a geographic information system and is combined with spatiotemporal data mining algorithms, as well as various data visualization techniques, such as anomaly grids and anomaly bars superimposed on the map. We study how effectively the tool may guide users to find potential anomalies by demonstrating and evaluating it on publicly available spatiotemporal datasets. The tool for spatiotemporal anomaly analysis and visualization is useful in many domains, such as security investigation and monitoring, situation awareness, etc.
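The anomaly-grid idea can be sketched with a simple z-score rule (our own minimal illustration; the record does not specify the tool's actual mining algorithms): bin records into a spatial grid per time step and flag cells whose latest value deviates strongly from that cell's own history.

```python
import numpy as np


def anomalous_cells(cube, threshold=3.0):
    """cube: array of shape (time, rows, cols). Returns a boolean mask of
    anomalies in the final time step, judged against the earlier steps."""
    history, current = cube[:-1], cube[-1]
    mu = history.mean(axis=0)
    sigma = history.std(axis=0) + 1e-9      # guard against zero variance
    z = (current - mu) / sigma
    return np.abs(z) > threshold


rng = np.random.default_rng(0)
cube = rng.normal(10.0, 1.0, size=(30, 4, 4))  # 30 time steps, 4x4 grid
cube[-1, 2, 3] = 25.0                          # inject a spatial anomaly
mask = anomalous_cells(cube)
print("cell (2,3) flagged:", bool(mask[2, 3]))
```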

  6. Alternative approach to analyzing occupational mortality data

    International Nuclear Information System (INIS)

    Gilbert, E.S.; Buchanan, J.A.

    1984-01-01

    It is widely recognized that analyzing occupational mortality by calculating standardized mortality ratios based on death rates from the general population is subject to a number of limitations. An alternative approach described in this report takes advantage of the fact that comparisons of mortality by subgroups and assessments of trends in mortality are often of equal or greater interest than overall assessments, and that such comparisons do not require an external control. A computer program MOX (Mortality and Occupational Exposure) is available for performing the needed calculations for several diseases. MOX was written to assess the effect of radiation exposure on Hanford nuclear workers. For this application, analyses have been based on cumulative exposure computed (by MOX) from annual records of radiation exposure obtained from personal dosimeter readings. This program provides tests for differences and trends among subcategories defined by variables such as length of employment, job category, or exposure measurements and also provides control for age, calendar year, and several other potentially confounding variables. 29 references, 2 tables
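The internal-comparison idea can be illustrated with a Cochran-Armitage-style trend test (a simplified stand-in, not MOX itself; the actual program also stratifies on age, calendar year, and other confounders, and the counts below are made up):

```python
import math


def trend_test(deaths, totals, scores):
    """Cochran-Armitage-style z-statistic for a trend in mortality across
    exposure categories, using only the cohort's own subgroups.
    deaths[i] deaths among totals[i] workers in category i, scored scores[i]."""
    N = sum(totals)
    D = sum(deaths)
    p = D / N  # overall death proportion, the internal "expected" rate
    num = sum(s * (d - t * p) for s, d, t in zip(scores, deaths, totals))
    s_bar = sum(s * t for s, t in zip(scores, totals)) / N
    var = p * (1 - p) * sum(t * (s - s_bar) ** 2 for s, t in zip(scores, totals))
    return num / math.sqrt(var)


# Hypothetical cohort: mortality rising with cumulative exposure category
deaths = [10, 15, 22, 30]
totals = [1000, 1000, 1000, 1000]
scores = [0, 1, 2, 3]
z = trend_test(deaths, totals, scores)
print(f"trend z = {z:.2f}")
```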

  7. Analyzing Design Heating Loads in Superinsulated Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Arena, Lois [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)

    2015-06-16

    The U.S. Department of Energy’s Building America research team Consortium for Advanced Residential Buildings (CARB) worked with the EcoVillage cohousing community in Ithaca, New York, on the Third Residential EcoVillage Experience neighborhood. This community-scale project consists of 40 housing units: 15 apartments and 25 single-family residences. Units range in size from 450 ft2 to 1,664 ft2 and cost from $80,000 for a studio apartment to $235,000 for a three- or four-bedroom single-family home. For the research component of this project, CARB analyzed current heating system sizing methods for superinsulated homes in cold climates to determine if changes in building load calculation methodology should be recommended. Actual heating energy use was monitored and compared to results from the Air Conditioning Contractors of America’s Manual J8 (MJ8) and the Passive House Planning Package software. Results from that research indicate that MJ8 significantly oversizes heating systems for superinsulated homes and that thermal inertia and internal gains should be considered for more accurate load calculations.

  8. Update on the USNRC's nuclear plant analyzer

    International Nuclear Information System (INIS)

    Laats, E.T.

    1987-01-01

    The Nuclear Plant Analyzer (NPA) is the U.S. Nuclear Regulatory Commission's (NRC's) state-of-the-art nuclear reactor simulation capability. This computer software package integrates high-fidelity nuclear reactor simulation codes, such as the TRAC and RELAP5 series of codes, with color graphics display techniques and advanced workstation hardware. An overview of this program was given at the 1984 Summer Computer Simulation Conference (SCSC), with selected topics discussed at the 1985 and 1986 SCSCs. This paper addresses these activities and related experiences. First, the Class VI computer implementation is discussed. The trade-off between gaining significantly greater computational speed and central memory and losing performance to many more simultaneous users is shown. Second, the goal of the super-minicomputer implementation is to produce a very cost-effective system that utilizes advanced (multi-dimensional, two-phase coolant) simulation capabilities at real wall-clock simulation times. Benchmarking of the initial super-minicomputer implementation is discussed. Finally, the technical and economic feasibility is addressed for implementing the super-minicomputer version of the NPA with the RELAP5 simulation code onto the Black Fox full-scope nuclear power plant simulator.

  9. Analyzing the Pension System of the USSR

    Directory of Open Access Journals (Sweden)

    Aleksei V. Pudovkin

    2015-01-01

    Full Text Available The article under the title "Analysis of the Pension System of the USSR" deals with numerous aspects of the development of the pension system of the former USSR. Since improvement of the Russian pension system is presently high on the agenda, the author believes that analyzing the country's own historical experience is essential in order to create a sound and efficient pension system in Russia. The study presented in the article executes an in-depth analysis of legislation on the Soviet pension system with the view to recreating the architecture of the pension system of the USSR. In addition, the study draws on the official statistics for the period in question to reach a qualified and well-founded conclusion on the efficiency of the Soviet pension system. The evolution of the pension system, traced through statistical data, demonstrates that efficiency. It is highly recommended that the positive aspects of the Soviet pension system be taken into consideration when reforming the current pension system of the Russian Federation.

  10. Analyzing the Existing Undergraduate Engineering Leadership Skills

    Directory of Open Access Journals (Sweden)

    Hamed M. Almalki

    2016-12-01

    Full Text Available Purpose: To study and analyze undergraduate engineering students' leadership skills in order to discover their potential leadership strengths and weaknesses. This study unveils potential ways to enhance how we teach engineering leadership, and its insights might assist engineering programs in improving curricula for the purpose of better preparing engineers to meet industry's demands. Methodology and Findings: 441 undergraduate engineering students were surveyed in two undergraduate engineering programs to assess their leadership skills. The results in both programs reveal that undergraduate engineering students lag behind in visionary leadership skills compared to the directing, including, and cultivating leadership styles. Recommendation: A practical framework is proposed to strengthen the lacking leadership skills by utilizing the Matrix of Change (MOC) and the Balanced Scorecard (BSC) to capture the best leadership scenarios and design a virtual simulation environment targeting the lacking skills, in this case visionary leadership. The virtual simulation will then be used to provide experiential learning by replacing human beings with avatars that can be managed or dramatized by real people, enabling the creation of live, practical, measurable, and customizable leadership development programs.

  11. PSAIA – Protein Structure and Interaction Analyzer

    Directory of Open Access Journals (Sweden)

    Vlahoviček Kristian

    2008-04-01

    Full Text Available Abstract Background PSAIA (Protein Structure and Interaction Analyzer was developed to compute geometric parameters for large sets of protein structures in order to predict and investigate protein-protein interaction sites. Results In addition to most relevant established algorithms, PSAIA offers a new method PIADA (Protein Interaction Atom Distance Algorithm for the determination of residue interaction pairs. We found that PIADA produced more satisfactory results than comparable algorithms implemented in PSAIA. Particular advantages of PSAIA include its capacity to combine different methods to detect the locations and types of interactions between residues and its ability, without any further automation steps, to handle large numbers of protein structures and complexes. Generally, the integration of a variety of methods enables PSAIA to offer easier automation of analysis and greater reliability of results. PSAIA can be used either via a graphical user interface or from the command-line. Results are generated in either tabular or XML format. Conclusion In a straightforward fashion and for large sets of protein structures, PSAIA enables the calculation of protein geometric parameters and the determination of location and type for protein-protein interaction sites. XML formatted output enables easy conversion of results to various formats suitable for statistic analysis. Results from smaller data sets demonstrated the influence of geometry on protein interaction sites. Comprehensive analysis of properties of large data sets leads to new information useful in the prediction of protein-protein interaction sites.

  12. A Methodology to Analyze Photovoltaic Tracker Uptime

    Energy Technology Data Exchange (ETDEWEB)

    Muller, Matthew T [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ruth, Dan [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-04-17

    A metric is developed to analyze the daily performance of single-axis photovoltaic (PV) trackers. The metric relies on comparing correlations between the daily time series of the PV power output and an array of simulated plane-of-array irradiances for the given day. Mathematical thresholds and a logic sequence are presented, so the daily tracking metric can be applied in an automated fashion on large-scale PV systems. The results of applying the metric are visually examined against the time series of the power output data for a large number of days and for various systems. The visual inspection results suggest that overall, the algorithm is accurate in identifying stuck or functioning trackers on clear-sky days. Visual inspection also shows that there are days that are not classified by the metric where the power output data may be sufficient to identify a stuck tracker. Based on the daily tracking metric, uptime results are calculated for 83 different inverters at 34 PV sites. The mean tracker uptime is calculated at 99% based on 2 different calculation methods. The daily tracking metric clearly has limitations, but as there are no existing metrics in the literature, it provides a valuable tool for flagging stuck trackers.
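The core of the daily metric can be sketched as a correlation contest (simplified; the paper's mathematical thresholds and logic sequence are not reproduced here, and the irradiance profiles below are synthetic): correlate a day's measured power with simulated plane-of-array profiles for a functioning tracker versus a stuck array, and classify the day by whichever correlates better.

```python
import numpy as np


def classify_day(power, poa_tracking, poa_stuck):
    """Label a day 'tracking' or 'stuck' by which simulated plane-of-array
    irradiance profile best correlates with the measured power."""
    r_track = np.corrcoef(power, poa_tracking)[0, 1]
    r_stuck = np.corrcoef(power, poa_stuck)[0, 1]
    return "tracking" if r_track > r_stuck else "stuck"


hours = np.linspace(6, 18, 49)
# A tracking array yields a flat-topped daily profile; a stuck (fixed-tilt-
# like) array yields a bell-shaped one. Both are synthetic stand-ins.
poa_tracking = np.clip(1.0 - 0.3 * np.abs(hours - 12) / 6, 0, None) ** 0.5
poa_stuck = np.exp(-((hours - 12) ** 2) / 8.0)

# Measured power that follows the tracking profile, plus small noise
power = 5.0 * poa_tracking + np.random.default_rng(1).normal(0, 0.01, hours.size)
print(classify_day(power, poa_tracking, poa_stuck))
```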

  13. Nuclear plant analyzer development and analysis applications

    International Nuclear Information System (INIS)

    Laats, E.T.

    1984-01-01

    The Nuclear Plant Analyzer (NPA) is being developed as the U.S. Nuclear Regulatory Commission's (NRC's) state-of-the-art safety analysis and engineering tool to address key nuclear plant safety issues. The NPA integrates the NRC's computerized reactor behavior simulation codes, such as RELAP5 and TRAC-BWR, with well-developed computer graphics programs and large repositories of reactor design and experimental data. Utilizing the complex reactor behavior codes as well as the experiment data repositories enables simulation applications of the NPA that are generally not possible with more simplistic, less mechanistic reactor behavior codes. These latter codes are used in training simulators or with other NPA-type software packages and are limited to displaying calculated data only. This paper describes four applications of the NPA in assisting reactor safety analyses. Two analyses evaluated reactor operating procedures during off-normal operation for a pressurized water reactor (PWR) and a boiling water reactor (BWR), respectively. The third analysis was performed in support of a reactor safety experiment conducted in the Semiscale facility. The final application demonstrated the usefulness of atmospheric dispersion computer codes for site emergency planning purposes. An overview of the NPA and how it supported these analyses are the topics of this paper.

  14. Development of a PDA Based Portable Pulse Height Analyzer System

    International Nuclear Information System (INIS)

    Mankheed, Panuphong; Ngernvijit, Narippawaj; Thong-Aram, Decho

    2007-08-01

    Full text: In this research a portable pulse height analyzer system was developed by combining a personal digital assistant (PDA), a Palm Tungsten T model, with the Single Chip SCA developed by the Department of Nuclear Technology, Chulalongkorn University, to be used for education and research work. The developed system could measure both the energy and the average count rate of gamma rays. The results of this research showed that the gamma energy spectrum analysis of the developed system with a 2" x 2" NaI(Tl) detector could display the photopeaks of Cs-137 and Co-60 at channel 57, and channels 103 and 117, respectively. The energy resolution was found to be 7.14% at the 661.66 keV energy of Cs-137.

  15. Discourses of Technology

    DEFF Research Database (Denmark)

    Sommer, Jannek K.; Knudsen, Gry Høngsmark

    In this poster we address consumption of technology from the perspective of failure. A large body of studies of consumption of technology have focused on consumer acceptance (Kozinets, 2008). These studies have identified particular narratives about social and economic progress, and pleasure...... (Kozinets, 2008) as drivers of consumer acceptance of new technology. Similarly, Giesler (2008) has conceptualized consumer acceptance of technology as a form of marketplace drama, in which market ideologies are negotiated between consumers and media discourses. We suggest to study discourses around failed...... technology products to explore the negotiation of the familiar and alien that makes consumers reject or embrace a new technology. Thus, this particular project sets out to analyze consumer discourses surrounding the Google Glass video “How it Feels [through Google Glass]” on YouTube, because we want...

  16. Analyzing wildfire exposure on Sardinia, Italy

    Science.gov (United States)

    Salis, Michele; Ager, Alan A.; Arca, Bachisio; Finney, Mark A.; Alcasena, Fermin; Bacciu, Valentina; Duce, Pierpaolo; Munoz Lozano, Olga; Spano, Donatella

    2014-05-01

    We used simulation modeling based on the minimum travel time algorithm (MTT) to analyze wildfire exposure of key ecological, social and economic features on Sardinia, Italy. Sardinia is the second largest island of the Mediterranean Basin, and in the last fifty years experienced large and dramatic wildfires, which caused losses and threatened urban interfaces, forests and natural areas, and agricultural production. Historical fires and environmental data for the period 1995-2009 were used as input to estimate fine scale burn probability, conditional flame length, and potential fire size in the study area. For this purpose, we simulated 100,000 wildfire events within the study area, randomly drawing from the observed frequency distribution of burn periods and wind directions for each fire. Estimates of burn probability, excluding non-burnable fuels, ranged from 0 to 1.92 x 10^-3, with a mean value of 6.48 x 10^-5. Overall, the outputs provided a quantitative assessment of wildfire exposure at the landscape scale and captured landscape properties of wildfire exposure. We then examined how the exposure profiles varied among and within selected features and assets located on the island. Spatial variation in modeled outputs resulted in a strong effect of fuel models, coupled with slope and weather. In particular, the combined effect of Mediterranean maquis, woodland areas and complex topography on flame length was relevant, mainly in north-east Sardinia, whereas areas with herbaceous fuels and flat areas were in general characterized by lower fire intensity but higher burn probability. The simulation modeling proposed in this work provides a quantitative approach to inform wildfire risk management activities, and represents one of the first applications of burn probability modeling to capture fire risk and exposure profiles in the Mediterranean basin.
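The burn-probability estimate itself is conceptually simple even though the fire-spread simulator is not (toy sketch; the MTT algorithm is not reproduced here and the landscape below is made up): with N simulated fire events, a pixel's burn probability is the fraction of simulations in which it burned.

```python
import numpy as np


def burn_probability(burn_maps):
    """burn_maps: boolean array of shape (n_simulations, rows, cols).
    Returns per-pixel burn probability as the fraction of fires that
    reached each pixel."""
    return burn_maps.mean(axis=0)


rng = np.random.default_rng(42)
n_sims = 10000
# Hypothetical 2x2 landscape whose pixels burn with differing likelihoods;
# a real study would derive burn_maps from fire-spread simulations.
p_true = np.array([[0.0, 0.001],
                   [0.01, 0.05]])
burn_maps = rng.random((n_sims, 2, 2)) < p_true
bp = burn_probability(burn_maps)
print(bp)
```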

  17. Modeling and Analyzing Academic Researcher Behavior

    Directory of Open Access Journals (Sweden)

    Phuc Huu Nguyen

    2016-12-01

    Full Text Available Abstract. This paper suggests a theoretical framework for analyzing the mechanism of the behavior of academic researchers whose interests are tangled and vary widely in academic factors (the intrinsic satisfaction in conducting research, the improvement in individual research ability, etc. or non-academic factors (career rewards, financial rewards, etc.. Furthermore, each researcher also has his/her different academic stances in their preferences about academic freedom and academic entrepreneurship. Understanding the behavior of academic researchers will contribute to nurture young researchers, to improve the standard of research and education as well as to boost collaboration in academia-industry. In particular, as open innovation is increasingly in need of the involvement of university researchers, to establish a successful approach to entice researchers into enterprises’ research, companies must comprehend the behavior of university researchers who have multiple complex motivations. The paper explores academic researchers' behaviors through optimizing their utility functions, i.e. the satisfaction obtained by their research outputs. This paper characterizes these outputs as the results of researchers' 3C: Competence (the ability to implement the research, Commitment (the effort to do the research, and Contribution (finding meaning in the research. Most of the previous research utilized the empirical methods to study researcher's motivation. Without adopting economic theory into the analysis, the past literature could not offer a deeper understanding of researcher's behavior. Our contribution is important both conceptually and practically because it provides the first theoretical framework to study the mechanism of researcher's behavior. Keywords: Academia-Industry, researcher behavior, ulrich model’s 3C.

  18. Nuclear plant analyzer program for Bulgaria

    International Nuclear Information System (INIS)

    Shier, W.; Kennett, R.

    1993-01-01

    An interactive nuclear plant analyzer (NPA) has been developed for use by the Bulgarian technical community in the training of plant personnel, the development and verification of plant operating procedures, and in the analysis of various anticipated operational occurrences and accident scenarios. The current NPA includes models for a VVER-440 Model 230 and a VVER-1000 Model 320 and is operational on an IBM RISC6000 workstation. The RELAP5/MOD2 computer code has been used for the calculation of the reactor responses to the interactive commands initiated by the NPA operator. The interactive capabilities of the NPA have been developed to provide considerable flexibility in the plant actions that can be initiated by the operator. The current capabilities for both the VVER-440 and VVER-1000 models include: (1) scram initiation; (2) reactor coolant pump trip; (3) high pressure safety injection system initiation; (4) low pressure safety injection system initiation; (5) pressurizer safety valve opening; (6) steam generator relief/safety valve opening; (7) feedwater system initiation and trip; (8) turbine trip; and (9) emergency feedwater initiation. The NPA has the capability to display the results of the simulations in various forms that are determined by the model developer. Results displayed on the reactor mask are shown through the user defined, digital display of various plant parameters and through color changes that reflect changes in primary system fluid temperatures, fuel and clad temperatures, and the temperature of other metal structures. In addition, changes in the status of various components and systems can be initiated and/or displayed both numerically and graphically on the mask. This paper provides a description of the structure of the NPA, a discussion of the simulation models used for the VVER-440 and the VVER-1000, and an overview of the NPA capabilities. Typical results obtained using both simulation models will be discussed.

  19. Analyzing personalized policies for online biometric verification.

    Science.gov (United States)

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
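The likelihood-ratio decision rule described above can be sketched in one dimension (a toy illustration with made-up Gaussian score models; the paper fits a joint distribution over 12 fingerprint and iris scores): accept a resident as genuine when the likelihood of the observed score under the genuine model, divided by its likelihood under the imposter model, exceeds a threshold.

```python
import math


def gauss_pdf(x, mu, sd):
    """Density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))


def likelihood_ratio(score, genuine=(0.8, 0.1), imposter=(0.3, 0.15)):
    """LR = P(score | genuine) / P(score | imposter). The (mean, sd)
    parameters here are hypothetical, not fitted to any real data."""
    return gauss_pdf(score, *genuine) / gauss_pdf(score, *imposter)


def verify(score, threshold=1.0):
    """Accept as genuine when the likelihood ratio exceeds the threshold;
    the threshold would be tuned to the target FAR/FRR trade-off."""
    return likelihood_ratio(score) > threshold


print(verify(0.75), verify(0.35))
```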

  20. Novel topological descriptors for analyzing biological networks

    Directory of Open Access Journals (Sweden)

    Varmuza Kurt K

    2010-06-01

    Full Text Available Abstract Background Topological descriptors, other graph measures, and in a broader sense, graph-theoretical methods, have been proven as powerful tools to perform biological network analysis. However, the majority of the developed descriptors and graph-theoretical methods do not have the ability to take vertex- and edge-labels into account, e.g., atom- and bond-types when considering molecular graphs. Indeed, this feature is important to characterize biological networks more meaningfully instead of only considering pure topological information. Results In this paper, we put the emphasis on analyzing a special type of biological networks, namely bio-chemical structures. First, we derive entropic measures to calculate the information content of vertex- and edge-labeled graphs and investigate some useful properties thereof. Second, we apply the mentioned measures combined with other well-known descriptors to supervised machine learning methods for predicting Ames mutagenicity. Moreover, we investigate the influence of our topological descriptors - measures for only unlabeled vs. measures for labeled graphs - on the prediction performance of the underlying graph classification problem. Conclusions Our study demonstrates that the application of entropic measures to molecules representing graphs is useful to characterize such structures meaningfully. For instance, we have found that if one extends the measures for determining the structural information content of unlabeled graphs to labeled graphs, the uniqueness of the resulting indices is higher. Because measures to structurally characterize labeled graphs are clearly underrepresented so far, the further development of such methods might be valuable and fruitful for solving problems within biological network analysis.
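The labeled-versus-unlabeled point can be illustrated with a toy vertex-labeled graph (our own minimal sketch; the paper's entropic descriptors are more elaborate): refining the partition of vertices from degrees alone to (label, degree) pairs can only split classes further, so the labeled Shannon entropy is at least as large as the unlabeled one.

```python
import math
from collections import Counter


def partition_entropy(items):
    """Shannon entropy (bits) of the partition induced by equal items."""
    counts = Counter(items)
    n = len(items)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())


# Toy molecular graph: atom labels and vertex degrees (hypothetical values)
labels = ["C", "C", "C", "O", "N", "C"]
degrees = [2, 3, 2, 1, 1, 2]

h_unlabeled = partition_entropy(degrees)                 # degrees only
h_labeled = partition_entropy(list(zip(labels, degrees)))  # label-refined
print(f"unlabeled {h_unlabeled:.3f} bits, labeled {h_labeled:.3f} bits")
```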

  1. Sport Technology

    CSIR Research Space (South Africa)

    Kirkbride, T

    2007-11-01

    Full Text Available Technology is transforming the games themselves and at times with dire consequences. Tony Kirkbride, Head: CSIR Technology Centre said there are a variety of sports technologies and there have been advances in material sciences and advances...

  2. Assistive Technology

    Science.gov (United States)

    Assistive technology (AT) is any service or tool that helps ... be difficult or impossible. For older adults, such technology may be a walker to improve mobility or ...

  3. Thromboelastography platelet mapping in healthy dogs using 1 analyzer versus 2 analyzers.

    Science.gov (United States)

    Blois, Shauna L; Banerjee, Amrita; Wood, R Darren; Park, Fiona M

    2013-07-01

    The objective of this study was to describe the results of thromboelastography platelet mapping (TEG-PM) carried out using 2 techniques in 20 healthy dogs. Maximum amplitudes (MA) generated by thrombin (MAthrombin), fibrin (MAfibrin), adenosine diphosphate (ADP) receptor activity (MAADP), and thromboxane A2 (TxA2) receptor activity (stimulated by arachidonic acid, MAAA) were recorded. Thromboelastography platelet mapping was carried out according to the manufacturer's guidelines (2-analyzer technique) and using a variation of this method employing only 1 analyzer (1-analyzer technique) on 2 separate blood samples obtained from each dog. Mean [± standard deviation (SD)] MA values for the 1-analyzer/2-analyzer techniques were: MAthrombin = 51.9 mm (± 7.1)/52.5 mm (± 8.0); MAfibrin = 20.7 mm (± 21.8)/23.0 mm (± 26.1); MAADP = 44.5 mm (± 15.6)/45.6 mm (± 17.0); and MAAA = 45.7 mm (± 11.6)/45.0 mm (± 15.4). Mean (± SD) percentage aggregation due to ADP receptor activity was 70.4% (± 32.8)/67.6% (± 33.7). Mean percentage aggregation due to TxA2 receptor activity was 77.3% (± 31.6)/78.1% (± 50.2). Results of TEG-PM were not significantly different for the 1-analyzer and 2-analyzer methods. High correlation was found between the 2 methods for MAfibrin [concordance correlation coefficient (r) = 0.930]; moderate correlation was found for MAthrombin (r = 0.70) and MAADP (r = 0.57); correlation between the 2 methods for MAAA was lower (r = 0.32). Thromboelastography platelet mapping (TEG-PM) should be further investigated to determine if it is a suitable method for measuring platelet dysfunction in dogs with thrombopathy.
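The percentage-aggregation figures in the record can be connected to the MA values with the standard TEG PlateletMapping relation (assumed here from the general TEG-PM literature, not stated in the record); note that the group means below will not reproduce the reported percentages exactly, because a mean of per-dog ratios differs from a ratio of group means.

```python
def percent_aggregation(ma_agonist, ma_fibrin, ma_thrombin):
    """Standard TEG-PM computation: receptor-specific platelet contribution
    as a fraction of the full thrombin-stimulated clot strength."""
    return 100.0 * (ma_agonist - ma_fibrin) / (ma_thrombin - ma_fibrin)


# One-analyzer group means from the record, used purely as example inputs
ma_thrombin, ma_fibrin, ma_adp, ma_aa = 51.9, 20.7, 44.5, 45.7

adp_pct = percent_aggregation(ma_adp, ma_fibrin, ma_thrombin)
aa_pct = percent_aggregation(ma_aa, ma_fibrin, ma_thrombin)
print(f"ADP-receptor aggregation {adp_pct:.1f}%, TxA2-receptor {aa_pct:.1f}%")
```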

  4. Historical civilian nuclear accident based Nuclear Reactor Condition Analyzer

    Science.gov (United States)

    McCoy, Kaylyn Marie

    There are significant challenges to successfully monitoring multiple processes within a nuclear reactor facility. The evidence for this observation can be seen in the historical civilian nuclear incidents that have occurred with similar initiating conditions and sequences of events. Because the nuclear industry currently lacks monitoring of internal sensors across multiple processes for patterns of failure, this study has developed a program directed at accomplishing that charge through an innovation that monitors these systems simultaneously. The inclusion of digital sensor technology within the nuclear industry has appreciably increased computer systems' capabilities to manipulate sensor signals, thus making it possible to meet these monitoring challenges. One such manipulation of signal data has been explored in this study. The Nuclear Reactor Condition Analyzer (NRCA) program that has been developed for this research, with the assistance of the Nuclear Regulatory Commission's Graduate Fellowship, utilizes one-norm distance and kernel weighting equations to normalize all nuclear reactor parameters under the program's analysis. This normalization allows the program to set more consistent parameter value thresholds for a more simplified approach to analyzing the condition of the nuclear reactor under its scrutiny. The product of this research provides a means for the nuclear industry to implement a safety and monitoring program that can oversee the system parameters of a nuclear power reactor facility, like that of a nuclear power plant.
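A hypothetical sketch of how one-norm distances and kernel weights can normalize heterogeneous reactor parameters (the NRCA's actual equations are not given in the record, and every sensor value below is made up): compare the current sensor vector to a reference state with a one-norm distance, then map it to a bounded 0-1 "condition" score with a kernel weight.

```python
import numpy as np


def one_norm_distance(x, reference):
    """Sum of absolute deviations from the reference sensor vector."""
    return float(np.abs(np.asarray(x) - np.asarray(reference)).sum())


def kernel_weight(distance, bandwidth):
    """Gaussian kernel: 1.0 at the reference state, decaying with distance.
    Acts as a normalized condition score for thresholding."""
    return float(np.exp(-(distance / bandwidth) ** 2))


# Made-up sensor vectors: e.g. coolant temperature, pressure, flow
reference = [550.0, 15.5, 70.0]
nominal = [552.0, 15.4, 69.5]
upset = [590.0, 12.0, 40.0]

bw = 50.0  # bandwidth chosen arbitrarily for the illustration
w_nominal = kernel_weight(one_norm_distance(nominal, reference), bw)
w_upset = kernel_weight(one_norm_distance(upset, reference), bw)
print(f"nominal score {w_nominal:.3f}, upset score {w_upset:.3f}")
```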

  5. Capturing and analyzing wheelchair maneuvering patterns with mobile cloud computing.

    Science.gov (United States)

    Fu, Jicheng; Hao, Wei; White, Travis; Yan, Yuqing; Jones, Maria; Jan, Yih-Kuen

    2013-01-01

    Power wheelchairs have been widely used to provide independent mobility to people with disabilities. Despite great advancements in power wheelchair technology, research shows that wheelchair-related accidents occur frequently. To ensure safe maneuverability, capturing wheelchair maneuvering patterns is fundamental to enabling other research, such as safe robotic assistance for wheelchair users. In this study, we propose to record, store, and analyze wheelchair maneuvering data by means of mobile cloud computing. Specifically, the accelerometer and gyroscope sensors in smart phones are used to record wheelchair maneuvering data in real-time. Then, the recorded data are periodically transmitted to the cloud for storage and analysis. The analyzed results are then made available to various types of users, such as mobile phone users, traditional desktop users, etc. The combination of mobile computing and cloud computing leverages the advantages of both techniques and extends the smart phone's capabilities of computing and data storage via the Internet. We performed a case study to implement the mobile cloud computing framework using Android smart phones and Google App Engine, a popular cloud computing platform. Experimental results demonstrated the feasibility of the proposed mobile cloud computing framework.

  6. analyzers in overweight/obese renal patients

    Directory of Open Access Journals (Sweden)

    Mariusz Kusztal

    2015-05-01

    Full Text Available Bioelectrical impedance analysis (BIA) is an affordable, non-invasive and fast alternative method to assess body composition. The purpose of this study was to compare two different tetrapolar BIA devices for estimating body fluid volumes and body cell mass (BCM) in a clinical setting among patients with kidney failure. All double measurements were performed by multi-frequency (MF) and single-frequency (SF) BIA analyzers: a Body Composition Monitor (Fresenius Medical Care, Germany) and BIA-101 (Akern, Italy), respectively. All procedures were conducted according to the manufacturers' instructions (dedicated electrodes, measurement sites, positions, etc.). Total body water (TBW), extracellular water (ECW), intracellular water (ICW) and BCM were compared. The study included 39 chronic kidney disease patients (stage III-V) with a mean age of 45.8 ± 8 years (21 men and 18 women) who had a wide range of BMI [17-34 kg/m2 (mean 26.6 ± 5)]. A comparison of results from patients with BMI <25 vs ≥25 revealed a significant discrepancy in measurements between the two BIA devices. Namely, in the group with BMI <25 (n=16) acceptable correlations were obtained in TBW (r 0.99; p<0.01), ICW (0.92; p<0.01), BCM (0.68; p<0.01), and ECW (0.96; p<0.05), but those with BMI ≥25 (n=23) showed a discrepancy (lower correlations) in TBW (r 0.82; p<0.05), ICW (0.78; p<0.05), BCM (0.52; p<0.05), and ECW (0.76; p<0.01). Since estimates of TBW, ICW and BCM by the present BIA devices do not differ in patients with BMI <25, they might be interchangeable. This does not hold true for overweight/obese renal patients.

  7. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2014-12-01

    We have developed a cloud-enabled web-service system that empowers physics-based, multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks. The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the observational datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs, and (3) ECMWF reanalysis outputs for several environmental variables in order to supplement observational datasets. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, (4) the calculation of difference between two variables, and (5) the conditional sampling of one physical variable with respect to another variable. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA will be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. In order to support 30+ simultaneous users during the school, we have deployed CMDA to the Amazon cloud environment. The cloud-enabled CMDA will provide each student with a virtual machine while the user interaction with the system will remain the same.
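
    The first analysis capability listed, annual and seasonal means, is simple to illustrate. The sketch below (not CMDA code; function name and season grouping convention are assumptions) computes climatological means from a 12-entry monthly series using the standard DJF/MAM/JJA/SON seasons:

    ```python
    def seasonal_means(monthly):
        """Annual and seasonal means from a 12-entry monthly series
        (index 0 = January .. 11 = December)."""
        assert len(monthly) == 12, "expects one value per calendar month"
        seasons = {
            "DJF": [11, 0, 1],   # Dec, Jan, Feb
            "MAM": [2, 3, 4],
            "JJA": [5, 6, 7],
            "SON": [8, 9, 10],
        }
        out = {name: sum(monthly[i] for i in idx) / 3.0
               for name, idx in seasons.items()}
        out["annual"] = sum(monthly) / 12.0
        return out
    ```

    A regional time-evolution service like CMDA's item (2) would apply the same reduction per grid cell after area-averaging over the chosen region.
    
    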

  8. The Albuquerque Seismological Laboratory Data Quality Analyzer

    Science.gov (United States)

    Ringler, A. T.; Hagerty, M.; Holland, J.; Gee, L. S.; Wilson, D.

    2013-12-01

    The U.S. Geological Survey's Albuquerque Seismological Laboratory (ASL) has several efforts underway to improve data quality at its stations. The Data Quality Analyzer (DQA) is one such development. The DQA is designed to characterize station data quality in a quantitative and automated manner. Station quality is based on the evaluation of various metrics, such as timing quality, noise levels, sensor coherence, and so on. These metrics are aggregated into a measurable grade for each station. The DQA consists of a website, a metric calculator (Seedscan), and a PostgreSQL database. The website allows the user to make requests for various time periods, review specific networks and stations, adjust weighting of the station's grade, and plot metrics as a function of time. The website dynamically loads all station data from a PostgreSQL database. The database is central to the application; it acts as a hub where metric values and limited station descriptions are stored. Data is stored at the level of one sensor's channel per day. The database is populated by Seedscan. Seedscan reads and processes miniSEED data, to generate metric values. Seedscan, written in Java, compares hashes of metadata and data to detect changes and perform subsequent recalculations. This ensures that the metric values are up to date and accurate. Seedscan can be run in a scheduled task or on demand by way of a config file. It will compute metrics specified in its configuration file. While many metrics are currently in development, some are completed and being actively used. These include: availability, timing quality, gap count, deviation from the New Low Noise Model, deviation from a station's noise baseline, inter-sensor coherence, and data-synthetic fits. In all, 20 metrics are planned, but any number could be added. ASL is actively using the DQA on a daily basis for station diagnostics and evaluation. 
As Seedscan is scheduled to run every night, data quality analysts are able to then use the
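
    Seedscan's change detection, comparing hashes of data and metadata to decide whether metric values need recalculation, can be sketched as follows. This is an illustrative sketch, not the actual Seedscan internals; the cache layout and function names are assumptions:

    ```python
    import hashlib

    def digest(payload: bytes) -> str:
        """Stable fingerprint of a data or metadata blob."""
        return hashlib.sha256(payload).hexdigest()

    def needs_recalculation(data: bytes, metadata: bytes,
                            cache: dict, key: str) -> bool:
        """Return True (and update the cache) when either the data or the
        metadata hash differs from what was seen last time; otherwise the
        stored metric values are still up to date."""
        current = (digest(data), digest(metadata))
        if cache.get(key) == current:
            return False
        cache[key] = current
        return True
    ```

    In the DQA the `key` would identify one sensor channel per day, matching the storage granularity described above.
    
    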

  9. Nano technology

    International Nuclear Information System (INIS)

    Lee, In Sik

    2002-03-01

    This book is an introduction to nano technology, describing what nano technology is, the alpha and omega of nano technology, the future of Korean nano technology, and nano technology and humanity's future. Its contents cover: the coming nano era, an engine of creation, what molecular engineering is, huge nano technology, techniques for making small things, nano materials with extraordinary possibilities, the key to the nano world, the most desirable nano technology in the bio industry, the government's nano development plan, the direction of development for nano technology, and children of heart.

  10. Rover Technologies

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop and mature rover technologies supporting robotic exploration including rover design, controlling rovers over time delay and for exploring . Technology...

  11. MIR hollow waveguide (HWG) isotope ratio analyzer for environmental applications

    Science.gov (United States)

    Wang, Zhenyou; Zhuang, Yan; Deev, Andrei; Wu, Sheng

    2017-05-01

    An advanced commercial Mid-InfraRed Isotope Ratio (IR2) analyzer was developed at Arrow Grand Technologies based on a hollow waveguide (HWG) as the sample tube. The stable carbon isotope ratio, i.e. δ13C, was obtained by measuring selected CO2 absorption peaks in the MIR. Combined with a GC and a combustor, it has been successfully employed to measure compound-specific δ13C isotope ratios in the field. By using both the 1-pass HWG and 5-path HWG, we are able to measure the δ13C isotope ratio over a broad CO2 concentration range of 300 ppm-37,500 ppm. Here, we demonstrate its applications in environmental studies. The δ13C isotope ratio and concentration of CO2 exhaled by soil samples were measured in real time with the isotope analyzer. The concentration was found to change with time. We also convert the Dissolved Inorganic Carbon (DIC) into CO2, and then measure the δ13C isotope ratio with an accuracy of better than 0.3 ‰ (1 σ) with a 6 min test time and 1 ml sample usage. Tap water, NaHCO3 solvent, cola, and even beer were tested. Lastly, the 13C isotope ratio of CO2 exhaled by human beings was obtained <10 seconds after simply blowing the exhaled CO2 into a tube, with an accuracy of 0.5 ‰ (1 σ) without sample preconditioning. In summary, a commercial HWG isotope analyzer was demonstrated to be able to perform environmental and health studies with high accuracy (0.3 ‰/Hz^(1/2), 1 σ), fast sampling rate (up to 10 Hz), low sample consumption (1 ml), and broad CO2 concentration range (300 ppm-37,500 ppm).
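
    The δ13C values quoted throughout are standard per-mil delta notation: the measured 13C/12C ratio relative to the VPDB reference. A minimal sketch of the conversion (the VPDB ratio constant below is a commonly quoted literature value, not taken from this paper):

    ```python
    # 13C/12C ratio of the VPDB standard (commonly quoted value).
    VPDB_R13 = 0.0111802

    def delta13C(r_sample, r_standard=VPDB_R13):
        """delta-13C in per-mil (permil) from a measured 13C/12C ratio:
        delta = (R_sample / R_standard - 1) * 1000."""
        return (r_sample / r_standard - 1.0) * 1000.0
    ```

    A sample whose 13C/12C ratio is 2.5% below VPDB thus reads as δ13C = -25 ‰, a typical value for plant-derived carbon.
    
    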

  12. A Portable, Field-Deployable Analyzer for Isotopic Water Measurements

    Science.gov (United States)

    Berman, E. S.; Gupta, M.; Huang, Y. W.; Lacelle, D.; McKay, C. P.; Fortson, S.

    2015-12-01

    Water stable isotopes have for many years been used to study the hydrological cycle, catchment hydrology, and polar climate among other applications. Typically, discrete water samples are collected and transported to a laboratory for isotope analysis. Due to the expense and labor associated with such sampling, isotope studies have generally been limited in scope and time-resolution. Field sampling of water isotopes has been shown in recent years to provide dense data sets with the increased time resolution illuminating substantially greater short term variability than is generally observed during discrete sampling. A truly portable instrument also opens the possibility to utilize the instrument as a tool for identifying which water samples would be particularly interesting for further laboratory investigation. To make possible such field measurements of liquid water isotopes, Los Gatos Research has developed a miniaturized, field-deployable liquid water isotope analyzer. The prototype miniature liquid water isotope analyzer (mini-LWIA) uses LGR's patented Off-Axis ICOS (Integrated Cavity Output Spectroscopy) technology in a rugged, Pelican case housing for easy transport and field operations. The analyzer simultaneously measures both δ2H and δ18O from liquid water, with both manual and automatic water introduction options. The laboratory precision for δ2H is 0.6 ‰, and for δ18O is 0.3 ‰. The mini-LWIA was deployed in the high Arctic during the summer of 2015 at Inuvik in the Canadian Northwest Territories. Samples were collected from Sachs Harbor, on the southwest coast of Banks Island, including buried basal ice from the Laurentide Ice Sheet, some ice wedges, and other types of ground ice. Methodology and water analysis results from this extreme field deployment will be presented.

  13. Skylab medical technology utilization

    Science.gov (United States)

    Stonesifer, J. C.

    1974-01-01

    To perform the extensive medical experimentation on man in a long-term, zero-g environment, new medical measuring and monitoring equipment had to be developed, new techniques in training and operations were required, and new methods of collecting and analyzing the great amounts of medical data were developed. Examples of technology transfers to the public sector resulted from the development of new equipment, methods, techniques, and data. This paper describes several of the examples that stemmed directly from Skylab technology.

  14. Qualitative Education Management Based on Information Technologies

    OpenAIRE

    Natal'ya M. Obolyaeva

    2012-01-01

    The article deals with qualitative education management through information technologies. Different approaches to defining the quality of education are considered. The interpretation of qualitative assessment of education is analyzed. Qualitative education management based on information technologies is shown in detail. The key advantages of applying such technologies at institutions of higher learning are analyzed.

  15. Qualitative Education Management Based on Information Technologies

    Directory of Open Access Journals (Sweden)

    Natal'ya M. Obolyaeva

    2012-12-01

    Full Text Available The article deals with qualitative education management through information technologies. Different approaches to defining the quality of education are considered. The interpretation of qualitative assessment of education is analyzed. Qualitative education management based on information technologies is shown in detail. The key advantages of applying such technologies at institutions of higher learning are analyzed.

  16. A Two-Tiered Model for Analyzing Library Web Site Usage Statistics, Part 1: Web Server Logs.

    Science.gov (United States)

    Cohen, Laura B.

    2003-01-01

    Proposes a two-tiered model for analyzing web site usage statistics for academic libraries: one tier for library administrators that analyzes measures indicating library use, and a second tier for web site managers that analyzes measures aiding in server maintenance and site design. Discusses the technology of web site usage statistics, and…
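
    The two-tier split proposed above, use-oriented measures for library administrators versus maintenance-oriented measures for site managers, can be illustrated by parsing Common Log Format entries and tallying page views separately from HTTP status codes. A sketch (the regex field names and the ".html or /" page heuristic are assumptions for illustration):

    ```python
    import re

    # Apache/NCSA Common Log Format: host ident authuser [time] "request" status bytes
    CLF = re.compile(
        r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
        r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
    )

    def tally(log_lines):
        """Split raw hits into a use-oriented count (HTML page requests,
        tier 1) and maintenance-oriented per-status totals (tier 2)."""
        pages, status_counts = 0, {}
        for line in log_lines:
            m = CLF.match(line)
            if not m:
                continue  # skip malformed entries
            status_counts[m["status"]] = status_counts.get(m["status"], 0) + 1
            if m["path"].endswith((".html", "/")):
                pages += 1
        return pages, status_counts
    ```

    Administrators would read the page count as a proxy for library use; site managers would watch the 404/500 totals for broken links and server trouble.
    
    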

  17. Survey and research on how large-scale technological development should be in the future; Kongo no ogata gijutsu kaihatsu no hoko ni tsuite no chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1982-03-01

    Tasks to be subjected to research and development under the large-scale industrial technology research and development system are discussed. Mentioned in the fields of resources and foods are a submarine metal sulfide mining system, a submarine oil development system for ice-covered sea areas, an all-weather type useful vegetable automatic production system, etc. Mentioned in the fields of social development, security, and disaster prevention are a construction work robot, shelter system technologies, disaster control technologies in case of mega-scale disasters, etc. Mentioned in the fields of health, welfare, and education are biomimetics, biosystems, cancer diagnosis and treatment systems, etc. Mentioned in the field of commodity distribution, service, and software are a computer security system, an unmanned collection and distribution system, etc. Mentioned in the field of process conversion are aluminum refining, synzyme technologies for precise synthesis, etc. Mentioned in the field of data processing are optical computers, bioelectronics, etc. Various tasks are pointed out also in the fields of aviation, space, ocean, and machining. (NEDO)

  18. Analyzers Measure Greenhouse Gases, Airborne Pollutants

    Science.gov (United States)

    2012-01-01

    In complete darkness, a NASA observatory waits. When an eruption of boiling water billows from a nearby crack in the ground, the observatory's sensors seek particles in the fluid, measure shifts in carbon isotopes, and analyze samples for biological signatures. NASA has landed the observatory in this remote location, far removed from air and sunlight, to find life unlike any that scientists have ever seen. It might sound like a scene from a distant planet, but this NASA mission is actually exploring an ocean floor right here on Earth. NASA established a formal exobiology program in 1960, which expanded into the present-day Astrobiology Program. The program, which celebrated its 50th anniversary in 2010, not only explores the possibility of life elsewhere in the universe, but also examines how life begins and evolves, and what the future may hold for life on Earth and other planets. Answers to these questions may be found not only by launching rockets skyward, but by sending probes in the opposite direction. Research here on Earth can revise prevailing concepts of life and biochemistry and point to the possibilities for life on other planets, as was demonstrated in December 2010, when NASA researchers discovered microbes in Mono Lake in California that subsist and reproduce using arsenic, a toxic chemical. The Mono Lake discovery may be the first of many that could reveal possible models for extraterrestrial life. One primary area of interest for NASA astrobiologists lies with the hydrothermal vents on the ocean floor. These vents expel jets of water heated and enriched with chemicals from off-gassing magma below the Earth's crust. Also potentially within the vents: microbes that, like the Mono Lake microorganisms, defy the common characteristics of life on Earth. "Basically all organisms on our planet generate energy through the Krebs Cycle," explains Mike Flynn, research scientist at NASA's Ames Research Center. 
This metabolic process breaks down sugars for energy

  19. Essays in technology adoption and corporate finance

    OpenAIRE

    Patel, Pratish

    2013-01-01

    This dissertation consists of three chapters that concern technology adoption and corporate finance. The first chapter analyzes the optimal investment strategy of two firms confronted with the option to adopt a new technology. The second chapter analyzes the link between debt maturity and term spread. The third chapter analyzes the role of debt financing on skyscraper heights.

  20. Electronic technology

    International Nuclear Information System (INIS)

    Kim, Jin Su

    2010-07-01

    This book is composed of five chapters introducing electronic technology: the basics of electronics, electronic components, radio, electronic applications, and communication technology. Topics include semiconductor basics (free electrons and holes, intrinsic semiconductors, and semiconductor elements), diodes (the PN junction diode, characteristics of the junction diode, rectifier circuits and smoothing circuits), transistors (the structure of the transistor, characteristics of the transistor, and the common-emitter circuit), and electronic applications (electronic equipment, communication technology and education, robot technology, and advanced electronic technology).

  1. Development of Modulators Against Degenerative Aging Using Radiation Fusion Technology

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Sung Kee; Jung, U.; Park, H. R.

    2010-04-15

    In this study, we selected the final 20 biomarkers for degenerative aging to develop radiation aging modeling, and validated a few of the selected markers to utilize them in the screening of aging modulators. To select the biomarkers of degenerative aging, 4 categories of aging-related markers (immune/hematopoiesis, oxidative damage, signaling molecule, lipid metabolism) were comparatively analyzed in irradiated and normally aged biosystems (cell lines or mice). As a result, most of the biomarkers showed similar changes under irradiation and normal aging. Regarding immune/hematopoiesis, a decline of immune cell functions (lymphocyte, NK cell), Th1/Th2 imbalance, and decreased antigen presentation of dendritic cells were observed, and 10 biomarkers were selected in this category. mtDNA deletion was selected as the oxidative damage marker, 6 biomarkers including p21 and p-FOXO3a as signaling molecule biomarkers, and 3 biomarkers including adipose tissue weight were selected for lipid metabolism. In addition, various radiation application conditions (single/fractionated irradiation and the period after irradiation) were investigated for the optimal induction of biomarker changes, which revealed that a total of 5 Gy in 10 or more fractionated irradiations and a period of 4 months or greater were optimal. To establish the basis for the screening of natural aging modulators, some selected aging biomarkers were validated by their inhibition by well-known natural agents (EGCG, HemoHIM, etc.) in aged cell or mouse models. Additionally, by evaluating the reductive efficacy of 5 natural agents on the degeneration of skin and reproductive organs induced by radiation and chemicals (cyclophosphamide, etc.), we established the basis for the screening of modulators of degeneration caused by various factors.

  2. Development of Modulators Against Degenerative Aging Using Radiation Fusion Technology

    International Nuclear Information System (INIS)

    Jo, Sung Kee; Jung, U.; Park, H. R.

    2010-04-01

    In this study, we selected the final 20 biomarkers for degenerative aging to develop radiation aging modeling, and validated a few of the selected markers to utilize them in the screening of aging modulators. To select the biomarkers of degenerative aging, 4 categories of aging-related markers (immune/hematopoiesis, oxidative damage, signaling molecule, lipid metabolism) were comparatively analyzed in irradiated and normally aged biosystems (cell lines or mice). As a result, most of the biomarkers showed similar changes under irradiation and normal aging. Regarding immune/hematopoiesis, a decline of immune cell functions (lymphocyte, NK cell), Th1/Th2 imbalance, and decreased antigen presentation of dendritic cells were observed, and 10 biomarkers were selected in this category. mtDNA deletion was selected as the oxidative damage marker, 6 biomarkers including p21 and p-FOXO3a as signaling molecule biomarkers, and 3 biomarkers including adipose tissue weight were selected for lipid metabolism. In addition, various radiation application conditions (single/fractionated irradiation and the period after irradiation) were investigated for the optimal induction of biomarker changes, which revealed that a total of 5 Gy in 10 or more fractionated irradiations and a period of 4 months or greater were optimal. To establish the basis for the screening of natural aging modulators, some selected aging biomarkers were validated by their inhibition by well-known natural agents (EGCG, HemoHIM, etc.) in aged cell or mouse models. Additionally, by evaluating the reductive efficacy of 5 natural agents on the degeneration of skin and reproductive organs induced by radiation and chemicals (cyclophosphamide, etc.), we established the basis for the screening of modulators of degeneration caused by various factors.

  3. DMET-analyzer: automatic analysis of Affymetrix DMET data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Agapito, Giuseppe; Di Martino, Maria Teresa; Arbitrio, Mariamena; Tassone, Pierfrancesco; Tagliaferri, Pierosandro; Cannataro, Mario

    2012-10-05

    Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data, aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship among clinical states and biological machineries may help the development of this field. For instance, the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship among the variation of the genome of patients and drug metabolism, detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may allow, for instance, finding genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. We developed DMET-Analyzer, a tool for the automatic association analysis among the variation of the patient genomes and the clinical conditions of patients, i.e. the different responses to drugs. The proposed system allows: (i) automatizing the workflow of analysis of DMET-SNP data, avoiding the use of multiple tools; (ii) the automatic annotation of DMET-SNP data and the search in existing databases of SNPs (e.g. dbSNP); (iii) the association of SNPs with pathways through the search in PharmaGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by Affymetrix DMET-Console in an interactive way. The effectiveness and easy use of DMET-Analyzer is demonstrated through different case studies regarding the analysis of
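
    The association analysis this record describes, testing whether the presence of a SNP associates with drug response, is typically run on a 2x2 contingency table (carrier vs non-carrier against responder vs non-responder). Below is a sketch of a two-sided Fisher exact test on such a table; this is illustrative only, as the abstract does not state the exact statistic DMET-Analyzer implements:

    ```python
    from math import comb

    def fisher_exact_2x2(a, b, c, d):
        """Two-sided Fisher exact test for the table [[a, b], [c, d]]
        (e.g. SNP carrier/non-carrier x responder/non-responder):
        sum the hypergeometric probabilities of all tables with the same
        margins whose probability does not exceed the observed table's."""
        row1, row2 = a + b, c + d
        col1, n = a + c, a + b + c + d
        denom = comb(n, col1)

        def p_table(x):  # probability of the table with top-left cell = x
            return comb(row1, x) * comb(row2, col1 - x) / denom

        p_obs = p_table(a)
        lo, hi = max(0, col1 - row2), min(row1, col1)
        # small tolerance guards against float ties
        return sum(p for p in (p_table(x) for x in range(lo, hi + 1))
                   if p <= p_obs + 1e-12)
    ```

    For Fisher's classic balanced 4-vs-4 example, [[3, 1], [1, 3]], this yields the textbook two-sided p of 34/70 ≈ 0.486.
    
    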

  4. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    Full Text Available Abstract Background Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship among clinical states and biological machineries may help the development of this field. For instance the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship among the variation of the genome of patients and drug metabolism, detecting SNPs (Single Nucleotide Polymorphism) on genes related to drug metabolism. This may allow for instance to find genetic variants in patients which present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack in the development of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow to test the association of the presence of SNPs with the response to drugs. Results We developed DMET-Analyzer a tool for the automatic association analysis among the variation of the patient genomes and the clinical conditions of patients, i.e. the different response to drugs. The proposed system allows: (i) to automatize the workflow of analysis of DMET-SNP data avoiding the use of multiple tools; (ii) the automatic annotation of DMET-SNP data and the search in existing databases of SNPs (e.g. dbSNP); (iii) the association of SNP with pathway through the search in PharmaGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by Affymetrix DMET-Console in an interactive way. The effectiveness and easy use of DMET Analyzer is demonstrated through different

  5. Development of the Chinshan plant analyzer and its assessment with plant data

    International Nuclear Information System (INIS)

    Shihjen Wang; Chunsheng Chien; Jungyuh Jang; Shawcuang Lee

    1993-01-01

    To apply fast and accurate simulation techniques to Taiwanese nuclear power plants, plant analyzer technology was transferred to Taiwan from the Brookhaven National Laboratory (BNL) through a cooperative program. The Chinshan plant analyzer is developed on the AD100 peripheral processor systems, based on the BNL boiling water reactor plant analyzer. The BNL plant analyzer was first converted from MPS10 programming for AD10 to ADSIM programming for AD100. It was then modified for the Taiwan Power Company's Chinshan power station. The simulation speed of the Chinshan plant analyzer is eight times faster than real time. A load rejection transient performed at 100% of full power during startup tests was simulated with the Chinshan plant analyzer, and the results were benchmarked against test data. The comparison shows good agreement between calculated results and test data.

  6. Noise and analyzer-crystal angular position analysis for analyzer-based phase-contrast imaging

    Science.gov (United States)

    Majidi, Keivan; Li, Jun; Muehleman, Carol; Brankov, Jovan G.

    2014-04-01

    The analyzer-based phase-contrast x-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many of the modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the angular intensity profile of the x-ray beam passed through the object at different angular positions of the analyzer crystal. The noise in the estimated ABI parametric images depends upon imaging conditions like the source intensity (flux), measurements angular positions, object properties, and the estimation method. In this paper, we use the Cramér-Rao lower bound (CRLB) to quantify the noise properties in parametric images and to investigate the effect of source intensity, different analyzer-crystal angular positions and object properties on this bound, assuming a fixed radiation dose delivered to an object. The CRLB is the minimum bound for the variance of an unbiased estimator and defines the best noise performance that one can obtain regardless of which estimation method is used to estimate ABI parametric images. The main result of this paper is that the variance (hence the noise) in parametric images is directly proportional to the source intensity and only a limited number of analyzer-crystal angular measurements (eleven for uniform and three for optimal non-uniform) are required to get the best parametric images. The following angular measurements only spread the total dose to the measurements without improving or worsening CRLB, but the added measurements may improve parametric images by reducing estimation bias. Next, using CRLB we evaluate the multiple-image radiography, diffraction enhanced imaging and scatter diffraction enhanced imaging estimation techniques
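
    The paper's central observation, that with a fixed dose only a limited number of analyzer-crystal angular positions is needed before the bound stops improving, can be reproduced numerically for a toy model. The sketch below assumes a Gaussian rocking curve with Poisson counting noise and estimates the CRLB for the curve's center; this is an illustrative model, not the paper's full ABI forward model:

    ```python
    import math

    def crlb_center(angles, total_flux, sigma=1.0):
        """CRLB for the rocking-curve center mu under Poisson counting,
        with a fixed total dose split equally over the analyzer angles.
        Model: expected counts lambda_k = (D/K) * exp(-theta_k^2 / (2 sigma^2));
        Fisher information is sum_k (d lambda_k / d mu)^2 / lambda_k at mu = 0."""
        k = len(angles)
        per_angle = total_flux / k
        info = 0.0
        for th in angles:
            lam = per_angle * math.exp(-th * th / (2 * sigma * sigma))
            dlam = lam * th / (sigma * sigma)  # d lambda / d mu at mu = 0
            if lam > 0:
                info += dlam * dlam / lam
        return 1.0 / info  # minimum attainable variance

    def uniform(k, span=3.0):
        """k equally spaced analyzer positions over [-span, span]."""
        step = 2 * span / (k - 1)
        return [-span + i * step for i in range(k)]
    ```

    Going from 3 to 11 uniform positions improves the bound substantially, while going from 11 to 41 changes it only marginally, mirroring the paper's conclusion that extra angular samples mainly spread the dose rather than sharpen the bound.
    
    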

  7. Noise and analyzer-crystal angular position analysis for analyzer-based phase-contrast imaging

    International Nuclear Information System (INIS)

    Majidi, Keivan; Brankov, Jovan G; Li, Jun; Muehleman, Carol

    2014-01-01

    The analyzer-based phase-contrast x-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many of the modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the angular intensity profile of the x-ray beam passed through the object at different angular positions of the analyzer crystal. The noise in the estimated ABI parametric images depends upon imaging conditions like the source intensity (flux), measurements angular positions, object properties, and the estimation method. In this paper, we use the Cramér–Rao lower bound (CRLB) to quantify the noise properties in parametric images and to investigate the effect of source intensity, different analyzer-crystal angular positions and object properties on this bound, assuming a fixed radiation dose delivered to an object. The CRLB is the minimum bound for the variance of an unbiased estimator and defines the best noise performance that one can obtain regardless of which estimation method is used to estimate ABI parametric images. The main result of this paper is that the variance (hence the noise) in parametric images is directly proportional to the source intensity and only a limited number of analyzer-crystal angular measurements (eleven for uniform and three for optimal non-uniform) are required to get the best parametric images. The following angular measurements only spread the total dose to the measurements without improving or worsening CRLB, but the added measurements may improve parametric images by reducing estimation bias. Next, using CRLB we evaluate the multiple-image radiography, diffraction enhanced imaging and scatter diffraction enhanced imaging estimation techniques

  8. Comparative evaluation of Plateletworks, Multiplate analyzer and Platelet function analyzer-200 in cardiology patients.

    Science.gov (United States)

    Kim, Jeeyong; Cho, Chi Hyun; Jung, Bo Kyeung; Nam, Jeonghun; Seo, Hong Seog; Shin, Sehyun; Lim, Chae Seung

    2018-04-14

    The objective of this study was to comparatively evaluate three commercial whole-blood platelet function analyzer systems: Platelet Function Analyzer-200 (PFA; Siemens Canada, Mississauga, Ontario, Canada), Multiplate analyzer (MP; Roche Diagnostics International Ltd., Rotkreuz, Switzerland), and Plateletworks Combo-25 kit (PLW; Helena Laboratories, Beaumont, TX, USA). Venipuncture was performed on 160 patients who visited a department of cardiology. Pairwise agreement among the three platelet function assays was assessed using Cohen's kappa coefficient and percent agreement within the reference limit. Kappa values with the same agonists were poor between PFA-collagen (COL; agonist)/adenosine diphosphate (ADP) and MP-ADP (-0.147), PFA-COL/ADP and PLW-ADP (0.089), MP-ADP and PLW-ADP (0.039), PFA-COL/ADP and MP-COL (-0.039), and between PFA-COL/ADP and PLW-COL (-0.067). Nonetheless, kappa values for the same assay principle with a different agonist were slightly higher between PFA-COL/ADP and PFA-COL/EPI (0.352), MP-ADP and MP-COL (0.235), and between PLW-ADP and PLW-COL (0.247). The range of percent agreement values was 38.7% to 73.8%. Therefore, various measurements of platelet function by more than one method were needed to obtain a reliable interpretation of platelet function considering low kappa coefficient and modest percent agreement rates among 3 different platelet function tests.
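
    Cohen's kappa, the agreement statistic used above, is straightforward to compute from paired categorical calls: observed agreement corrected for the agreement expected by chance. A sketch (the normal/abnormal dichotomization in the example is an assumed illustration, not the study's cutoffs):

    ```python
    def cohens_kappa(ratings_a, ratings_b):
        """Cohen's kappa for two paired categorical ratings, e.g.
        normal/abnormal calls from two platelet function analyzers.
        kappa = (p_observed - p_expected) / (1 - p_expected)."""
        assert len(ratings_a) == len(ratings_b)
        n = len(ratings_a)
        cats = sorted(set(ratings_a) | set(ratings_b))
        observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
        expected = sum(
            (ratings_a.count(c) / n) * (ratings_b.count(c) / n) for c in cats
        )
        return (observed - expected) / (1 - expected)
    ```

    Kappa is 1 for perfect agreement and 0 when the raw percent agreement is exactly what chance predicts, which is why the study's modest percent agreement can coexist with near-zero (even negative) kappa values.
    
    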

  9. Design of multi-channel amplitude analyzer base on LonWorks

    International Nuclear Information System (INIS)

    Zhang Ying; Zhao Lihong; Chen Aihua

    2008-01-01

    The paper introduces a multi-channel analyzer based on LonWorks technology. The system detects the pulse peak with hardware circuits and controls data acquisition and network communication with a Micro Controller Unit (MCU) and a Neuron chip. The MCU is programmed in Keil C51; the communication between the MCU and the Neuron chip is implemented in Neuron C, and the host computer program is written in VB. Test results show that the analyzer achieves fast conversion speed and low power consumption. (authors)

  10. Analytical review of modern information education technologies

    OpenAIRE

    Светлана Викторовна Зенкина; О П Панкратова

    2014-01-01

    This article discusses and analyzes modern information education technologies, which are seen as priorities for use in the modern information educational environment (Internet-based educational technologies, distance education, media education, e-Learning technologies, smart-education technologies).

  11. Micro-Analyzer: automatic preprocessing of Affymetrix microarray data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Cannataro, Mario

    2013-08-01

    A current trend in genomics is the investigation of the cell mechanism using different technologies, in order to explain the relationship among genes, molecular processes and diseases. For instance, the combined use of gene-expression arrays and genomic arrays has been demonstrated as an effective instrument in clinical practice. Consequently, in a single experiment different kinds of microarrays may be used, resulting in the production of different types of binary data (images and textual raw data). The analysis of microarray data requires an initial preprocessing phase that makes raw data suitable for use on existing analysis platforms, such as the TIGR M4 (TM4) Suite. An additional challenge to be faced by emerging data analysis platforms is the ability to treat in a combined way those different microarray formats coupled with clinical data. In fact, the resulting integrated data may include both numerical and symbolic data (e.g. gene expression and SNPs regarding molecular data), as well as temporal data (e.g. the response to a drug, time to progression and survival rate) regarding clinical data. Raw data preprocessing is a crucial step in analysis but is often performed in a manual and error-prone way using different software tools. Thus novel, platform-independent, and possibly open-source tools enabling the semi-automatic preprocessing and annotation of different microarray data are needed. The paper presents Micro-Analyzer (Microarray Analyzer), a cross-platform tool for the automatic normalization, summarization and annotation of Affymetrix gene expression and SNP binary data. It represents the evolution of the μ-CS tool, extending the preprocessing to SNP arrays, which were not supported in μ-CS. Micro-Analyzer is provided as a standalone Java tool and enables users to read, preprocess and analyse binary microarray data (gene expression and SNPs) by invoking the TM4 platform. It avoids: (i) the manual invocation of external tools (e.g. the Affymetrix Power

  12. A Methodology for Capturing and Analyzing Data from Technology Base Seminar Wargames

    Science.gov (United States)

    1991-09-01


  13. Analyzing the Effects of Technological Change: A Computable General Equilibrium Approach

    Science.gov (United States)

    1988-09-01

    present important simplifying assumptions about the nature of consumer preferences and production possibility sets. If a general equilibrium model... important assumptions are in such areas as consumer preferences, the actions of the government, and the financial structure of the model. Each of these is... back in the future. 4.3.2 Consumer demand: Consumer preferences are a second important modeling assumption affecting the results of the study. The PILOT

  14. A Hybrid Method of Analyzing Patents for Sustainable Technology Management in Humanoid Robot Industry

    OpenAIRE

    Jongchan Kim; Joonhyuck Lee; Gabjo Kim; Sangsung Park; Dongsik Jang

    2016-01-01

    A humanoid, which refers to a robot that resembles a human body, imitates a human’s intelligence, behavior, sense, and interaction in order to provide various types of services to human beings. Humanoids have been studied and developed constantly in order to improve their performance. Humanoids were previously developed for simple repetitive or hard work that required significant human power. However, intelligent service robots have been developed actively these days to provide necessary info...

  15. Digital Media in Primary Schools: Literacy or Technology? Analyzing Government and Media Discourses

    Science.gov (United States)

    Pereira, Sara; Pereira, Luís

    2015-01-01

    This article examines the political and the media discourses concerning the Portuguese governmental program responsible for delivering a laptop named "Magalhães" to all primary school children. The analysis is based on the official documents related to the launch and development of the initiative as well as the press coverage of this…

  16. Point-of-care, portable microfluidic blood analyzer system

    Science.gov (United States)

    Maleki, Teimour; Fricke, Todd; Quesenberry, J. T.; Todd, Paul W.; Leary, James F.

    2012-03-01

    Recent advances in MEMS technology have provided an opportunity to develop microfluidic devices with enormous potential as portable, point-of-care, low-cost medical diagnostic tools. Hand-held flow cytometers will soon be used in disease diagnosis and monitoring. Despite much interest in miniaturizing commercially available cytometers, they remain costly, bulky, and require expert operation. In this article, we report progress on the development of a battery-powered handheld blood analyzer that will quickly and automatically process a drop of whole human blood by real-time, on-chip magnetic separation of white blood cells (WBCs), fluorescence analysis of labeled WBC subsets, and counting of a reproducible fraction of the red blood cells (RBCs) by light scattering. The whole blood (WB) analyzer is composed of a micro-mixer, a special branching/separation system, an optical detection system, and electronic readout circuitry. A droplet of unprocessed blood is mixed with the reagents, i.e. magnetic beads and fluorescent stain, in the micro-mixer. Valve-less sorting is achieved by magnetic deflection of magnetic microparticle-labeled WBCs. LED excitation in combination with an avalanche photodiode (APD) detection system is used for counting fluorescent WBC subsets using several colors of immuno-Qdots, while counting of a reproducible fraction of RBCs is performed using a laser light scattering measurement with a photodiode. Optimized branching/channel width is achieved using Comsol Multi-Physics™ simulation. To accommodate full portability, all required power supplies (40 V, ±10 V, and +3 V) are provided via step-up voltage converters from one battery. A simple onboard lock-in amplifier is used to increase the sensitivity/resolution of the pulse counting circuitry.

  17. Calibration of carbon analyzer LECO type IR-212

    International Nuclear Information System (INIS)

    Lilis Windaryati; Pranjono; Galuh Sri Banawa

    2013-01-01

    Calibration of the Carbon Analyzer LECO type IR-212 has been carried out. The aim of this work is to study the performance of the carbon analyzer LECO type IR-212 to assure its accuracy. The experiment includes a series of performance adjustments using standard material traceable nationally/internationally. The standard material used for the calibration is a carbon standard manufactured by LECO, which refers to National Institute of Standards and Technology (NIST) Standard Reference Materials (SRM) with a traceable certificate. The method used is based on an Application Bulletin of the Leco Corporation. The carbon content used in the experiment varies from 0.0097% to 0.8110%, namely 0.0097 ± 0.0014%; 0.0348 ± 0.0013%; 0.1770 ± 0.003% and 0.8110 ± 0.007%. The analysis results for those compositions are 0.0097 ± 0.000175%; 0.03474 ± 0.000152%; 0.1762 ± 0.00228% and 0.80982 ± 0.000958% (mean value and standard deviation, respectively). The measurement closest to the true value is that of the standard sample with a content of 0.811%, with a correction factor of 1.0015. The 0.0348% sample gives the lowest standard deviation, i.e. 0.000152. The analysis results are considered sufficiently stable, with a linear calibration curve of y = 0.9984x and correlation coefficient R² = 1. (author)
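
    The reported calibration curve can be reproduced from the certified and measured values quoted in the abstract with a zero-intercept least-squares fit. This is a sketch: the fitting choice is an assumption for illustration, not necessarily the procedure prescribed by the Application Bulletin.

```python
# Zero-intercept least-squares fit y = m*x of measured vs certified carbon (%),
# using the four values quoted in the abstract.
certified = [0.0097, 0.0348, 0.1770, 0.8110]   # standard contents (%)
measured  = [0.0097, 0.03474, 0.1762, 0.80982] # analyzer mean results (%)

m = sum(c * y for c, y in zip(certified, measured)) / sum(c * c for c in certified)
# m ~= 0.9984, matching the reported calibration slope y = 0.9984x
correction = certified[-1] / measured[-1]   # correction factor at the 0.811% standard
```

    The fit is dominated by the largest standard, which is why the slope agrees so closely with the correction factor reported for the 0.811% sample.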

  18. Multichannel analyzer for nuclear spectrometry with FPGA using Vivado

    International Nuclear Information System (INIS)

    Garcia D, A.; Hernandez D, V. M.; Vega C, H. R.; Ordaz G, O. O.; Ibarra D, S.; Bravo M, I.

    2017-09-01

    The different applications of ionizing radiation have made it a very significant and useful tool; at the same time, it can be dangerous for living beings exposed to uncontrolled doses. Because radiation cannot be perceived by the five senses, detectors and additional devices are required to quantify and classify it. This is the case of the multichannel analyzer, which separates the pulse heights generated in the detector into a certain number of channels, according to the number of bits of the analog-to-digital converter. The development of nuclear instrumentation has increased considerably due to the demand of applications, making it possible to develop systems that meet commercial requirements of cost and volume in relation to user needs. The objective of this work was to design and implement an intellectual property core (IP Core) that functions as a multichannel analyzer for nuclear spectrometry. Following the IP Core design methodology, its components were created in the VHDL hardware description language and packaged in the Vivado design suite, making use of resources such as the ARM processor cores contained in the Zynq chip. In the first phase of the implementation, the hardware architecture was embedded in the FPGA and the application for the ARM processor was programmed in C. In the second phase, a virtual instrument for management, control and visualization of the results was developed on the LabView programming platform. The data obtained from the IP Core are displayed graphically in a histogram that is part of the aforementioned virtual instrument. (Author)
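
    Functionally, a multichannel analyzer accumulates a histogram of digitized pulse heights, one channel per ADC code. A language-agnostic sketch of that core behavior (written in Python rather than the paper's VHDL; the channel count and the clipping policy are assumptions):

```python
# Core MCA behavior: histogram digitized pulse heights into 2^adc_bits channels.
# Out-of-range codes are clipped to the edge channels (an assumed policy).
def mca_histogram(adc_codes, adc_bits=12):
    n_channels = 1 << adc_bits
    channels = [0] * n_channels
    for code in adc_codes:
        channels[max(0, min(n_channels - 1, code))] += 1
    return channels

spectrum = mca_histogram([100, 100, 101, 4095, 5000])
# spectrum[100] == 2 and spectrum[101] == 1; 5000 is clipped into channel 4095
```

    In the FPGA implementation this loop corresponds to a block-RAM read-increment-write per detected pulse peak.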

  19. Technology-Enhanced Discovery

    Science.gov (United States)

    Harrow, Chris; Chin, Lillian

    2014-01-01

    Exploration, innovation, proof: For students, teachers, and others who are curious, keeping an open mind and being ready to investigate unusual or unexpected properties will always lead to learning something new. Technology can further this process, allowing various behaviors to be analyzed that were previously memorized or poorly understood. This…

  20. The technological conception

    International Nuclear Information System (INIS)

    Parrochia, D.

    1998-01-01

    The 'technological conception' examines how a project can be concretized, i.e. how it is possible to 'conceive': to produce operative ideas that can be directly used. The first part of this book, called 'concepts and methods', analyzes the logic of conceiving and its philosophy in the construction of its objects and in the management of its programs or projects. The second part is devoted to some exemplary technologies: roads, tunnels, bridges, dams, nuclear power plants, aerospace constructions, and analyzes different concrete logics of technological conception. Finally, the author shows how today's conception faces the increasing risks and complexity of systems, and considers the possibility of an entirely automated manufacturing shop in the future. (J.S.)

  1. Casting Technology.

    Science.gov (United States)

    Wright, Michael D.; And Others

    1992-01-01

    Three articles discuss (1) casting technology as it relates to industry, with comparisons of shell casting, shell molding, and die casting; (2) evaporative pattern casting for metals; and (3) high technological casting with silicone rubber. (JOW)

  2. Living Technology

    DEFF Research Database (Denmark)

    2010-01-01

    This book is aimed at anyone who is interested in learning more about living technology, whether coming from business, the government, policy centers, academia, or anywhere else. Its purpose is to help people to learn what living technology is, what it might develop into, and how it might impact... our lives. The phrase 'living technology' was coined to refer to technology that is alive as well as technology that is useful because it shares the fundamental properties of living systems. In particular, the invention of this phrase was called for to describe the trend of our technology becoming... increasingly life-like or literally alive. Still, the phrase has different interpretations depending on how one views what life is. This book presents nineteen perspectives on living technology. Taken together, the interviews convey the collective wisdom on living technology's power and promise, as well as its...

  3. Technology transfer

    International Nuclear Information System (INIS)

    1998-01-01

    On the basis of the technological opportunities and environmental targets of the various sectors of the energy system, this paper intends to link these opportunities and objectives with economic and social development through technology transfer and information dissemination.

  4. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Roozbeh Rashed

    2013-01-01

    Full Text Available Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO .NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 common variables are predefined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  5. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Farzin Heravi

    2012-09-01

    Full Text Available Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO .NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 common variables are predefined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  6. Technology in Intersecting Figured Worlds.

    Science.gov (United States)

    Esbensen, Gertrud Lynge; Hasse, Cathrine

    2015-01-01

    In this chapter we analyze aspects of how Danish student nurses acquire technological literacy during their clinical internship at a Danish hospital. The argument is supported by several cases from Esbensen's empirical work. We focus on a Techno-Anthropological study of how student nurses learn to engage in technologically mediated relations, and discuss how the idea of intersecting figured worlds helps to analyze some of the difficulties students experience.

  7. Millennial Filipino Student Engagement Analyzer Using Facial Feature Classification

    Science.gov (United States)

    Manseras, R.; Eugenio, F.; Palaoag, T.

    2018-03-01

    Millennials are on everyone's lips and are a target market of various companies nowadays. In the Philippines, they comprise one third of the total population, and most of them are still in school. Having a good education system is important to prepare this generation for better careers, and a good education system means having quality instruction as one of its input component indicators. In a classroom environment, teachers use facial features to gauge the affect state of the class. Emerging technologies like affective computing are among today's trends for improving the quality of instruction delivery. Affective computing, together with computer vision, can be used to analyze the affect states of students and improve instruction delivery. This paper proposes a system for classifying student engagement using facial features. Identifying the affect state, specifically Millennial Filipino student engagement, is one of the main priorities of every educator, and this directed the authors to develop a tool to assess engagement percentage. A multiple face detection framework using the Face API was employed to detect as many student faces as possible and gauge the current engagement percentage of the whole class. A binary classifier based on a Support Vector Machine (SVM) was set in the conceptual framework of this study. To verify its accuracy, SVM was compared with two of the most widely used binary classifiers. Results show that SVM bested the Random Forest and naive Bayesian algorithms in most of the experiments on the different test datasets.
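
    As an illustration of the binary classification step only (not the study's Face-API pipeline, features, or datasets), here is a toy linear SVM trained by stochastic sub-gradient descent on synthetic two-dimensional "engaged vs. not engaged" features; the data, labels, and hyper-parameters are invented:

```python
import numpy as np

# Toy linear SVM (hinge loss, stochastic sub-gradient descent) on synthetic
# 2-D features; all data and hyper-parameters are invented for illustration.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(2.0, 1.0, (50, 2)),    # "engaged" cluster, label +1
               rng.normal(-2.0, 1.0, (50, 2))])  # "not engaged" cluster, label -1
y = np.array([1] * 50 + [-1] * 50)

w, b, lam, lr = np.zeros(2), 0.0, 0.01, 0.1
for _ in range(200):
    for i in rng.permutation(len(X)):
        if y[i] * (X[i] @ w + b) < 1:    # margin violated: hinge sub-gradient step
            w += lr * (y[i] * X[i] - lam * w)
            b += lr * y[i]
        else:                            # only the L2 regularizer contributes
            w -= lr * lam * w

accuracy = float((np.sign(X @ w + b) == y).mean())
```

    In practice the study would feed facial-feature vectors rather than synthetic points into such a classifier; the comparison with Random Forest and naive Bayes is then a matter of swapping the model while holding the features fixed.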

  8. CSNI specialist meeting on simulators and plant analyzers

    International Nuclear Information System (INIS)

    Miettinen, J.; Holmstroem, H.

    1994-01-01

    The Specialist Meeting on Simulators and Plant Analyzers, held June 9-12, 1992, in Lappeenranta, Finland, was sponsored by the Committee on the Safety of Nuclear Installations (CSNI) of the OECD Nuclear Energy Agency (NEA). It was organized in collaboration with the Technical Research Centre of Finland (VTT) and the Lappeenranta University of Technology (LTKK). All the presented papers were invited and divided into four sessions. In the first session the objectives, requirements and concepts of simulators were discussed against present standards and guidelines. The second session focused on the capabilities of current analytical models. The third session focused on the experiences gained so far from applications. The final, fourth session concentrated on simulators currently under development and on future plans regarding both development and utilization. At the end of the meeting, the topics of the meeting were discussed in a panel discussion. Summaries of the sessions and a shortened version of the panel discussion are included in the proceedings. (orig.)

  9. Analyzing research trends on drug safety using topic modeling.

    Science.gov (United States)

    Zou, Chen

    2018-04-06

    Published drug safety data have evolved in the past decade due to scientific and technological advances in the relevant research fields. Considering that a vast amount of scientific literature has been published in this area, it is not easy to identify the key information. Topic modeling has emerged as a powerful tool to extract meaningful information from a large volume of unstructured texts. Areas covered: We analyzed the titles and abstracts of 4347 articles in four journals dedicated to drug safety from 2007 to 2016. We applied a Latent Dirichlet Allocation (LDA) model to extract 50 main topics, and conducted a trend analysis to explore the temporal popularity of these topics over the years. Expert Opinion/Commentary: We found that 'benefit-risk assessment and communication', 'diabetes' and 'biologic therapy for autoimmune diseases' are the top 3 most published topics. Topics relevant to the use of electronic health records/observational data for safety surveillance are becoming increasingly popular over time. Meanwhile, there is a slight decrease in research on signal detection based on spontaneous reporting, although spontaneous reporting still plays an important role in benefit-risk assessment. Topics related to medical conditions and treatment showed highly dynamic patterns over time.
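
    The LDA workflow the paper applies (50 topics over 4347 titles and abstracts) can be sketched at toy scale with a collapsed Gibbs sampler. The documents, vocabulary, topic count, and hyper-parameters below are invented; the paper's own tooling is not specified here.

```python
import random
from collections import defaultdict

# Toy collapsed-Gibbs LDA: 2 topics over 4 invented mini-"abstracts".
docs = [["drug", "safety", "signal", "reporting"],
        ["diabetes", "therapy", "biologic", "risk"],
        ["signal", "detection", "spontaneous", "reporting"],
        ["biologic", "therapy", "autoimmune", "diabetes"]]
K, alpha, beta = 2, 0.1, 0.01
V = len({w for d in docs for w in d})                  # vocabulary size

random.seed(0)
z = [[random.randrange(K) for _ in d] for d in docs]   # topic of each token
ndk = [[0] * K for _ in docs]                          # doc-topic counts
nkw = [defaultdict(int) for _ in range(K)]             # topic-word counts
nk = [0] * K                                           # tokens per topic
for d, doc in enumerate(docs):
    for i, w in enumerate(doc):
        k = z[d][i]; ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1

for _ in range(200):                                   # Gibbs sweeps
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]; ndk[d][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
            weights = [(ndk[d][j] + alpha) * (nkw[j][w] + beta) / (nk[j] + V * beta)
                       for j in range(K)]
            k = random.choices(range(K), weights=weights)[0]
            z[d][i] = k; ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1

# Per-document topic distributions: the quantity a trend analysis would
# aggregate by publication year.
theta = [[(ndk[d][j] + alpha) / (len(docs[d]) + K * alpha) for j in range(K)]
         for d in range(len(docs))]
```

    At the paper's scale the same loop runs over tens of thousands of tokens with K = 50, and the yearly averages of `theta` give the temporal popularity curves.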

  10. Earthing Technology

    NARCIS (Netherlands)

    Blok, Vincent

    2017-01-01

    In this article, we reflect on the conditions under which new technologies emerge in the Anthropocene and raise the question of how to conceptualize sustainable technologies therein. To this end, we explore an eco-centric approach to technology development, called biomimicry. We discuss opposing

  11. Technology Tiers

    DEFF Research Database (Denmark)

    Karlsson, Christer

    2015-01-01

    A technology tier is a level in a product system: final product, system, subsystem, component, or part. As a concept, it contrasts with traditional “vertical” special technologies (for example, mechanics and electronics) and focuses on “horizontal” feature technologies such as product characteristics...

  12. Analyzing Science Activities in Force and Motion Concepts: A Design of an Immersion Unit

    Science.gov (United States)

    Ayar, Mehmet C.; Aydeniz, Mehmet; Yalvac, Bugrahan

    2015-01-01

    In this paper, we analyze the science activities offered at 7th grade in the Turkish science and technology curriculum along with addressing the curriculum's original intent. We refer to several science education researchers' ideas, including Chinn & Malhotra's (Science Education, 86:175-218, 2002) theoretical framework and Edelson's (1998)…

  13. A Ca/Fe X-ray fluorescence analyzer suitable for the purpose of teaching

    International Nuclear Information System (INIS)

    Qu Guopu; Guo Lanying; Xu Shaoyi

    2003-01-01

    This paper introduces a Ca/Fe XRF analyzer specially designed for teaching related courses on nuclear engineering and nuclear technology at the university. Both the working principle and the construction of the instrument are presented. A comparison between XRF analysis and chemical analysis showed that the two results agreed within an error of ±0.7%

  14. Co-Production of Knowledge in Multi-Stakeholder Processes: Analyzing Joint Experimentation as Social Learning

    Science.gov (United States)

    Akpo, Essegbemon; Crane, Todd A.; Vissoh, Pierre V.; Tossou, Rigobert C.

    2015-01-01

    Purpose: Changing research design and methodologies regarding how researchers articulate with end-users of technology is an important consideration in developing sustainable agricultural practices. This paper analyzes a joint experiment as a multi-stakeholder process and contributes to understand how the way of organizing social learning affects…

  15. Co-production of Knowledge in Multi-stakeholder Processes: Analyzing Joint Experimentation as Social Learning

    NARCIS (Netherlands)

    Akpo, E.; Crane, T.A.; Vissoh, P.; Tossou, C.R.

    2015-01-01

    Purpose: Changing research design and methodologies regarding how researchers articulate with end-users of technology is an important consideration in developing sustainable agricultural practices. This paper analyzes a joint experiment as a multi-stakeholder process and contributes to understand

  16. Macrofoundation for Strategic Technology Management

    DEFF Research Database (Denmark)

    Pedersen, Jørgen Lindgaard

    1995-01-01

    Neoclassical mainstream economics has no perspective on strategic technology management issues. Market failure economics (externalities etc.) can be of some use to analyze problems of relevance in strategic management problems with technology as a part. Environment, inequality and democratic deficits are important problems today...

  17. Technology-Use Mediation

    DEFF Research Database (Denmark)

    Bansler, Jørgen P.; Havn, Erling C.

    2004-01-01

    Implementation of new computer-mediated communication (CMC) systems in organizations is a complex socio-technical endeavour, involving the mutual adaptation of technology and organization over time. Drawing on the analytic concept of sensemaking, this paper provides a theoretical perspective that deepens our understanding of how organizations appropriate new electronic communication media. The paper analyzes how a group of mediators in a large, multinational company adapted a new web-based CMC technology (a virtual workspace) to the local organizational context (and vice versa) by modifying features of the technology, providing ongoing support for users, and promoting appropriate conventions of use. We found that these mediators exerted considerable influence on how the technology was established and used in the organization. The mediators were not neutral facilitators of a well...

  18. Sensemaking technologies

    DEFF Research Database (Denmark)

    Madsen, Charlotte Øland

    Research scope: The scope of the project is to study technological implementation processes by using Weick's sensemaking concept (Weick, 1995). The purpose of using a social constructivist approach to investigate technological implementation processes is to find out how new technologies transform..., Orlikowski 2000). Viewing the use of technology as a process of enactment opens up for investigating the social processes of interpreting new technology into the organisation (Orlikowski 2000). The scope of the PhD project will therefore be to gain a deeper understanding of how the enactment of new... & Brass, 1990; Kling 1991; Orlikowski 2000). It also demonstrates that technology is a flexible variable adapted to the organisation's needs, culture, climate and management philosophy, thus leading to different uses and outcomes of the same technology in different organisations (Barley 1986; 1990...

  19. Technology roadmaps

    Energy Technology Data Exchange (ETDEWEB)

    Pearson, B. [Natural Resources Canada, Ottawa, ON (Canada). CANMET Energy Technology Centre

    2003-07-01

    The purpose of a technology road map is to define the state of a current technology, relevant market issues, and future market needs; to develop a plan that industry can follow to provide these new products and services; and to map technology pathways and performance goals for bringing these products and services to market. The three stages (planning, implementation, and reviewing and updating), benefits, and status of the Clean Coal Technology Roadmap are outlined. Action Plan 2000, a $1.7 million 2000 Climate Change Technology and Innovation Program, which uses the technology roadmapping process, is described. The members of the management steering committee for the Clean Coal Technology Roadmap are listed. A flowsheet showing activities until November 2004, when the final clean coal road map is due, is included.

  20. Appropriate Technology as Indian Technology.

    Science.gov (United States)

    Barry, Tom

    1979-01-01

    Describes the mounting enthusiasm of Indian communities for appropriate technology as an inexpensive means of providing much needed energy and job opportunities. Describes the development of several appropriate technology projects, and the goals and activities of groups involved in utilizing low scale solar technology for economic development on…

  1. Technology '90

    International Nuclear Information System (INIS)

    1991-01-01

    The US Department of Energy (DOE) laboratories have a long history of excellence in performing research and development in a number of areas, including the basic sciences, applied-energy technology, and weapons-related technology. Although technology transfer has always been an element of DOE and laboratory activities, it has received increasing emphasis in recent years as US industrial competitiveness has eroded and efforts have increased to better utilize the research and development resources the laboratories provide. This document, Technology '90, is the latest in a series that is intended to communicate some of the many opportunities available for US industry and universities to work with the DOE and its laboratories in the vital activity of improving technology transfer to meet national needs. Technology '90 is divided into three sections: Overview, Technologies, and Laboratories. The Overview section describes the activities and accomplishments of the DOE research and development program offices. The Technologies section provides descriptions of new technologies developed at the DOE laboratories. The Laboratories section presents information on the missions, programs, and facilities of each laboratory, along with a name and telephone number of a technology transfer contact for additional information. Separate papers were prepared for appropriate sections of this report

  2. Analyzing the influence of ‘knowledge technologies’ in transport planning

    DEFF Research Database (Denmark)

    Gudmundsson, Henrik

    , but models also share with other knowledge technologies the fate of being sometimes used politically, rather than analytically, perhaps even distortively rather than supportively, or not being used at all. The ‘actual’ influence that transport models and other tools exert over planning processes and outcomes... to characterize and analyze the use and influence of ‘knowledge technologies’ in the transport sector. The paper is not primarily about models per se or particular model applications, but will try to situate simulation models in a wider ‘use and influence’ landscape of transport planning knowledge technologies... The aim of the paper is thus to systematize previous research on ‘knowledge use’, discuss its applicability to the context of transport policy and planning technologies, and critically reflect on ways to research the use and influence ‘pathways’ of knowledge technologies - such as models - in this sector...

  3. Soulful Technologies

    DEFF Research Database (Denmark)

    Fausing, Bent

    2010-01-01

    Samsung introduced in 2008 a mobile phone called "Soul", made with a human touch and itself including a "magic touch". Through the analysis of a Nokia mobile phone TV commercial, I want to examine the function and form of digital technology in everyday images. The mobile phone and its digital camera and other devices are depicted by everyday aesthetics as capable of producing a unique human presence and interaction. The medium, the technology, is a necessary helper of this very special and lost humanity. Without the technology, no special humanity, no soul: such is the prophecy. This personification or anthropomorphism is important for the branding of new technology. Technology is seen as creating a techno-transcendence towards a more qualified humanity which is in contact with fundamental human values like intuition, vision, and sensing; all the qualities that technology, industrialization, and rationalization…

  4. Globalization & technology

    DEFF Research Database (Denmark)

    Narula, Rajneesh

    Technology and globalization are interdependent processes. Globalization has a fundamental influence on the creation and diffusion of technology, which, in turn, affects the interdependence of firms and locations. This volume examines the international aspect of this interdependence at two levels… a "systems of innovation" understanding of learning. Narula and Smith reconcile an important paradox. On the one hand, locations and firms are increasingly interdependent through supranational organisations, regional integration, strategic alliances, and the flow of investments, technologies, ideas and people…

  5. Army Technology

    Science.gov (United States)

    2015-02-01

    …that allows them to perform applied research under the Institute for Biotechnology research team… DASA(R&T): Deputy Assistant Secretary of the Army for Research and Technology. Download the magazine, view online, or read each individual story… Army Technology Magazine is an…

  6. Technology alliances

    International Nuclear Information System (INIS)

    Torgerson, D.F.; Boczar, P.G.; Kugler, G.

    1991-10-01

    In the field of nuclear technology, Canada and Korea developed a highly successful relationship that could serve as a model for other high-technology industries. This is particularly significant when one considers the complexity and technical depth required to design, build and operate a nuclear reactor. This paper will outline the overall framework for technology transfer and cooperation between Canada and Korea, and will focus on cooperation in nuclear R and D between the two countries

  7. Technological risks

    International Nuclear Information System (INIS)

    Klinke, A.; Renn, O.

    1998-01-01

    The empirical part about the technological risks deals with different technologies: nuclear energy, early warning systems of nuclear weapons and NBC-weapons, and electromagnetic fields. The potential of damage, the contemporary management strategies and the relevant characteristics will be described for each technology: risks of nuclear energy; risks of early warning systems of nuclear weapons and NBC-weapons; risks of electromagnetic fields. (authors)

  8. Technological risks

    Energy Technology Data Exchange (ETDEWEB)

    Klinke, A.; Renn, O. [Center of Technology Assessment in Baden-Wuerttemberg, Stuttgart (Germany)

    1998-07-01

    The empirical part about the technological risks deals with different technologies: nuclear energy, early warning systems of nuclear weapons and NBC-weapons, and electromagnetic fields. The potential of damage, the contemporary management strategies and the relevant characteristics will be described for each technology: risks of nuclear energy; risks of early warning systems of nuclear weapons and NBC-weapons; risks of electromagnetic fields. (authors)

  9. Semantics based approach for analyzing disease-target associations.

    Science.gov (United States)

    Kaalia, Rama; Ghosh, Indira

    2016-08-01

    A complex disease is caused by heterogeneous biological interactions between genes and their products along with the influence of environmental factors. There have been many attempts for understanding the cause of these diseases using experimental, statistical and computational methods. In the present work the objective is to address the challenge of representation and integration of information from heterogeneous biomedical aspects of a complex disease using semantics based approach. Semantic web technology is used to design Disease Association Ontology (DAO-db) for representation and integration of disease associated information with diabetes as the case study. The functional associations of disease genes are integrated using RDF graphs of DAO-db. Three semantic web based scoring algorithms (PageRank, HITS (Hyperlink Induced Topic Search) and HITS with semantic weights) are used to score the gene nodes on the basis of their functional interactions in the graph. Disease Association Ontology for Diabetes (DAO-db) provides a standard ontology-driven platform for describing genes, proteins, pathways involved in diabetes and for integrating functional associations from various interaction levels (gene-disease, gene-pathway, gene-function, gene-cellular component and protein-protein interactions). An automatic instance loader module is also developed in present work that helps in adding instances to DAO-db on a large scale. Our ontology provides a framework for querying and analyzing the disease associated information in the form of RDF graphs. The above developed methodology is used to predict novel potential targets involved in diabetes disease from the long list of loose (statistically associated) gene-disease associations. Copyright © 2016 Elsevier Inc. All rights reserved.
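    As a hedged illustration of one of the three scoring schemes named in the abstract, the sketch below runs plain PageRank power iteration over a toy directed graph of gene nodes. The graph, damping factor, and function names are illustrative assumptions, not the DAO-db implementation (which also applies HITS and semantically weighted variants).

    ```python
    # Minimal PageRank power iteration; adj[i][j] = 1 if node i links to node j.
    import numpy as np

    def pagerank(adj, damping=0.85, tol=1e-10, max_iter=200):
        """Score the nodes of a directed graph by stationary visit probability."""
        a = np.asarray(adj, dtype=float)
        n = a.shape[0]
        out_deg = a.sum(axis=1, keepdims=True)
        # Row-stochastic transition matrix; dangling nodes spread uniformly.
        p = np.where(out_deg > 0, a / np.where(out_deg == 0, 1, out_deg), 1.0 / n)
        r = np.full(n, 1.0 / n)
        for _ in range(max_iter):
            r_next = (1 - damping) / n + damping * (r @ p)
            if np.abs(r_next - r).sum() < tol:
                return r_next
            r = r_next
        return r

    # Toy functional-association graph over three gene nodes.
    adj = [[0, 1, 1],   # gene0 -> gene1, gene0 -> gene2
           [0, 0, 1],   # gene1 -> gene2
           [1, 0, 0]]   # gene2 -> gene0
    scores = pagerank(adj)
    ```

    In this toy graph, gene2 receives links from both other nodes and therefore ends up with the highest score, which is the intuition behind ranking disease genes by their functional interactions.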

  10. Negative binomial mixed models for analyzing microbiome count data.

    Science.gov (United States)

    Zhang, Xinyan; Mallick, Himel; Tang, Zaixiang; Zhang, Lei; Cui, Xiangqin; Benson, Andrew K; Yi, Nengjun

    2017-01-03

    Recent advances in next-generation sequencing (NGS) technology enable researchers to collect a large volume of metagenomic sequencing data. These data provide valuable resources for investigating interactions between the microbiome and host environmental/clinical factors. In addition to the well-known properties of microbiome count measurements, for example, varied total sequence reads across samples, over-dispersion and zero-inflation, microbiome studies usually collect samples with hierarchical structures, which introduce correlation among the samples and thus further complicate the analysis and interpretation of microbiome count data. In this article, we propose negative binomial mixed models (NBMMs) for detecting the association between the microbiome and host environmental/clinical factors for correlated microbiome count data. Although having not dealt with zero-inflation, the proposed mixed-effects models account for correlation among the samples by incorporating random effects into the commonly used fixed-effects negative binomial model, and can efficiently handle over-dispersion and varying total reads. We have developed a flexible and efficient IWLS (Iterative Weighted Least Squares) algorithm to fit the proposed NBMMs by taking advantage of the standard procedure for fitting the linear mixed models. We evaluate and demonstrate the proposed method via extensive simulation studies and the application to mouse gut microbiome data. The results show that the proposed method has desirable properties and outperform the previously used methods in terms of both empirical power and Type I error. The method has been incorporated into the freely available R package BhGLM ( http://www.ssg.uab.edu/bhglm/ and http://github.com/abbyyan3/BhGLM ), providing a useful tool for analyzing microbiome data.
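    To illustrate the IWLS idea behind the proposed NBMMs, here is a minimal sketch of fixed-effects NB2 (negative binomial) regression with a log link, known dispersion theta, and a total-reads offset. The full method in the paper additionally incorporates random effects via linear-mixed-model machinery; the function name and simulated data below are hypothetical, not the BhGLM implementation.

    ```python
    # Hedged sketch: IWLS for fixed-effects NB2 regression with log link.
    import numpy as np

    def nb_iwls(X, y, theta=2.0, offset=None, n_iter=100, tol=1e-8):
        """Fit log(mu) = X @ beta + offset for y ~ NB2(mu, theta) via IWLS."""
        n, p = X.shape
        off = np.zeros(n) if offset is None else offset
        beta = np.zeros(p)
        for _ in range(n_iter):
            eta = X @ beta + off
            mu = np.exp(eta)
            w = mu / (1.0 + mu / theta)      # Fisher working weights for NB2
            z = eta - off + (y - mu) / mu    # working response
            WX = X * w[:, None]
            beta_new = np.linalg.solve(X.T @ WX, WX.T @ z)
            if np.max(np.abs(beta_new - beta)) < tol:
                return beta_new
            beta = beta_new
        return beta

    # Simulated over-dispersed counts with varying total reads as an offset.
    rng = np.random.default_rng(0)
    n = 2000
    x = rng.normal(size=n)
    X = np.column_stack([np.ones(n), x])
    reads = rng.integers(5_000, 50_000, size=n)          # total reads per sample
    mu = np.exp(1.0 + 0.5 * x) * reads / reads.mean()    # offset-scaled mean
    y = rng.poisson(rng.gamma(2.0, mu / 2.0))            # gamma-Poisson mixture = NB2
    beta_hat = nb_iwls(X, y, theta=2.0, offset=np.log(reads / reads.mean()))
    ```

    The offset term is how varying total sequence reads are handled without rescaling the counts themselves; the recovered coefficients should land close to the simulated values (1.0 and 0.5).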

  11. Niobium technological alternatives

    International Nuclear Information System (INIS)

    Pinatti, D.G.; Dainesi, C.R.

    1981-01-01

    The process-product matrix of Niobium is presented, through which the technological alternatives for Niobium are identified. It is shown that the three axes of Niobium application, steels, superalloys and metallic Niobium have a tendency to be economical by equivalent. The critical points where technological development of Niobium is needed are analyzed and results are presented on the following products: Nb 2 O 5 by volatilization, metalic Niobium, Niobium powder, bars and sheets, NbTi alloy, corrosion resistent Niobium alloys and superconductor cable and wires. (Author) [pt

  12. Chemistry Technology

    Data.gov (United States)

    Federal Laboratory Consortium — Chemistry technology experts at NCATS engage in a variety of innovative translational research activities, including:Design of bioactive small molecules.Development...

  13. Technology Catalogue

    International Nuclear Information System (INIS)

    1994-02-01

    The Department of Energy's Office of Environmental Restoration and Waste Management (EM) is responsible for remediating its contaminated sites and managing its waste inventory in a safe and efficient manner. EM's Office of Technology Development (OTD) supports applied research and demonstration efforts to develop and transfer innovative, cost-effective technologies to its site clean-up and waste management programs within EM's Office of Environmental Restoration and Office of Waste Management. The purpose of the Technology Catalogue is to provide performance data on OTD-developed technologies to scientists and engineers assessing and recommending technical solutions within the Department's clean-up and waste management programs, as well as to industry, other federal and state agencies, and the academic community. OTD's applied research and demonstration activities are conducted in programs referred to as Integrated Demonstrations (IDs) and Integrated Programs (IPs). The IDs test and evaluate systems, consisting of coupled technologies, at specific sites to address generic problems, such as the sensing, treatment, and disposal of buried waste containers. The IPs support applied research activities in specific application areas, such as in situ remediation, efficient separations processes, and site characterization. The Technology Catalogue is a means for communicating the status of the development of these innovative technologies. The FY93 Technology Catalogue features technologies successfully demonstrated in the field through IDs and sufficiently mature to be used in the near-term.
Technologies from the following IDs are featured in the FY93 Technology Catalogue: Buried Waste ID (Idaho National Engineering Laboratory, Idaho); Mixed Waste Landfill ID (Sandia National Laboratories, New Mexico); Underground Storage Tank ID (Hanford, Washington); Volatile organic compound (VOC) Arid ID (Richland, Washington); and VOC Non-Arid ID (Savannah River Site, South Carolina)

  14. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and

  15. Looking for a Framework for Analyzing Eco-innovation Dynamics

    DEFF Research Database (Denmark)

    Yang, Yan

    2011-01-01

    Looking for a Framework for Analyzing Eco-innovation Dynamics: A Triple Helix Model of Innovation Perspective.

  16. Thermally activated technologies: Technology Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2003-05-01

    The purpose of this Technology Roadmap is to outline a set of actions for government and industry to develop thermally activated technologies for converting America’s wasted heat resources into a reservoir of pollution-free energy for electric power, heating, cooling, refrigeration, and humidity control. Fuel flexibility is important. The actions also cover thermally activated technologies that use fossil fuels, biomass, and ultimately hydrogen, along with waste heat.

  17. Technology Exhibition

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1979-09-15

    Linked to the 25th Anniversary celebrations, an exhibition of some of CERN's technological achievements was opened on 22 June. Set up in a new 600 m{sup 2} Exhibition Hall on the CERN site, the exhibition is divided into eight technology areas — magnets, vacuum, computers and data handling, survey and alignment, radiation protection, beam monitoring and handling, detectors, and workshop techniques.

  18. Radiation Technology

    International Nuclear Information System (INIS)

    1990-01-01

    The conference was organized to evaluate the application directions of radiation technology in Vietnam and to utilize the Irradiation Centre in Hanoi with the Co-60 source of 110 kCi. The investigation and study of technico-economic feasibility for technology development to various items of food and non-food objects was reported. (N.H.A)

  19. Technology Transformation

    Science.gov (United States)

    Scott, Heather; McGilll, Toria

    2011-01-01

    Social networking and other technologies, if used judiciously, present the means to integrate 21st century skills into the classroom curriculum. But they also introduce challenges that educators must overcome. Increased concerns about plagiarism and access to technology can test educators' creativity and school resources. Air Academy High School,…

  20. Maritime Technology

    DEFF Research Database (Denmark)

    Sørensen, Herman

    1997-01-01

    Elementary introduction to the subject "Maritime Technology". The contents include drawings, sketches and references in English without any supplementary text.

  1. A Study on the Nuclear Technology Policy

    International Nuclear Information System (INIS)

    Oh, K. B.; Lee, K. S.; Chung, W. S.; Lee, T. J.; Yun, S. W.; Jeong, I.; Lee, J. H.

    2007-02-01

    The objective of the study was to make policy proposals for enhancing the effectiveness and efficiency of national nuclear technology R and D programs. To this end, environmental changes in international nuclear energy policy and trends in nuclear technology development were surveyed and analyzed. The study analyzed trends in nuclear technology policies and developed a nuclear energy R and D innovation strategy from the viewpoint of the changes in the global policy environment associated with nuclear technology development and the development of a national nuclear R and D strategy.

  2. 21 CFR 870.3640 - Indirect pacemaker generator function analyzer.

    Science.gov (United States)

    2010-04-01

    § 870.3640 Indirect pacemaker generator function analyzer. (a) Identification. An indirect pacemaker generator function analyzer is an electrically powered device that is used to determine pacemaker function or…

  3. 21 CFR 870.3630 - Pacemaker generator function analyzer.

    Science.gov (United States)

    2010-04-01

    § 870.3630 Pacemaker generator function analyzer. (a) Identification. A pacemaker generator function analyzer is a device that is…

  4. 21 CFR 868.1700 - Nitrous oxide gas analyzer.

    Science.gov (United States)

    2010-04-01

    § 868.1700 Nitrous oxide gas analyzer. (a) Identification. A nitrous oxide gas analyzer is a device intended to measure the concentration of nitrous oxide…

  5. 21 CFR 862.2500 - Enzyme analyzer for clinical use.

    Science.gov (United States)

    2010-04-01

    § 862.2500 Enzyme analyzer for clinical use. (a) Identification. An enzyme analyzer for clinical use is a device intended to measure enzymes in plasma or serum by nonkinetic or kinetic measurement of…

  6. Hardware Realization of an Ethernet Packet Analyzer Search Engine

    Science.gov (United States)

    2000-06-30

    …specific for the home automation industry. This analyzer will be at the gateway of a network and analyze Ethernet packets as they go by. It will keep… home automation and not the computer network. This system is a stand-alone real-time network analyzer capable of decoding Ethernet protocols. The…

  7. 40 CFR 86.1322-84 - Carbon monoxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    …be used. (2) Zero the carbon monoxide analyzer with either zero-grade air or zero-grade nitrogen. (3)… columns is one form of corrective action which may be taken.) (b) Initial and periodic calibration. Prior… calibrated. (1) Adjust the analyzer to optimize performance. (2) Zero the carbon monoxide analyzer with…

  8. Sensemaking technology

    DEFF Research Database (Denmark)

    Madsen, Charlotte Øland

    Research objective: The object of the LOK research project is to gain a better understanding of the technological strategic processes in organisations by using the concept/metaphor of sensemaking. The project will investigate the technological strategies in organisations in order to gain a deeper understanding of the cognitive competencies and barriers towards implementing new technology in organisations. The research will therefore concentrate on researching the development process in the organisation's perception of the external environmental elements of customers, suppliers, competitors, internal and external technology and legislation, and the internal environmental elements of structure, power relations and political arenas. All of these variables have influence on which/how technologies are implemented, thus creating different outcomes all depending on the social dynamics that are triggered by changes…

  9. Skyline: an open source document editor for creating and analyzing targeted proteomics experiments.

    Science.gov (United States)

    MacLean, Brendan; Tomazela, Daniela M; Shulman, Nicholas; Chambers, Matthew; Finney, Gregory L; Frewen, Barbara; Kern, Randall; Tabb, David L; Liebler, Daniel C; MacCoss, Michael J

    2010-04-01

    Skyline is a Windows client application for targeted proteomics method creation and quantitative data analysis. It is open source and freely available for academic and commercial use. The Skyline user interface simplifies the development of mass spectrometer methods and the analysis of data from targeted proteomics experiments performed using selected reaction monitoring (SRM). Skyline supports using and creating MS/MS spectral libraries from a wide variety of sources to choose SRM filters and verify results based on previously observed ion trap data. Skyline exports transition lists to and imports the native output files from Agilent, Applied Biosystems, Thermo Fisher Scientific and Waters triple quadrupole instruments, seamlessly connecting mass spectrometer output back to the experimental design document. The fast and compact Skyline file format is easily shared, even for experiments requiring many sample injections. A rich array of graphs displays results and provides powerful tools for inspecting data integrity as data are acquired, helping instrument operators to identify problems early. The Skyline dynamic report designer exports tabular data from the Skyline document model for in-depth analysis with common statistical tools. Single-click, self-updating web installation is available at http://proteome.gs.washington.edu/software/skyline. This web site also provides access to instructional videos, a support board, an issues list and a link to the source code project.

  10. A Study on the Revitalizing of technology commercialization in KAERI

    International Nuclear Information System (INIS)

    Choi, J. I.; Jang, S. K.; Hong, G. P.; Lee, E. S.

    2009-02-01

    The TEC training program should be implemented for researchers who want to commercialize their own technologies. Building a creative organizational culture is essential for technology commercialization. The collaboration strategy analyzes how KAERI has been catching up in its technological capabilities in nuclear technology, and what the success factors of KAERI in technology commercialization are.

  11. A Case Study of Technology Choices by High School Students

    Science.gov (United States)

    Owens-Hartman, Amy R.

    2015-01-01

    The purpose of this case study was to examine student technology choices when given the freedom to choose technology devices to complete a project-based learning activity in a content area of study. The study also analyzed factors affecting technology choice as well as how technology proficiency scores aligned to technology choices. Patterns and…

  12. The moral relevance of technological artifacts

    NARCIS (Netherlands)

    Verbeek, Peter P.C.C.; Sollie, P.; Düwell, M.

    2009-01-01

    This chapter explores the ethics of technology in a double sense: it lays bare points of application for ethical reflection about technology development, and it analyzes the ethical dimensions of technology itself. First, the chapter addresses the question of how to conceptualize and assess the

  13. Technology and skill demand in Mexico

    OpenAIRE

    Lopez-Acevedo, Gladys

    2002-01-01

    The author investigates the effects of technology on the employment and wages of differently skilled Mexican manufacturing workers using firm panel data from 1992-99. She analyzes the relationship between technology and skill demand. Findings support the skill-biased technical change hypothesis. She then examines the temporal relationship of technology adoption to firm productivity and wor...

  14. Real-time analytics techniques to analyze and visualize streaming data

    CERN Document Server

    Ellis, Byron

    2014-01-01

    Construct a robust end-to-end solution for analyzing and visualizing streaming data Real-time analytics is the hottest topic in data analytics today. In Real-Time Analytics: Techniques to Analyze and Visualize Streaming Data, expert Byron Ellis teaches data analysts technologies to build an effective real-time analytics platform. This platform can then be used to make sense of the constantly changing data that is beginning to outpace traditional batch-based analysis platforms. The author is among a very few leading experts in the field. He has a prestigious background in research, development,

  15. Ergonomics technology

    Science.gov (United States)

    Jones, W. L.

    1977-01-01

    Major areas of research and development in ergonomics technology for space environments are discussed. Attention is given to possible applications of the technology developed by NASA in industrial settings. A group of mass spectrometers for gas analysis capable of fully automatic operation has been developed for atmosphere control on spacecraft; a version for industrial use has been constructed. Advances have been made in personal cooling technology, remote monitoring of medical information, and aerosol particle control. Experience gained by NASA during the design and development of portable life support units has recently been applied to improve breathing equipment used by fire fighters.

  16. Scalable Parallel Methods for Analyzing Metagenomics Data at Extreme Scale

    International Nuclear Information System (INIS)

    Daily, Jeffrey A.

    2015-01-01

    The field of bioinformatics and computational biology is currently experiencing a data revolution. The exciting prospect of making fundamental biological discoveries is fueling the rapid development and deployment of numerous cost-effective, high-throughput next-generation sequencing technologies. The result is that the DNA and protein sequence repositories are being bombarded with new sequence information. Databases are continuing to report a Moore's law-like growth trajectory in their database sizes, roughly doubling every 18 months. In what seems to be a paradigm-shift, individual projects are now capable of generating billions of raw sequence data that need to be analyzed in the presence of already annotated sequence information. While it is clear that data-driven methods, such as sequencing homology detection, are becoming the mainstay in the field of computational life sciences, the algorithmic advancements essential for implementing complex data analytics at scale have mostly lagged behind. Sequence homology detection is central to a number of bioinformatics applications including genome sequencing and protein family characterization. Given millions of sequences, the goal is to identify all pairs of sequences that are highly similar (or 'homologous') on the basis of alignment criteria. While there are optimal alignment algorithms to compute pairwise homology, their deployment for large-scale is currently not feasible; instead, heuristic methods are used at the expense of quality. In this dissertation, we present the design and evaluation of a parallel implementation for conducting optimal homology detection on distributed memory supercomputers. Our approach uses a combination of techniques from asynchronous load balancing (viz. work stealing, dynamic task counters), data replication, and exact-matching filters to achieve homology detection at scale. Results for a collection of 2.56M sequences show parallel efficiencies of ~75-100% on up to 8K cores.
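    The optimal pairwise alignment the abstract refers to can be sketched with a minimal Smith-Waterman local-alignment scorer. Scoring parameters and the function name are illustrative assumptions; the dissertation's contribution is the distributed-memory parallelization wrapped around many such pairwise computations, not this scorer itself.

    ```python
    # Hedged sketch: Smith-Waterman local alignment score, linear gap penalty,
    # O(|a|*|b|) time with O(|b|) memory (two dynamic-programming rows).
    def smith_waterman_score(a: str, b: str, match=2, mismatch=-1, gap=-2) -> int:
        """Best local alignment score between sequences a and b."""
        prev = [0] * (len(b) + 1)
        best = 0
        for ca in a:
            cur = [0]
            for j, cb in enumerate(b, start=1):
                s = max(0,
                        prev[j - 1] + (match if ca == cb else mismatch),
                        prev[j] + gap,      # gap in b
                        cur[j - 1] + gap)   # gap in a
                cur.append(s)
                best = max(best, s)
            prev = cur
        return best

    # Two sequences sharing the exact substring "TTAC": 4 matches * 2 = 8.
    print(smith_waterman_score("GATTACA", "TTAC"))  # → 8
    ```

    Because the score resets at zero, the recurrence finds the best-scoring local region rather than forcing an end-to-end alignment, which is why this formulation suits homology detection between partially similar sequences.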

  17. Scalable Parallel Methods for Analyzing Metagenomics Data at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Daily, Jeffrey A. [Washington State Univ., Pullman, WA (United States)

    2015-05-01

    The field of bioinformatics and computational biology is currently experiencing a data revolution. The exciting prospect of making fundamental biological discoveries is fueling the rapid development and deployment of numerous cost-effective, high-throughput next-generation sequencing technologies. The result is that the DNA and protein sequence repositories are being bombarded with new sequence information. Databases are continuing to report a Moore’s law-like growth trajectory in their database sizes, roughly doubling every 18 months. In what seems to be a paradigm-shift, individual projects are now capable of generating billions of raw sequence data that need to be analyzed in the presence of already annotated sequence information. While it is clear that data-driven methods, such as sequencing homology detection, are becoming the mainstay in the field of computational life sciences, the algorithmic advancements essential for implementing complex data analytics at scale have mostly lagged behind. Sequence homology detection is central to a number of bioinformatics applications including genome sequencing and protein family characterization. Given millions of sequences, the goal is to identify all pairs of sequences that are highly similar (or “homologous”) on the basis of alignment criteria. While there are optimal alignment algorithms to compute pairwise homology, their deployment for large-scale is currently not feasible; instead, heuristic methods are used at the expense of quality. In this dissertation, we present the design and evaluation of a parallel implementation for conducting optimal homology detection on distributed memory supercomputers. Our approach uses a combination of techniques from asynchronous load balancing (viz. work stealing, dynamic task counters), data replication, and exact-matching filters to achieve homology detection at scale. Results for a collection of 2.56M sequences show parallel efficiencies of ~75-100% on up to 8K cores

  18. Using the UTAUT Model to Analyze Students' ICT Adoption

    Science.gov (United States)

    Attuquayefio, Samuel NiiBoi; Addo, Hillar

    2014-01-01

    This paper seeks to provide further understanding of issues surrounding acceptance of information and communication technology (ICT) by students of tertiary institutions. The Unified Theory of Acceptance and Use of Technology (UTAUT) model (Venkatesh et al., 2003) was employed by the researchers to determine the strength of predictors for students'…

  19. Technology Innovation

    Science.gov (United States)

    EPA produces innovative technologies and facilitates their creation in line with the Agency mission to create products such as the stormwater calculator, remote sensing, innovation clusters, and low-cost air sensors.

  20. Technology | FNLCR

    Science.gov (United States)

    The Frederick National Laboratory develops and applies advanced, next-generation technologies to solve basic and applied problems in the biomedical sciences, and serves as a national resource of shared high-tech facilities.

  1. Plasma technology

    International Nuclear Information System (INIS)

    Drouet, M.G.

    1984-03-01

    IREQ was contracted by the Canadian Electrical Association to review plasma technology and assess the potential for application of this technology in Canada. A team of experts in the various aspects of this technology was assembled and each team member was asked to contribute to this report on the applications of plasma pertinent to his or her particular field of expertise. The following areas were examined in detail: iron, steel and strategic-metals production; surface treatment by spraying; welding and cutting; chemical processing; drying; and low-temperature treatment. A large market for the penetration of electricity has been identified. To build up confidence in the technology, support should be provided for selected R and D projects, plasma torch demonstrations at full power, and large-scale plasma process testing

  2. Exploration technology

    Energy Technology Data Exchange (ETDEWEB)

    Roennevik, H.C. [Saga Petroleum A/S, Forus (Norway)

    1996-12-31

    The paper evaluates exploration technology. Topics discussed are: Visions; the subsurface challenge; the creative tension; the exploration process; seismic; geology; organic geochemistry; seismic resolution; integration; drilling; value creation. 4 refs., 22 figs.

  3. Analyzing Real-World Light Duty Vehicle Efficiency Benefits

    Energy Technology Data Exchange (ETDEWEB)

    Gonder, Jeffrey; Wood, Eric; Chaney, Larry; Holden, Jacob; Jeffers, Matthew; Wang, Lijuan

    2016-06-08

    Off-cycle technologies represent an important pathway to achieve real-world fuel savings, through which OEMs can potentially receive credit toward CAFE compliance. DOE national labs such as NREL are well positioned to provide objective input on these technologies using large, national data sets in conjunction with OEM- and technology-specific testing. This project demonstrates an approach that combines vehicle testing (dynamometer and on-road) with powertrain modeling and simulation over large, representative datasets to quantify real-world fuel economy. The approach can be applied to specific off-cycle technologies (engine encapsulation, start/stop, connected vehicle, etc.) in A/B comparisons to support calculation of realistic real-world impacts. Future work will focus on testing-based A/B technology comparisons that demonstrate the significance of this approach.

  4. Development of a Microfluidic Platform to Analyze Evolution of Programmed Bacterial Death

    Science.gov (United States)

    2015-12-20

    Final Report: Development of a Microfluidic Platform to Analyze Evolution of Programmed Bacterial Death. ...droplet-based microfluidic technology to generate population 'bottlenecks'. This platform will serve as a critical foundation for our long-term goal to... Keywords: microfluidics, systems biology.

  5. Analyzing internal barriers for automotive parts remanufacturers in China using grey-DEMATEL approach

    DEFF Research Database (Denmark)

    Xia, Xiqiang; Govindan, Kannan; Zhu, Qinghua

    2015-01-01

    Automotive industries have attracted attention from international sectors recently. This attention to the industry results in many innovative technologies being integrated in these manufacturing arenas. In developing countries such as the BRIC (Brazil, Russia, India, and China) countries... This study analyzes internal barriers met by automotive parts remanufacturers and evaluates causal barriers by a proposed model framework. This objective is illustrated by employing the Grey Decision Making Trial and Evaluation Laboratory (DEMATEL) approach. By virtue of these findings, remanufacturers can eradicate...

  6. Technological risk

    Energy Technology Data Exchange (ETDEWEB)

    Dierkes, M; Coppock, R; Edwards, S

    1980-01-01

    The book begins with brief statements from representatives of political organizations. Part II presents an overview of the discussion about the control and management of technological progress. Parts III and IV discuss important elements in citizens' perception of technological risks and the development of consensus on how to deal with them. In Part V practical problems in the application of risk assessment and management, and in Part VI additional points are summarized.

  7. Lasers technology

    International Nuclear Information System (INIS)

    2014-01-01

    The Laser Technology Program of IPEN is developed by the Center for Lasers and Applications (CLA) and is committed to the development of new lasers based on the research of new optical materials and new resonator technologies. Laser applications and research occur within several areas such as Nuclear, Medicine, Dentistry, Industry, Environment and Advanced Research. Additional goals of the Program are human resource development and innovation, in association with Brazilian Universities and commercial partners

  8. Technological risk

    International Nuclear Information System (INIS)

    Dierkes, M.; Coppock, R.; Edwards, S.

    1980-01-01

    The book begins with brief statements from representatives of political organizations. Part II presents an overview of the discussion about the control and management of technological progress. Parts III and IV discuss important elements in citizens' perception of technological risks and the development of consensus on how to deal with them. In Part V practical problems in the application of risk assessment and management, and in Part VI additional points are summarized. (DG)

  9. Cognitive technologies

    CERN Document Server

    Mello, Alan; Figueiredo, Fabrício; Figueiredo, Rafael

    2017-01-01

    This book focuses on the next generation optical networks as well as mobile communication technologies. The reader will find chapters on Cognitive Optical Network, 5G Cognitive Wireless, LTE, Data Analysis and Natural Language Processing. It also presents a comprehensive view of the enhancements and requirements foreseen for Machine Type Communication. Moreover, some data analysis techniques and Brazilian Portuguese natural language processing technologies are also described here. .

  10. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, comprising among others precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore, (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
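
    Several of the verification items above reduce to simple calculations; carryover, for instance, is commonly estimated from three consecutive measurements of a high sample followed by three of a low sample. A minimal sketch of that check (the function name and sample values are illustrative, not from the paper):

```python
def carryover_percent(high, low):
    """Estimate percent carryover from three consecutive measurements of
    a high sample (h1, h2, h3) followed by three of a low sample
    (l1, l2, l3), using the common formula (l1 - l3) / (h3 - l3) * 100."""
    h1, h2, h3 = high
    l1, l2, l3 = low
    return (l1 - l3) / (h3 - l3) * 100.0

# Example with invented WBC counts (x10^9/L) for a high and a low sample:
print(round(carryover_percent((98.2, 98.5, 98.1), (2.9, 2.5, 2.4)), 3))
```

    A verification plan would compare the resulting figure against the manufacturer's claim or the laboratory's own acceptance limit.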

  11. Development of a Telemetric, Miniaturized Electrochemical Amperometric Analyzer

    OpenAIRE

    Jaehyo Jung; Jihoon Lee; Siho Shin; Youn Tae Kim

    2017-01-01

    In this research, we developed a portable, three-electrode electrochemical amperometric analyzer that can transmit data to a PC or a tablet via Bluetooth communication. We performed experiments using an indium tin oxide (ITO) glass electrode to confirm the performance and reliability of the analyzer. The proposed analyzer uses a current-to-voltage (I/V) converter to convert the current generated by the reduction-oxidation (redox) reaction of the buffer solution to a voltage signal. This signa...
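
    In the ideal case, the I/V conversion described is just multiplication of the redox current by the feedback resistance of the transimpedance stage. A hedged sketch of that relationship (component values are illustrative, not taken from the article):

```python
def iv_converter_output(current_a, feedback_ohms):
    """Ideal transimpedance (I/V) stage: V_out = -I * R_f.
    The sign inversion follows the usual inverting op-amp topology."""
    return -current_a * feedback_ohms

# A 1 uA redox current into a 100 kOhm feedback resistor gives about -0.1 V
print(iv_converter_output(1e-6, 100e3))
```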

  12. Faraday cup for analyzing multi-ion plasma

    International Nuclear Information System (INIS)

    Fujita, Takao

    1987-01-01

    A compact and convenient ion analyzer (a kind of a Faraday cup) is developed in order to analyze weakly ionized multi-ion plasmas. This Faraday cup consists of three mesh electrodes and a movable ion collector. With a negative gate pulse superimposed on the ion retarding bias, ions are analyzed by means of time-of-flight. The identification of ion species and measurements of ion density and ion temperature are studied. (author)
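
    Identification of ion species by time-of-flight follows from equating the acceleration energy qV with the kinetic energy mv²/2 over the drift length. A small illustrative calculation (the geometry and bias values are invented, not taken from the paper):

```python
E_CHARGE = 1.602e-19  # elementary charge, C
AMU = 1.661e-27       # atomic mass unit, kg

def ion_mass_amu(flight_time_s, drift_length_m, accel_volts):
    """Mass of a singly charged ion from its time of flight:
    qV = m v^2 / 2 with v = L / t  =>  m = 2 q V (t / L)^2."""
    v = drift_length_m / flight_time_s
    return 2 * E_CHARGE * accel_volts / v**2 / AMU

# ~10 us of flight over 0.5 m after 20 V of acceleration
print(round(ion_mass_amu(10e-6, 0.5, 20.0), 1))  # prints 1.5 (about a proton mass)
```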

  13. L G-2 Scintrex manual.Fluorescence analyzer

    International Nuclear Information System (INIS)

    Pirelli, H.

    1987-01-01

    The Scintrex Fluorescence Analyzer LG-2 selectively detects the presence of certain fluorescent minerals through induced UV photoluminescence and provides quantitative information on their distribution.

  14. Biomedical sensing analyzer (BSA) for mobile-health (mHealth)-LTE.

    Science.gov (United States)

    Adibi, Sasan

    2014-01-01

    The rapid expansion of mobile-based systems, the capabilities of smartphone devices, as well as the radio access and cellular network technologies are the wind beneath the wing of mobile health (mHealth). In this paper, the concept of biomedical sensing analyzer (BSA) is presented, which is a novel framework, devised for sensor-based mHealth applications. The BSA is capable of formulating the Quality of Service (QoS) measurements in an end-to-end sense, covering the entire communication path (wearable sensors, link-technology, smartphone, cell-towers, mobile-cloud, and the end-users). The characterization and formulation of BSA depend on a number of factors, including the deployment of application-specific biomedical sensors, generic link-technologies, collection, aggregation, and prioritization of mHealth data, cellular network based on the Long-Term Evolution (LTE) access technology, and extensive multidimensional delay analyses. The results are studied and analyzed in a LabView 8.5 programming environment.
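
    For delay, the end-to-end QoS formulation described amounts to aggregating per-segment contributions along the path from sensor to end user. A toy sketch of that aggregation (segment names and values are illustrative, not from the paper):

```python
def end_to_end_delay_ms(segments):
    """Total one-way delay as the sum of per-segment delays (ms)."""
    return sum(segments.values())

path = {
    "wearable_sensor": 5.0,
    "link_technology": 12.0,  # e.g. the short-range hop to the smartphone
    "smartphone": 8.0,
    "lte_access": 45.0,
    "mobile_cloud": 20.0,
}
print(end_to_end_delay_ms(path))  # prints 90.0
```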

  15. Technology cycles and technology revolutions

    Energy Technology Data Exchange (ETDEWEB)

    Paganetto, Luigi; Scandizzo, Pasquale Lucio

    2010-09-15

    Technological cycles have been characterized as the basis of long and continuous periods of economic growth through sustained changes in total factor productivity. While this hypothesis is in part consistent with several theories of growth, the sheer magnitude and length of the economic revolutions experienced by humankind suggest that more attention should be given to the origin of major technological and economic changes, with reference to one crucial question: the role of the production and use of energy in economic development.

  16. Polysomnographic Technology

    Science.gov (United States)

    ... problems during sleep and helping individuals develop good sleep habits. Job Description The technologist gathers and analyzes patient ... curriculum of an accredited program focuses on correct performance of ... of sleep disorders. Through lecture and observation they gain experience ...

  17. 40 CFR 91.314 - Analyzer accuracy and specifications.

    Science.gov (United States)

    2010-07-01

    .... (3) Zero drift. The analyzer zero-response drift during a one-hour period must be less than two percent of full-scale chart deflection on the lowest range used. The zero-response is defined as the mean... calibration or span gas. (2) Noise. The analyzer peak-to-peak response to zero and calibration or span gases...

  18. A data mining approach to analyze occupant behavior motivation

    NARCIS (Netherlands)

    Ren, X.; Zhao, Y.; Zeiler, W.; Boxem, G.; Li, T.

    2017-01-01

    Occupants' behavior can have a significant impact on the performance of the built environment. Methods of analyzing people's behavior have not been adequately developed; traditional methods such as surveys or interviews are not efficient. This study proposed a data-driven method to analyze the

  19. 40 CFR 86.327-79 - Quench checks; NOX analyzer.

    Science.gov (United States)

    2010-07-01

    ... any flow rate into the reaction chamber. This includes, but is not limited to, sample capillary, ozone... Quench checks; NOX analyzer. (a) Perform the reaction chamber quench check for each model of high vacuum reaction chamber analyzer prior to initial use. (b) Perform the reaction chamber quench check for each new...

  20. Development of parallel/serial program analyzing tool

    International Nuclear Information System (INIS)

    Watanabe, Hiroshi; Nagao, Saichi; Takigawa, Yoshio; Kumakura, Toshimasa

    1999-03-01

    Japan Atomic Energy Research Institute has been developing 'KMtool', a parallel/serial program analyzing tool, in order to promote the parallelization of science and engineering computation programs. KMtool analyzes the performance of programs written in FORTRAN77 and MPI, and it reduces the effort required for parallelization. This paper describes the development purpose, design, utilization, and evaluation of KMtool. (author)

  1. Control of a pulse height analyzer using an RDX workstation

    International Nuclear Information System (INIS)

    Montelongo, S.; Hunt, D.N.

    1984-12-01

    The Nuclear Chemistry Division of Lawrence Livermore National laboratory is in the midst of upgrading its radiation counting facilities to automate data acquisition and quality control. This upgrade requires control of a pulse height analyzer (PHA) from an interactive LSI-11/23 workstation running RSX-11M. The PHA is a micro-computer based multichannel analyzer system providing data acquisition, storage, display, manipulation and input/output from up to four independent acquisition interfaces. Control of the analyzer includes reading and writing energy spectra, issuing commands, and servicing device interrupts. The analyzer communicates to the host system over a 9600-baud serial line using the Digital Data Communications link level Protocol (DDCMP). We relieved the RSX workstation CPU from the DDCMP overhead by implementing a DEC compatible in-house designed DMA serial line board (the ISL-11) to communicate with the analyzer. An RSX I/O device driver was written to complete the path between the analyzer and the RSX system by providing the link between the communication board and an application task. The I/O driver is written to handle several ISL-11 cards all operating in parallel thus providing support for control of multiple analyzers from a single workstation. The RSX device driver, its design and use by application code controlling the analyzer, and its operating environment will be discussed
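
    DDCMP frames are protected by CRC-16 checksums (polynomial x^16 + x^15 + x^2 + 1), one over the header and one over the data. The following sketch shows only that checksum step, as an illustration rather than the lab's actual driver code:

```python
def crc16(data: bytes) -> int:
    """Bitwise CRC-16 with the reflected polynomial 0xA001
    (x^16 + x^15 + x^2 + 1) and initial value 0."""
    crc = 0x0000
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc

print(hex(crc16(b"123456789")))  # prints 0xbb3d, the CRC-16/ARC check value
```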

  2. Analyzing FCS Professionals in Higher Education: A Case Study

    Science.gov (United States)

    Hall, Scott S.; Harden, Amy; Pucciarelli, Deanna L.

    2016-01-01

    A national study of family and consumer sciences (FCS) professionals in higher education was analyzed as a case study to illustrate procedures useful for investigating issues related to FCS. The authors analyzed response rates of more than 1,900 FCS faculty and administrators by comparing those invited to participate and the 345 individuals who…

  3. A multichannel analyzer computer system for simultaneously measuring 64 spectra

    International Nuclear Information System (INIS)

    Jin Yuheng; Wan Yuqing; Zhang Jiahong; Li Li; Chen Guozhu

    2000-01-01

    The author introduces a multichannel analyzer computer system for simultaneously measuring 64 spectra with 64 coded independent inputs. The system is developed for a double-chopper neutron scattering time-of-flight spectrometer. The system structure, coding method, operating principle, and performance are presented. The system can also be used for other nuclear physics experiments that need a multichannel analyzer with independent coded inputs.
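
    The routing idea behind such coded inputs can be pictured as follows: each event carries an input code (0-63) alongside its pulse height and is histogrammed into the spectrum for that input. The field widths and bin count below are illustrative, not taken from the paper:

```python
N_INPUTS, N_BINS = 64, 1024

def accumulate(events):
    """events: iterable of (input_code, pulse_height) pairs.
    Returns 64 independent spectra as lists of bin counts."""
    spectra = [[0] * N_BINS for _ in range(N_INPUTS)]
    for code, height in events:
        spectra[code & 0x3F][min(height, N_BINS - 1)] += 1
    return spectra

spectra = accumulate([(0, 100), (0, 100), (63, 512)])
print(spectra[0][100], spectra[63][512])  # prints 2 1
```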

  4. A Morphological Analyzer for Vocalized or Not Vocalized Arabic Language

    Science.gov (United States)

    El Amine Abderrahim, Med; Breksi Reguig, Fethi

    The aim of this research has been to show the realization of a morphological analyzer of the Arabic language (vocalized or not vocalized). This analyzer is based upon our object model for Arabic Natural Language Processing (NLP) and can be exploited by NLP applications such as machine translation, orthographical correction, and information retrieval.

  5. Analyzing Population Genetics Data: A Comparison of the Software

    Science.gov (United States)

    Choosing a software program for analyzing population genetic data can be a challenge without prior knowledge of the methods used by each program. There are numerous web sites listing programs by type of data analyzed, type of analyses performed, or other criteria. Even with programs categorized in ...

  6. 40 CFR 90.318 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer calibration. 90.318 Section 90.318 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Emission Test Equipment Provisions § 90.318 Oxides of nitrogen analyzer calibration. (a) Calibrate the...

  7. 40 CFR 91.318 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer calibration. 91.318 Section 91.318 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Provisions § 91.318 Oxides of nitrogen analyzer calibration. (a) Calibrate the chemiluminescent oxides of...

  8. 40 CFR 89.321 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer calibration. 89.321 Section 89.321 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Test Equipment Provisions § 89.321 Oxides of nitrogen analyzer calibration. (a) The chemiluminescent...

  9. Optimally analyzing and implementing of bolt fittings in steel structure based on ANSYS

    Science.gov (United States)

    Han, Na; Song, Shuangyang; Cui, Yan; Wu, Yongchun

    2018-03-01

    ANSYS simulation software, owing to its excellent performance, has become an outstanding member of the Computer-Aided Engineering (CAE) family; it is committed to innovation in engineering simulation that helps users shorten the design process. First, a typical procedure for implementing CAE was designed, and a framework for structural numerical analysis based on ANSYS technology was proposed. Then, an optimal analysis and implementation of bolt fittings in a beam-column joint of a steel structure was carried out in ANSYS, displaying the cloud charts of XY-shear stress, YZ-shear stress, and the Y component of stress. Finally, the ANSYS simulation results were compared with results measured by experiment. The result of the ANSYS simulation and analysis is reliable, efficient, and optimal. Through the above process, a numerical simulation and analysis model of structural performance was explored for engineering enterprises' practice.
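
    A simple hand calculation is often kept alongside such simulations as a sanity check on the shear-stress contours, e.g. the average shear stress in the bolt group, tau = F / (n A). The values below are invented for illustration and are not from the paper:

```python
import math

def bolt_shear_stress_mpa(force_kn, n_bolts, bolt_dia_mm):
    """Average shear stress tau = F / (n * A) across n bolts in single
    shear; N/mm^2 is numerically equal to MPa."""
    area_mm2 = math.pi * (bolt_dia_mm / 2.0) ** 2
    return force_kn * 1e3 / (n_bolts * area_mm2)

# 100 kN shared by four 20 mm bolts
print(round(bolt_shear_stress_mpa(100.0, 4, 20.0), 1))  # prints 79.6
```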

  10. THE EXPERIENCE OF COMPARISON OF STATIC SECURITY CODE ANALYZERS

    Directory of Open Access Journals (Sweden)

    Alexey Markov

    2015-09-01

    This work presents a methodological approach to the comparison of static security code analyzers. It substantiates the comparison of static analyzers on the efficiency and functionality indicators stipulated in international regulatory documents. The test data for assessing static analyzer efficiency consists of synthetic sets of open-source software containing vulnerabilities. We substantiated criteria for quality assessment of static security code analyzers subject to the standards NIST SP 500-268 and SATEC. We carried out experiments that allowed us to assess a number of Russian proprietary software tools and open-source tools. We came to the conclusion that it is of paramount importance to develop a Russian regulatory framework for testing software security (firstly, for controlling undocumented features and evaluating the quality of static security code analyzers).
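
    When a static analyzer is run against such a synthetic vulnerability suite, the efficiency indicators usually reduce to precision, recall, and their harmonic mean over the known defect locations. A sketch of that scoring (the counts are invented for illustration):

```python
def analyzer_scores(true_pos, false_pos, false_neg):
    """Precision, recall, and F1 from the confusion counts obtained by
    running an analyzer over a test suite with known vulnerabilities."""
    precision = true_pos / (true_pos + false_pos)
    recall = true_pos / (true_pos + false_neg)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

p, r, f1 = analyzer_scores(true_pos=80, false_pos=20, false_neg=40)
print(round(p, 2), round(r, 2), round(f1, 2))  # prints 0.8 0.67 0.73
```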

  11. Radiation technology and feed production

    International Nuclear Information System (INIS)

    Ershov, B.G.

    1986-01-01

    The use of radiation technology to prepare feeds and feed additives for cattle from non-feed vegetable blends is considered. The physicochemical foundations of radiation-chemical processes and the possibilities of using various radiation devices are given. Data on practical realization of the technology are presented, and the prospects of its introduction to meet the goals of the USSR program on feed production are analyzed.

  12. The new technology of dividing wall in stope

    International Nuclear Information System (INIS)

    Li Zhiguo

    1999-01-01

    The author analyzes the advantages and disadvantages of ordinary methods of separating stopes in deep mines, points out the main problems of the original projects for constructing dividing walls, presents a new technology that can overcome these problems, predicts the effect of adopting the new technology, and analyzes the feasibility of applying it to construct mortar pads in stopes and dyke-dams.

  13. Modeling of 1-D Nanowires and analyzing their Hydrogen and ...

    Indian Academy of Sciences (India)

    SUDIP PAN

    Department of Chemistry and Center for Theoretical Studies, Indian Institute of Technology Kharagpur. ...is due to the electron-deficient nature of boron. In the ... molecular form. Numerous varieties of molecular ...

  14. Poster session in instructional technology course

    Science.gov (United States)

    Diniaty, Artina; Fauzi'ah, Lina; Wulan Febriana, Beta; Arlianty, Widinda Normalia

    2017-12-01

    The instructional technology course must be studied by students so that they can 1) understand the role of technology in learning, 2) analyze the advantages and disadvantages of using technology in teaching, and 3) perform with technology in teaching. A poster session in the instructional technology course was held to 1) enhance students' interest in the course and 2) develop students' creativity. The steps of this research included planning, implementation, and evaluation. The results showed that students' responses to the poster session in the instructional technology course were good.

  15. Smart technology

    International Nuclear Information System (INIS)

    Bruckner, D.G.

    1991-01-01

    The success of smart technology in the pursuit of the Gulf War has accentuated the awareness of how the Safeguards and Security disciplines are changing in response to new weaponry. Throughout the Department of Energy Integrated Complex (IC) Safeguards and Security efforts such as: Protection Programs Operations; Materials, Controls and Accountability; Information Security; Computer Security; Operational Security; Personnel Security, Safeguards and/or Security (S and S) surveys, and Inspections and Evaluations are undergoing a reassessment and refocusing. Some of this is in response to such things as the DOE initiated Freeze Report and the Drell Report. An important aspect is also technological, adjusting the way business is done in light of the weapons, tools and processes/procedures becoming available. This paper addresses the S and S issues with the promise of using smart technology to develop new approaches and equipment across the IC

  16. Seafood Technology

    DEFF Research Database (Denmark)

    Børresen, Torger

    This presentation completes the overall picture of this conference, spanning fisheries and aquaculture, blue biotech, and bioconservation, by considering the optimal processing technology for marine resources from the raw material until the seafood reaches the consumer's plate. The situation today...... must be performed such that total traceability and authenticity of the final products can be presented on demand. The most important aspects to be considered within seafood technology today are safety, healthy products, and high eating quality. Safety can be divided into microbiological safety...... and not presenting any safety risk per se. Seafood is healthy due to the omega-3 fatty acids and the nutritional value of vitamins, peptides, and proteins. The processing technology must, however, be performed such that these valuable features are not lost during production. The same applies to the eating quality. Any

  17. Persuasive Technology

    DEFF Research Database (Denmark)

    This book constitutes the proceedings of the 5th International Conference on Persuasive Technology, PERSUASIVE 2010, held in Copenhagen, Denmark, in June 2010. The 25 papers presented were carefully reviewed and selected from 80 submissions. In addition, three keynote papers are included in this volume. The topics covered are emotions and user experience, ambient persuasive systems, persuasive design, persuasion profiles, designing for health, psychology of persuasion, embodied and conversational agents, economic incentives, and future directions for persuasive technology.

  18. Technology Management

    DEFF Research Database (Denmark)

    Pilkington, Alan

    2014-01-01

    This paper reports a bibliometric analysis (co-citation network analysis) of 10 journals in the management of technology (MOT) field. As well as introducing various bibliometric ideas, network analysis tools identify and explore the concepts covered by the field and their inter-relationships. Specific results from different levels of analysis show the different dimensions of technology management: • Co-word terms identify themes • Journal co-citation network: linking to other disciplines • Co-citation networks show concentrations of themes. The analysis shows that MOT has a bridging role
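
    The co-citation counting that underlies such a network can be stated compactly: two works are linked whenever they appear together in the same reference list, and the link weight is the number of such co-occurrences. A minimal sketch (the data is invented for illustration):

```python
from itertools import combinations
from collections import Counter

def cocitation_counts(reference_lists):
    """Count how often each pair of cited works appears together in a
    reference list; pairs are stored in sorted order."""
    counts = Counter()
    for refs in reference_lists:
        for pair in combinations(sorted(set(refs)), 2):
            counts[pair] += 1
    return counts

counts = cocitation_counts([["A", "B", "C"], ["A", "B"], ["B", "C"]])
print(counts[("A", "B")], counts[("B", "C")], counts[("A", "C")])  # prints 2 2 1
```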

  19. Superconducting technology

    International Nuclear Information System (INIS)

    2010-01-01

    Superconductivity has a long history of about 100 years. Over the past 50 years, progress in superconducting materials has been mainly in metallic superconductors, such as Nb, Nb-Ti and Nb3Sn, resulting in the creation of various application fields based on superconducting technologies. High-Tc superconductors, the first of which was discovered in 1986, have been changing the future vision of superconducting technology through the development of new application fields such as power cables. On the basis of these trends, the future prospects of superconducting technology up to 2040 are discussed in this article from the viewpoints of material development and the applications of superconducting wires and electronic devices. (author)

  20. Technology Transfer

    Science.gov (United States)

    Smith, Nanette R.

    1995-01-01

    The objective of this summer's work was to enhance the Technology Application Group's (TAG) ability to measure the outcomes of its efforts to transfer NASA technology. By reviewing the existing literature, by explaining the economic principles involved in evaluating the economic impact of technology transfer, and by investigating the LaRC processes, our William & Mary team has been able to lead this important discussion. In reviewing the existing literature, we identified many of the metrics that are currently being used in the area of technology transfer. Learning about the LaRC technology transfer processes and the metrics currently used to track the transfer process enabled us to compare other R&D facilities to LaRC. We discuss and diagram the impacts of technology transfer in the short run and the long run. Significantly, this serves as the basis for analysis and provides guidance in thinking about what the measurement objectives ought to be. By focusing on the SBIR Program, valuable information regarding the strengths and weaknesses of this LaRC program is to be gained. A survey was developed to ask probing questions regarding SBIR contractors' experience with the program. Specifically, we are interested in finding out whether the SBIR Program is accomplishing its mission, whether the SBIR companies are providing the needed innovations specified by NASA, and to what extent those innovations have led to commercial success. We also developed a survey to ask COTRs, who are NASA employees acting as technical advisors to the SBIR contractors, the same type of questions, evaluating the successes and problems with the SBIR Program as they see it. This survey was developed to be implemented interactively on computer.
It is our hope that the statistical and econometric studies that can be done on the data collected from all of these sources will provide insight regarding the direction to take in developing systematic evaluations of programs like the SBIR Program so that they can