WorldWideScience

Sample records for multiple bioprocess analytes

  1. Raman spectroscopy as a process analytical technology for pharmaceutical manufacturing and bioprocessing.

    Science.gov (United States)

    Esmonde-White, Karen A; Cuellar, Maryann; Uerpmann, Carsten; Lenain, Bruno; Lewis, Ian R

    2017-01-01

    Adoption of Quality by Design (QbD) principles, regulatory support of QbD, process analytical technology (PAT), and continuous manufacturing are major factors effecting new approaches to pharmaceutical manufacturing and bioprocessing. In this review, we highlight new technology developments, data analysis models, and applications of Raman spectroscopy, which have expanded the scope of Raman spectroscopy as a process analytical technology. Emerging technologies such as transmission and enhanced reflection Raman, and new approaches to using available technologies, expand the scope of Raman spectroscopy in pharmaceutical manufacturing, and now Raman spectroscopy is successfully integrated into real-time release testing, continuous manufacturing, and statistical process control. Since the last major review of Raman as a pharmaceutical PAT in 2010, many new Raman applications in bioprocessing have emerged. Exciting reports of in situ Raman spectroscopy in bioprocesses complement a growing scientific field of biological and biomedical Raman spectroscopy. Raman spectroscopy has made a positive impact as a process analytical and control tool for pharmaceutical manufacturing and bioprocessing, with demonstrated scientific and financial benefits throughout a product's lifecycle.

  2. Cell bioprocessing in space - Applications of analytical cytology

    Science.gov (United States)

    Todd, P.; Hymer, W. C.; Goolsby, C. L.; Hatfield, J. M.; Morrison, D. R.

    1988-01-01

    Cell bioprocessing experiments in space are reviewed and the development of on-board cell analytical cytology techniques that can serve such experiments is discussed. Methods and results of experiments involving the cultivation and separation of eukaryotic cells in space are presented. It is suggested that an advanced cytometer should be developed for the quantitative analysis of large numbers of specimens of suspended eukaryotic cells and bioparticles in experiments on the Space Station.

  3. Trends in Process Analytical Technology: Present State in Bioprocessing.

    Science.gov (United States)

    Jenzsch, Marco; Bell, Christian; Buziol, Stefan; Kepert, Felix; Wegele, Harald; Hakemeyer, Christian

    2017-08-04

    Process analytical technology (PAT), the regulatory initiative for incorporating quality in pharmaceutical manufacturing, is an area of intense research and interest. If PAT is effectively applied to bioprocesses, it can increase process understanding and control, and mitigate the risk from substandard drug products to both manufacturer and patient. To optimize the benefits of PAT, the entire PAT framework must be considered and each element of PAT must be carefully selected, including sensor and analytical technology, data analysis techniques, control strategies and algorithms, and process optimization routines. This chapter discusses the current state of PAT in the biopharmaceutical industry, including several case studies demonstrating the degree of maturity of various PAT tools. Graphical Abstract Hierarchy of QbD components.

  4. Capacity Planning for Batch and Perfusion Bioprocesses Across Multiple Biopharmaceutical Facilities

    OpenAIRE

    Siganporia, Cyrus C; Ghosh, Soumitra; Daszkowski, Thomas; Papageorgiou, Lazaros G; Farid, Suzanne S

    2014-01-01

    Production planning for biopharmaceutical portfolios becomes more complex when products switch between fed-batch and continuous perfusion culture processes. This article describes the development of a discrete-time mixed integer linear programming (MILP) model to optimize capacity plans for multiple biopharmaceutical products, with either batch or perfusion bioprocesses, across multiple facilities to meet quarterly demands. The model comprised specific features to account for products with fe...

  5. On-line soft sensing in upstream bioprocessing.

    Science.gov (United States)

    Randek, Judit; Mandenius, Carl-Fredrik

    2018-02-01

    This review provides an overview and a critical discussion of novel possibilities of applying soft sensors for on-line monitoring and control of industrial bioprocesses. The focus is on bio-product formation in the upstream process, but integration with other parts of the process is also addressed. The term soft sensor is used for the combination of analytical hardware data (from sensors, analytical devices, instruments and actuators) with mathematical models that create new real-time information about the process. In particular, the review assesses these possibilities from an industrial perspective, including sensor performance, information value and production economy. The capabilities of existing analytical on-line techniques are scrutinized in view of their usefulness in soft sensor setups and in relation to typical needs in bioprocessing in general. The review concludes with specific recommendations for further development of soft sensors for the monitoring and control of upstream bioprocessing.
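
    As an illustration of the soft-sensor concept described above, the sketch below combines a noisy on-line hardware signal (here a CO2 evolution rate from off-gas analysis) with a simple material-balance model to estimate an unmeasured variable (biomass). The signal values, yield coefficient and filter constant are hypothetical.

      import numpy as np

      def biomass_soft_sensor(cer, dt_h, x0=0.1, y_xc=1.2, alpha=0.3):
          """Soft-sensor sketch: estimate biomass (g DCW/L) from the CO2
          evolution rate (CER, g CO2/L/h). Assumes, purely for illustration,
          that biomass formed is proportional to cumulative CO2 evolved."""
          cer = np.asarray(cer, dtype=float)
          filtered = np.zeros_like(cer)
          filtered[0] = cer[0]
          for k in range(1, len(cer)):                  # exponential smoothing of the raw signal
              filtered[k] = alpha * cer[k] + (1 - alpha) * filtered[k - 1]
          cumulative_co2 = np.cumsum(filtered) * dt_h   # g CO2/L evolved so far
          return x0 + y_xc * cumulative_co2

      # hypothetical off-gas readings taken every 0.5 h
      print(biomass_soft_sensor([0.05, 0.06, 0.09, 0.12, 0.18, 0.25, 0.33], dt_h=0.5))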

  6. On-line bioprocess monitoring - an academic discipline or an industrial tool?

    DEFF Research Database (Denmark)

    Olsson, Lisbeth; Schulze, Ulrik; Nielsen, Jens Bredal

    1998-01-01

    Bioprocess monitoring capabilities are gaining increasing importance both in physiological studies and in bioprocess development. The present article focuses on on-line analytical systems since these represent the backbone of most bioprocess monitoring systems, both in academia and in industry. W...

  7. [Progress in industrial bioprocess engineering in China].

    Science.gov (United States)

    Zhuang, Yingping; Chen, Hongzhang; Xia, Jianye; Tang, Wenjun; Zhao, Zhimin

    2015-06-01

    Advances in industrial biotechnology depend heavily on the development of industrial bioprocess research. In China, we are facing several challenges because of a huge national industrial fermentation capacity. The industrial bioprocess development experienced several main stages. This work mainly reviews the development of industrial bioprocesses in China during the past 30 to 40 years, including the early-stage kinetic model studies derived from classical chemical engineering, research methods based on control theory, multi-parameter analysis techniques with on-line measuring instruments, multi-scale analysis theory, and solid-state fermentation techniques and fermenters. In addition, the cutting edge of bioprocess engineering was also addressed.

  8. Disposable bioprocessing: the future has arrived.

    Science.gov (United States)

    Rao, Govind; Moreira, Antonio; Brorson, Kurt

    2009-02-01

    Increasing cost pressures are driving the rapid adoption of disposables in bioprocessing. While well ensconced in lab-scale operations, the lower operating/ validation costs at larger scale and relative ease of use are leading to these systems entering all stages and operations of a typical biopharmaceutical manufacturing process. Here, we focus on progress made in the incorporation of disposable equipment with sensor technology in bioprocessing throughout the development cycle. We note that sensor patch technology is mostly being adapted to disposable cell culture devices, but future adaptation to downstream steps is conceivable. Lastly, regulatory requirements are also briefly assessed in the context of disposables and the Process Analytical Technologies (PAT) and Quality by Design (QbD) initiatives.

  9. Sense and sensitivity in bioprocessing-detecting cellular metabolites with biosensors.

    Science.gov (United States)

    Dekker, Linda; Polizzi, Karen M

    2017-10-01

    Biosensors use biological elements to detect or quantify an analyte of interest. In bioprocessing, biosensors are employed to monitor key metabolites. There are two main types: fully biological systems or biological recognition coupled with physical/chemical detection. New developments in chemical biosensors include multiplexed detection using microfluidics. Synthetic biology can be used to engineer new biological biosensors with improved characteristics. Although there have been few biosensors developed for bioprocessing thus far, emerging trends can be applied in the future. A range of new platform technologies will enable rapid engineering of new biosensors based on transcriptional activation, riboswitches, and Förster Resonance Energy Transfer. However, translation to industry remains a challenge and more research into the robustness of biosensors at scale is needed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Developing a Continuous Bioprocessing Approach to Stromal Cell Manufacture.

    Science.gov (United States)

    Miotto, Martina; Gouveia, Ricardo; Abidin, Fadhilah Zainal; Figueiredo, Francisco; Connon, Che J

    2017-11-29

    To this day, the concept of continuous bioprocessing has been applied mostly to the manufacture of molecular biologics such as proteins, growth factors, and secondary metabolites with biopharmaceutical uses. The present work now sets to explore the potential application of continuous bioprocess methods to source large numbers of human adherent cells with potential therapeutic value. To this purpose, we developed a smart multifunctional surface coating capable of controlling the attachment, proliferation, and subsequent self-detachment of human corneal stromal cells. This system allowed the maintenance of cell cultures under steady-state growth conditions, where self-detaching cells were continuously replenished by the proliferation of those remaining attached. This facilitated a closed, continuous bioprocessing platform with recovery of approximately 1% of the total adherent cells per hour, a yield rate that was maintained for 1 month. Moreover, both attached and self-detached cells were shown to retain their original phenotype. Together, these results represent the proof-of-concept for a new high-throughput, high-standard, and low-cost biomanufacturing strategy with multiple potentials and important downstream applications.
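
    A quick back-of-the-envelope check of the yield figure quoted above (the cell number used is hypothetical): harvesting roughly 1% of the attached population per hour, sustained for one month, amounts to several times the standing adherent population.

      # Illustrative arithmetic only: continuous self-detachment of ~1% of the
      # attached population per hour, sustained for one month.
      attached_cells = 1.0e6            # hypothetical steady-state adherent population
      harvest_fraction_per_h = 0.01
      hours = 24 * 30

      harvested = attached_cells * harvest_fraction_per_h * hours
      print(f"cumulative harvest ~ {harvested:.1e} cells "
            f"({harvested / attached_cells:.1f}x the attached population)")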

  11. Bioprocessing research for energy applications

    Energy Technology Data Exchange (ETDEWEB)

    Scott, C.D.; Gaden, E.L. Jr.; Humphrey, A.E.; Carta, G.; Kirwan, D.J.

    1989-04-01

    The new biotechnology that is emerging could have a major impact on many of the industries important to our country, especially those associated with energy production and conservation. Advances in bioprocessing systems will provide important alternatives for the future utilization of various energy resources and for the control of environmental hazards that can result from energy generation. Although research in the fundamental biological sciences has helped set the scene for a "new biotechnology," the major impediment to rapid commercialization for energy applications is the lack of a firm understanding of the necessary engineering concepts. Engineering research is now the essential "bridge" that will allow the development of a wide range of energy-related bioprocessing systems. A workshop entitled "Bioprocessing Research for Energy Applications" was held to address this technological area, to define the engineering research needs, and to identify those opportunities which would encourage rapid implementation of advanced bioprocessing concepts.

  12. Influence of multiple bioprocess parameters on production of lipase from Pseudomonas sp. BWS-5

    Directory of Open Access Journals (Sweden)

    Balwinder Singh Sooch

    2013-10-01

    Full Text Available The aim of the present work was to study the influence of multiple bioprocess parameters on the maximum production of lipase from Pseudomonas sp. BWS-5. The culture reached the stationary phase of growth after 36 h of incubation, when the maximum lipase production was obtained at flask level. The different media components, such as carbon sources, nitrogen sources and trace elements, and process parameters, such as the pH of the medium, temperature and time of incubation, agitation/stationary conditions, etc., were optimized at flask level and at bioreactor level. The maximum enzyme production of 298 IU/mL was obtained with a simple medium of pH 6.5 containing glucose (1%, w/v), peptone (3%, w/v) and KCl (0.05%, w/v) after 30 h of incubation at 37°C under agitation (200 rpm) with 0.75 vvm of air supply.

  13. Capacity planning for batch and perfusion bioprocesses across multiple biopharmaceutical facilities.

    Science.gov (United States)

    Siganporia, Cyrus C; Ghosh, Soumitra; Daszkowski, Thomas; Papageorgiou, Lazaros G; Farid, Suzanne S

    2014-01-01

    Production planning for biopharmaceutical portfolios becomes more complex when products switch between fed-batch and continuous perfusion culture processes. This article describes the development of a discrete-time mixed integer linear programming (MILP) model to optimize capacity plans for multiple biopharmaceutical products, with either batch or perfusion bioprocesses, across multiple facilities to meet quarterly demands. The model comprised specific features to account for products with fed-batch or perfusion culture processes such as sequence-dependent changeover times, continuous culture constraints, and decoupled upstream and downstream operations that permit independent scheduling of each. Strategic inventory levels were accounted for by applying cost penalties when they were not met. A rolling time horizon methodology was utilized in conjunction with the MILP model and was shown to obtain solutions with greater optimality in less computational time than the full-scale model. The model was applied to an industrial case study to illustrate how the framework aids decisions regarding outsourcing capacity to third party manufacturers or building new facilities. The impact of variations on key parameters such as demand or titres on the optimal production plans and costs was captured. The analysis identified the critical ratio of in-house to contract manufacturing organization (CMO) manufacturing costs that led the optimization results to favor building a future facility over using a CMO. The tool predicted that if titres were higher than expected then the optimal solution would allocate more production to in-house facilities, where manufacturing costs were lower. Utilization graphs indicated when capacity expansion should be considered. © 2014 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers.
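
    For readers unfamiliar with discrete-time MILP formulations of this kind, the sketch below sets up a deliberately tiny capacity-planning problem (two products, two facilities, four quarters) with the PuLP library. It is not the authors' model; all products, demands, capacities and costs are hypothetical.

      # Toy discrete-time capacity-planning MILP (not the authors' model).
      # Requires: pip install pulp
      import pulp

      products, facilities, quarters = ["mAb-A", "mAb-B"], ["Site1", "CMO"], range(4)
      demand = {("mAb-A", t): 8 for t in quarters} | {("mAb-B", t): 5 for t in quarters}
      capacity = {"Site1": 10, "CMO": 6}       # batches per facility per quarter
      cost = {"Site1": 1.0, "CMO": 1.8}        # relative cost per batch
      hold_cost = 0.1                          # inventory penalty per batch per quarter

      m = pulp.LpProblem("capacity_plan", pulp.LpMinimize)
      x = pulp.LpVariable.dicts("prod", (products, facilities, quarters), lowBound=0, cat="Integer")
      inv = pulp.LpVariable.dicts("inv", (products, quarters), lowBound=0)

      m += pulp.lpSum(cost[f] * x[p][f][t] for p in products for f in facilities for t in quarters) \
           + pulp.lpSum(hold_cost * inv[p][t] for p in products for t in quarters)

      for p in products:
          for t in quarters:
              carried = inv[p][t - 1] if t > 0 else 0          # inventory balance each quarter
              m += carried + pulp.lpSum(x[p][f][t] for f in facilities) == demand[(p, t)] + inv[p][t]
      for f in facilities:
          for t in quarters:
              m += pulp.lpSum(x[p][f][t] for p in products) <= capacity[f]

      m.solve(pulp.PULP_CBC_CMD(msg=False))
      for p in products:
          for f in facilities:
              print(p, f, [int(x[p][f][t].value()) for t in quarters])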

  14. Application of Raman Spectroscopy and Univariate Modelling As a Process Analytical Technology for Cell Therapy Bioprocessing

    Science.gov (United States)

    Baradez, Marc-Olivier; Biziato, Daniela; Hassan, Enas; Marshall, Damian

    2018-01-01

    Cell therapies offer unquestionable promises for the treatment, and in some cases even the cure, of complex diseases. As we start to see more of these therapies gaining market authorization, attention is turning to the bioprocesses used for their manufacture, in particular the challenge of gaining higher levels of process control to help regulate cell behavior, manage process variability, and deliver product of a consistent quality. Many processes already incorporate the measurement of key markers such as nutrient consumption, metabolite production, and cell concentration, but these are often performed off-line and only at set time points in the process. Having the ability to monitor these markers in real-time using in-line sensors would offer significant advantages, allowing faster decision-making and a finer level of process control. In this study, we use Raman spectroscopy as an in-line optical sensor for bioprocess monitoring of an autologous T-cell immunotherapy model produced in a stirred tank bioreactor system. Using reference datasets generated on a standard bioanalyzer, we develop chemometric models from the Raman spectra for glucose, glutamine, lactate, and ammonia. These chemometric models can accurately monitor donor-specific increases in nutrient consumption and metabolite production as the primary T-cells transition from a recovery phase and begin proliferating. Using a univariate modeling approach, we then show how changes in peak intensity within the Raman spectra can be correlated with cell concentration and viability. These models, which act as surrogate markers, can be used to monitor cell behavior including cell proliferation rates, proliferative capacity, and transition of the cells to a quiescent phenotype. Finally, using the univariate models, we also demonstrate how Raman spectroscopy can be applied for real-time monitoring. The ability to measure these key parameters using an in-line Raman optical sensor makes it possible to have immediate
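
    The univariate modelling approach mentioned above can be reduced to a very small calibration exercise: regress the intensity of a single Raman band against an off-line reference measurement, then use the fitted line as a surrogate in-line estimate. The data below are hypothetical.

      import numpy as np

      # Hypothetical calibration pairs: intensity of one Raman band (a.u.) measured
      # in-line versus off-line viable cell density (1e6 cells/mL).
      peak_intensity = np.array([0.12, 0.18, 0.27, 0.35, 0.44, 0.53])
      cell_density   = np.array([0.5,  1.0,  1.9,  2.6,  3.4,  4.1])

      slope, intercept = np.polyfit(peak_intensity, cell_density, deg=1)   # univariate model

      def predict_cell_density(new_intensity):
          """Surrogate estimate of cell density from a single Raman peak."""
          return slope * np.asarray(new_intensity) + intercept

      print(predict_cell_density([0.30, 0.50]))   # in-line estimates, 1e6 cells/mL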

  15. Application of Raman Spectroscopy and Univariate Modelling As a Process Analytical Technology for Cell Therapy Bioprocessing.

    Science.gov (United States)

    Baradez, Marc-Olivier; Biziato, Daniela; Hassan, Enas; Marshall, Damian

    2018-01-01

    Cell therapies offer unquestionable promises for the treatment, and in some cases even the cure, of complex diseases. As we start to see more of these therapies gaining market authorization, attention is turning to the bioprocesses used for their manufacture, in particular the challenge of gaining higher levels of process control to help regulate cell behavior, manage process variability, and deliver product of a consistent quality. Many processes already incorporate the measurement of key markers such as nutrient consumption, metabolite production, and cell concentration, but these are often performed off-line and only at set time points in the process. Having the ability to monitor these markers in real-time using in-line sensors would offer significant advantages, allowing faster decision-making and a finer level of process control. In this study, we use Raman spectroscopy as an in-line optical sensor for bioprocess monitoring of an autologous T-cell immunotherapy model produced in a stirred tank bioreactor system. Using reference datasets generated on a standard bioanalyzer, we develop chemometric models from the Raman spectra for glucose, glutamine, lactate, and ammonia. These chemometric models can accurately monitor donor-specific increases in nutrient consumption and metabolite production as the primary T-cells transition from a recovery phase and begin proliferating. Using a univariate modeling approach, we then show how changes in peak intensity within the Raman spectra can be correlated with cell concentration and viability. These models, which act as surrogate markers, can be used to monitor cell behavior including cell proliferation rates, proliferative capacity, and transition of the cells to a quiescent phenotype. Finally, using the univariate models, we also demonstrate how Raman spectroscopy can be applied for real-time monitoring. The ability to measure these key parameters using an in-line Raman optical sensor makes it possible to have immediate

  16. Application of Raman Spectroscopy and Univariate Modelling As a Process Analytical Technology for Cell Therapy Bioprocessing

    Directory of Open Access Journals (Sweden)

    Marc-Olivier Baradez

    2018-03-01

    Full Text Available Cell therapies offer unquestionable promises for the treatment, and in some cases even the cure, of complex diseases. As we start to see more of these therapies gaining market authorization, attention is turning to the bioprocesses used for their manufacture, in particular the challenge of gaining higher levels of process control to help regulate cell behavior, manage process variability, and deliver product of a consistent quality. Many processes already incorporate the measurement of key markers such as nutrient consumption, metabolite production, and cell concentration, but these are often performed off-line and only at set time points in the process. Having the ability to monitor these markers in real-time using in-line sensors would offer significant advantages, allowing faster decision-making and a finer level of process control. In this study, we use Raman spectroscopy as an in-line optical sensor for bioprocess monitoring of an autologous T-cell immunotherapy model produced in a stirred tank bioreactor system. Using reference datasets generated on a standard bioanalyzer, we develop chemometric models from the Raman spectra for glucose, glutamine, lactate, and ammonia. These chemometric models can accurately monitor donor-specific increases in nutrient consumption and metabolite production as the primary T-cells transition from a recovery phase and begin proliferating. Using a univariate modeling approach, we then show how changes in peak intensity within the Raman spectra can be correlated with cell concentration and viability. These models, which act as surrogate markers, can be used to monitor cell behavior including cell proliferation rates, proliferative capacity, and transition of the cells to a quiescent phenotype. Finally, using the univariate models, we also demonstrate how Raman spectroscopy can be applied for real-time monitoring. The ability to measure these key parameters using an in-line Raman optical sensor makes it possible

  17. Capacity Planning for Batch and Perfusion Bioprocesses Across Multiple Biopharmaceutical Facilities

    Science.gov (United States)

    Siganporia, Cyrus C; Ghosh, Soumitra; Daszkowski, Thomas; Papageorgiou, Lazaros G; Farid, Suzanne S

    2014-01-01

    Production planning for biopharmaceutical portfolios becomes more complex when products switch between fed-batch and continuous perfusion culture processes. This article describes the development of a discrete-time mixed integer linear programming (MILP) model to optimize capacity plans for multiple biopharmaceutical products, with either batch or perfusion bioprocesses, across multiple facilities to meet quarterly demands. The model comprised specific features to account for products with fed-batch or perfusion culture processes such as sequence-dependent changeover times, continuous culture constraints, and decoupled upstream and downstream operations that permit independent scheduling of each. Strategic inventory levels were accounted for by applying cost penalties when they were not met. A rolling time horizon methodology was utilized in conjunction with the MILP model and was shown to obtain solutions with greater optimality in less computational time than the full-scale model. The model was applied to an industrial case study to illustrate how the framework aids decisions regarding outsourcing capacity to third party manufacturers or building new facilities. The impact of variations on key parameters such as demand or titres on the optimal production plans and costs was captured. The analysis identified the critical ratio of in-house to contract manufacturing organization (CMO) manufacturing costs that led the optimization results to favor building a future facility over using a CMO. The tool predicted that if titres were higher than expected then the optimal solution would allocate more production to in-house facilities, where manufacturing costs were lower. Utilization graphs indicated when capacity expansion should be considered. © 2013 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 30:594–606, 2014 PMID:24376262

  18. Modeling and simulation of the bioprocess with recirculation

    Directory of Open Access Journals (Sweden)

    Žerajić Stanko

    2007-01-01

    Full Text Available Bioprocess models with recirculation integrate a model of the continuous bioreaction system with a model of the separation system. The reaction bioprocess is integrated with separation of the biomass, the formed product, unconsumed substrate or inhibitory substances. In this paper a simulation model of a recirculation bioprocess was developed, which may be applied for increasing biomass productivity and product biosynthesis, increasing the conversion of substrate to product, improving mixing efficiency and separating secondary CO2. The goal of the work is the optimal bioprocess configuration, which is determined by simulation optimization. The optimal chemostat state was used as the reference. A step-by-step simulation method is necessary because the initial bioprocess state changes with recirculation in each step. The simulation experiment confirms that, at a recirculation ratio α = 0.275 and a concentration factor C = 4, the maximum glucose-to-ethanol conversion is obtained at a dilution rate ten times larger than in the reference chemostat.
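
    A minimal mass-balance model of a chemostat with cell recycle illustrates the kind of simulation described above. The recycle ratio α = 0.275 and concentration factor C = 4 are taken from the abstract; the Monod kinetics, yields and feed composition below are purely illustrative.

      # Minimal chemostat-with-cell-recycle simulation (illustrative parameters,
      # not the authors' model). alpha and C follow the abstract.
      alpha, C = 0.275, 4.0              # recycle ratio, biomass concentration factor
      mu_max, Ks = 0.4, 0.5              # 1/h, g/L (hypothetical Monod kinetics)
      Yxs, Yps = 0.1, 0.45               # g biomass/g glucose, g ethanol/g glucose
      D, S0 = 0.3, 100.0                 # dilution rate (1/h), feed glucose (g/L)

      X, S, P = 1.0, S0, 0.0             # biomass, substrate, ethanol (g/L)
      dt = 0.01
      for _ in range(int(200 / dt)):     # explicit Euler integration over 200 h
          mu = mu_max * S / (Ks + S)
          dX = mu * X - D * (1 + alpha - alpha * C) * X    # recycle retains biomass
          dS = D * (S0 - S) - mu * X / Yxs
          dP = (Yps / Yxs) * mu * X - D * P
          X, S, P = X + dX * dt, max(S + dS * dt, 0.0), P + dP * dt

      print(f"steady state: X = {X:.1f} g/L, S = {S:.2f} g/L, ethanol = {P:.1f} g/L")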

  19. Facilitating Multiple Intelligences Through Multimodal Learning Analytics

    Directory of Open Access Journals (Sweden)

    Ayesha PERVEEN

    2018-01-01

    Full Text Available This paper develops a theoretical framework for employing learning analytics in online education to trace multiple learning variations of online students by considering their potential of being multiple intelligences, based on Howard Gardner’s 1983 theory of multiple intelligences. The study first emphasizes the need for online education systems to facilitate students as multiple intelligences and then suggests a framework of the advanced form of learning analytics, i.e., multimodal learning analytics, for tracing and facilitating multiple intelligences while they are engaged in online ubiquitous learning. As multimodal learning analytics is still an evolving area, it poses many challenges for technologists, educationists as well as organizational managers. Learning analytics make machines meet humans; therefore, educationists with expertise in learning theories can help technologists devise the latest technological methods for multimodal learning analytics, and organizational managers can implement them for the improvement of online education. Therefore, a careful instructional design, based on a deep understanding of students’ learning abilities, is required to develop teaching plans and technological possibilities for monitoring students’ learning paths. This is how learning analytics can help design an adaptive instructional design based on a quick analysis of the data gathered. Based on that analysis, the academicians can critically reflect upon the quick or delayed implementation of the existing instructional design based on students’ cognitive abilities or even about the single or double loop learning design. The researcher concludes that online education is multimodal in nature, has the capacity to endorse multiliteracies and, therefore, multiple intelligences can be tracked and facilitated through multimodal learning analytics in an online mode. However, online teachers’ training both in technological implementations and

  20. Bioprocessing of sewage sludge for safe recycling on agricultural land - BIOWASTE

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, Jens Ejbye; Angelidaki, Irini; Christensen, Nina; Batstone, Damien John; Lyberatos, Gerasimos; Stamatelatou, Katerina; Lichtfouse, Eric; Elbisser, Brigitte; Rogers, Kayne; Sappin-Didier, Valerie; Dernaix, Laurence; Caria, Giovanni; Metzger, Laure; Borghi, Veronica; Montcada, Eloi

    2003-07-01

    Disposal and handling of sewage sludge are increasing problems in Europe due to the increasing quantities of sewage sludge produced. A large amount of the sewage sludge contains small fractions of toxic chemicals, which results in problems with safe use of the sewage sludge on agricultural land. From an ecological and economical point of view, it would be essential to establish methodologies which could allow sewage sludge to be reused as fertilizer on agricultural land. Energy-efficient biotreatment processes for organic waste are, therefore, of crucial importance. BIOWASTE will offer an integrated study of this area. The typical composition of sewage sludge will be characterized with regard to key contaminating compounds. The following compounds will be in focus: emulsifying agents such as nonylphenols and nonylphenol ethoxylates (NPE), polycyclic aromatic hydrocarbons (PAHs) derived from incomplete combustion processes, phthalates, which are used as additives in plastics, and surfactants such as linear alkyl benzene sulfonate (LAS). Analytical techniques suitable for qualitative and quantitative evaluation of the chemical species involved in the processes under investigation will be determined. Bacteria that are able to degrade selected contaminating compounds under anaerobic and aerobic conditions will be isolated, characterized and bioaugmented for decontamination of sewage sludge through bioprocessing. Aerobic, anaerobic and combined aerobic/anaerobic bioprocessing of sewage sludge will be applied. A mathematical model will be developed to describe the biodegradation processes of the contaminating compounds after establishing the kinetic parameters for degradation of contaminating compounds. The bioprocessed sewage sludge will be used in eco- and plant-toxicology tests to evaluate the impact of the xenobiotics on the environment. Methodologies will be developed and applied to assess the cleanliness of the bioprocessing as a safe method for waste

  1. Miniature Bioprocess Array: A Platform for Quantitative Physiology and Bioprocess Optimization

    National Research Council Canada - National Science Library

    Keasling, Jay

    2002-01-01

    .... The miniature bioprocess array is based on an array of 150-microliter wells, each one of which incorporates MEMS for the closed-loop control of cell culture parameters such as temperature, pH, and dissolved oxygen...

  2. Nanobiocatalyst advancements and bioprocessing applications.

    Science.gov (United States)

    Misson, Mailin; Zhang, Hu; Jin, Bo

    2015-01-06

    The nanobiocatalyst (NBC) is an emerging innovation that synergistically integrates advanced nanotechnology with biotechnology and promises exciting advantages for improving enzyme activity, stability, capability and engineering performances in bioprocessing applications. NBCs are fabricated by immobilizing enzymes with functional nanomaterials as enzyme carriers or containers. In this paper, we review the recent developments of novel nanocarriers/nanocontainers with advanced hierarchical porous structures for retaining enzymes, such as nanofibres (NFs), mesoporous nanocarriers and nanocages. Strategies for immobilizing enzymes onto nanocarriers made from polymers, silicas, carbons and metals by physical adsorption, covalent binding, cross-linking or specific ligand spacers are discussed. The resulting NBCs are critically evaluated in terms of their bioprocessing performances. Excellent performances are demonstrated through enhanced NBC catalytic activity and stability due to conformational changes upon immobilization and localized nanoenvironments, and NBC reutilization by assembling magnetic nanoparticles into NBCs to defray the high operational costs associated with enzyme production and nanocarrier synthesis. We also highlight several challenges associated with the NBC-driven bioprocess applications, including the maturation of large-scale nanocarrier synthesis, design and development of bioreactors to accommodate NBCs, and long-term operations of NBCs. We suggest these challenges are to be addressed through joint collaboration of chemists, engineers and material scientists. Finally, we have demonstrated the great potential of NBCs in manufacturing bioprocesses in the near future through successful laboratory trials of NBCs in carbohydrate hydrolysis, biofuel production and biotransformation. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  3. Control of Bioprocesses

    DEFF Research Database (Denmark)

    Huusom, Jakob Kjøbsted

    2015-01-01

    The purpose of bioprocess control is to ensure that the plant operates as designed. This chapter presents the fundamental principles for control of biochemical processes. Through examples, the selection of manipulated and controlled variables in the classical reactor configurations is discussed, so...... are control objectives and the challenges in obtaining good control of the bioreactor. The objective of this chapter is to discuss the bioreactor control problems and to highlight some general traits that distinguish operation of bioprocesses from operation of processes in the conventional chemical process...... industries. It also provides a number of typical control loops for different objectives. A brief introduction to the general principles of process control, the PID control algorithm is discussed, and the design and effect of tuning are shown in an example. Finally, a discussion of novel, model-free control...
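
    A bare-bones version of the PID loop discussed in the chapter is sketched below, controlling a generic first-order process (think of dissolved oxygen responding to stirrer speed). The gains, time constant and set point are all hypothetical.

      # Minimal discrete PID loop applied to a toy first-order process
      # (illustrative tuning and plant parameters only).
      kp, ki, kd = 2.0, 0.5, 0.2           # PID tuning (hypothetical)
      setpoint, dt, tau = 30.0, 0.1, 5.0   # % DO, sample time (min), plant time constant (min)

      y = 10.0                             # measured DO (%)
      u, integral, prev_error = 0.0, 0.0, setpoint - y

      for _ in range(600):                 # 60 minutes of closed-loop operation
          error = setpoint - y
          integral += error * dt
          derivative = (error - prev_error) / dt
          u = kp * error + ki * integral + kd * derivative
          u = min(max(u, 0.0), 100.0)      # actuator limits (% of maximum output)
          prev_error = error
          y += dt * (u - y) / tau          # first-order plant response

      print(f"final DO = {y:.1f} % at controller output u = {u:.1f} %")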

  4. Guiding bioprocess design by microbial ecology.

    Science.gov (United States)

    Volmer, Jan; Schmid, Andreas; Bühler, Bruno

    2015-06-01

    Industrial bioprocess development is driven by profitability and eco-efficiency. It profits from an early stage definition of process and biocatalyst design objectives. Microbial bioprocess environments can be considered as synthetic technical microbial ecosystems. Natural systems follow Darwinian evolution principles aiming at survival and reproduction. Technical systems objectives are eco-efficiency, productivity, and profitable production. Deciphering technical microbial ecology reveals differences and similarities of natural and technical systems objectives, which are discussed in this review in view of biocatalyst and process design and engineering strategies. Strategies for handling opposing objectives of natural and technical systems and for exploiting and engineering natural properties of microorganisms for technical systems are reviewed based on examples. This illustrates the relevance of considering microbial ecology for bioprocess design and the potential for exploitation by synthetic biology strategies. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Automated measurement and monitoring of bioprocesses: key elements of the M(3)C strategy.

    Science.gov (United States)

    Sonnleitner, Bernhard

    2013-01-01

    The state-of-routine monitoring items established in the bioprocess industry as well as some important state-of-the-art methods are briefly described and the potential pitfalls discussed. Among those are physical and chemical variables such as temperature, pressure, weight, volume, mass and volumetric flow rates, pH, redox potential, gas partial pressures in the liquid and molar fractions in the gas phase, infrared spectral analysis of the liquid phase, and calorimetry over an entire reactor. Classical as well as new optical versions are addressed. Biomass and bio-activity monitoring (as opposed to "measurement") via turbidity, permittivity, in situ microscopy, and fluorescence are critically analyzed. Some new(er) instrumental analytical tools, interfaced to bioprocesses, are explained. Among those are chromatographic methods, mass spectrometry, flow and sequential injection analyses, field flow fractionation, capillary electrophoresis, and flow cytometry. This chapter surveys the principles of monitoring rather than compiling instruments.

  6. Aspects of modelling and control of bioprocesses

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xiachang

    1995-12-31

    The modelling and control of bioprocesses are the main subjects in this thesis. Different modelling approaches are proposed for different purposes in various bioprocesses. A conventional global model was constructed for a very complex mammalian cell culture process. A new concept of functional state and a multiple model (local models) approach were used for modelling the fed-batch baker's yeast process for monitoring and control purposes. Finally, a combination of conventional electrical and biological models was used to simulate and to control a microbial fuel cell process. In the thesis, a yeast growth process was taken as an example to demonstrate the usefulness of the functional state concept and local models. The functional states were first defined according to the yeast metabolism. The process was then described by a set of simple local models. In different functional states, different local models were used. On the other hand, the on-line estimation of functional state and biomass of the process was discussed for process control purpose. As a consequence, both the functional state concept and the multiple model approach were applied for fuzzy logic control of yeast growth process. A fuzzy factor was calculated on the basis of a knowledge-based expert system and fuzzy logic rules. The factor was used to correct an ideal substrate feed rate. In the last part of the thesis, microbial fuel cell processes were studied. A microbial fuel cell is a device for direct conversion of chemical energy to electrical energy by using micro-organisms as catalysts. A combined model including conventional electrical and biological models was constructed for the process based on the biological and electrochemical phenomena.
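
    The fuzzy feed-rate correction described above can be illustrated with a toy rule base: a measured overflow indicator (here ethanol) is mapped through triangular membership functions to a correction factor that scales an ideal feed rate. The membership functions, rules and numbers are hypothetical, not taken from the thesis.

      # Minimal fuzzy-style correction of an ideal substrate feed rate
      # (illustrative rule base only).
      def triangular(x, a, b, c):
          """Triangular membership function on [a, c] peaking at b."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def fuzzy_feed_factor(ethanol_g_l):
          """Map measured ethanol (overflow indicator) to a feed-correction factor."""
          low    = triangular(ethanol_g_l, -0.5, 0.0, 0.5)   # little overflow -> feed more
          medium = triangular(ethanol_g_l,  0.2, 0.8, 1.5)
          high   = triangular(ethanol_g_l,  1.0, 2.0, 3.0)   # strong overflow -> feed less
          weights, outputs = [low, medium, high], [1.10, 1.00, 0.80]
          total = sum(weights)
          return sum(w * o for w, o in zip(weights, outputs)) / total if total else 1.0

      ideal_feed = 0.25                      # L/h, hypothetical exponential feed profile
      for ethanol in [0.1, 0.7, 1.8]:
          print(f"ethanol {ethanol:>4} g/L -> feed {ideal_feed * fuzzy_feed_factor(ethanol):.3f} L/h")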

  7. Aspects of modelling and control of bioprocesses

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xiachang

    1996-12-31

    The modelling and control of bioprocesses are the main subjects in this thesis. Different modelling approaches are proposed for different purposes in various bioprocesses. A conventional global model was constructed for a very complex mammalian cell culture process. A new concept of functional state and a multiple model (local models) approach were used for modelling the fed-batch baker`s yeast process for monitoring and control purposes. Finally, a combination of conventional electrical and biological models was used to simulate and to control a microbial fuel cell process. In the thesis, a yeast growth process was taken as an example to demonstrate the usefulness of the functional state concept and local models. The functional states were first defined according to the yeast metabolism. The process was then described by a set of simple local models. In different functional states, different local models were used. On the other hand, the on-line estimation of functional state and biomass of the process was discussed for process control purpose. As a consequence, both the functional state concept and the multiple model approach were applied for fuzzy logic control of yeast growth process. A fuzzy factor was calculated on the basis of a knowledge-based expert system and fuzzy logic rules. The factor was used to correct an ideal substrate feed rate. In the last part of the thesis, microbial fuel cell processes were studied. A microbial fuel cell is a device for direct conversion of chemical energy to electrical energy by using micro-organisms as catalysts. A combined model including conventional electrical and biological models was constructed for the process based on the biological and electrochemical phenomena

  8. Facilitating Multiple Intelligences through Multimodal Learning Analytics

    Science.gov (United States)

    Perveen, Ayesha

    2018-01-01

    This paper develops a theoretical framework for employing learning analytics in online education to trace multiple learning variations of online students by considering their potential of being multiple intelligences based on Howard Gardner's 1983 theory of multiple intelligences. The study first emphasizes the need to facilitate students as…

  9. Establishing new microbial cell factories for sustainable bioprocesses

    DEFF Research Database (Denmark)

    Workman, Mhairi; Holt, Philippe; Liu, Xiaoying

    2012-01-01

    . The application of biological catalysts which can convert a variety of substrates to an array of desirable products has been demonstrated in both ancient bioprocesses and modern industrial biotechnology. In recent times, focus has been on a limited number of “model” organisms which have been extensively exploited...... of products, it may be interesting to look to less domesticated strains and towards more non-conventional hosts in the development of new bioprocesses. This approach demands thorough physiological characterization as well as establishment of tools for genetic engineering if new cell factories......The demands of modern society are increasing pressure on natural resources while creating the need for a wider range of products. There is an interest in developing bioprocesses to meet these demands, with conversion of a variety of waste materials providing the basis for a sustainable society...

  10. Incorporating unnatural amino acids to engineer biocatalysts for industrial bioprocess applications.

    Science.gov (United States)

    Ravikumar, Yuvaraj; Nadarajan, Saravanan Prabhu; Hyeon Yoo, Tae; Lee, Chong-Soon; Yun, Hyungdon

    2015-12-01

    Bioprocess engineering with biocatalysts broadly spans the development and the actual application of enzymes in an industrial context. Recently, both the use of bioprocess engineering and the development and employment of enzyme engineering techniques have been increasing rapidly. Importantly, engineering techniques that incorporate unnatural amino acids (UAAs) in vivo have begun to produce enzymes with greater stability and altered catalytic properties. Despite the growth of this technique, its potential value in bioprocess applications remains to be fully exploited. In this review, we explore the methodologies involved in UAA incorporation as well as ways to synthesize these UAAs. In addition, we summarize recent efforts to increase the yield of UAA engineered proteins in Escherichia coli and also the application of this tool in enzyme engineering. Furthermore, this protein engineering tool based on the incorporation of UAA can be used to develop immobilized enzymes that are ideal for bioprocess applications. Considering the potential of this tool and by exploiting these engineered enzymes, we expect the field of bioprocess engineering to open up new opportunities for biocatalysis in the near future. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Bioprocessing of a stored mixed liquid waste

    Energy Technology Data Exchange (ETDEWEB)

    Wolfram, J.H.; Rogers, R.D. [Idaho National Engineering Lab., Idaho Falls, ID (United States)]; Finney, R. [Mound Applied Technologies, Miamisburg, OH (United States)]; and others

    1995-12-31

    This paper describes the development and results of a demonstration of a continuous bioprocess for mixed waste treatment. A key element of the process is a unique microbial strain which tolerates high levels of aromatic solvents and surfactants. This microorganism is the biocatalyst of the continuous flow system designed for the processing of stored liquid scintillation wastes. During the past year a process demonstration has been conducted on a commercial formulation of liquid scintillation cocktails (LSC). Based on data obtained from this demonstration, the Ohio EPA granted the Mound Applied Technologies Lab a treatability permit allowing the limited processing of actual mixed waste. Since August 1994, the system has been successfully processing stored, "hot" LSC waste. The initial LSC waste fed into the system contained 11% pseudocumene and detectable quantities of plutonium. Another treated waste stream contained pseudocumene and tritium. Data from this initial work show that the hazardous organic solvent pseudocumene has been removed due to processing, leaving the aqueous low-level radioactive waste. Results to date have shown that living cells are not affected by the dissolved plutonium and that 95% of the plutonium was sorbed to the biomass. This paper discusses the bioprocess, rates of processing, effluent, and the implications of bioprocessing for mixed waste management.

  12. Stem cell bioprocessing: fundamentals and principles.

    Science.gov (United States)

    Placzek, Mark R; Chung, I-Ming; Macedo, Hugo M; Ismail, Siti; Mortera Blanco, Teresa; Lim, Mayasari; Cha, Jae Min; Fauzi, Iliana; Kang, Yunyi; Yeo, David C L; Ma, Chi Yip Joan; Polak, Julia M; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2009-03-06

    In recent years, the potential of stem cell research for tissue engineering-based therapies and regenerative medicine clinical applications has become well established. In 2006, Chung pioneered the first entire organ transplant using adult stem cells and a scaffold for clinical evaluation. With this, a new milestone was achieved, with seven patients with myelomeningocele receiving stem cell-derived bladder transplants resulting in substantial improvements in their quality of life. While a bladder is a relatively simple organ, the breakthrough highlights the incredible benefits that can be gained from the cross-disciplinary nature of tissue engineering and regenerative medicine (TERM) that encompasses stem cell research and stem cell bioprocessing. Unquestionably, the development of bioprocess technologies for the transfer of the current laboratory-based practice of stem cell tissue culture to the clinic as therapeutics necessitates the application of engineering principles and practices to achieve control, reproducibility, automation, validation and safety of the process and the product. The successful translation will require contributions from fundamental research (from developmental biology to the 'omics' technologies and advances in immunology) and from existing industrial practice (biologics), especially on automation, quality assurance and regulation. The timely development, integration and execution of various components will be critical; failures of the past (such as in the commercialization of skin equivalents) on marketing, pricing, production and advertising should not be repeated. This review aims to address the principles required for successful stem cell bioprocessing so that they can be applied deftly to clinical applications.

  13. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/ quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy

  14. Quantitative feature extraction from the Chinese hamster ovary bioprocess bibliome using a novel meta-analysis workflow

    DEFF Research Database (Denmark)

    Golabgir, Aydin; Gutierrez, Jahir M.; Hefzi, Hooman

    2016-01-01

    compilation covers all published CHO cell studies from 1995 to 2015, and each study is classified by the types of phenotypic and bioprocess data contained therein. Using data from selected studies, we also present a quantitative meta-analysis of bioprocess characteristics across diverse culture conditions...... practices can limit research re-use in this field, we show that the statistical analysis of diverse legacy bioprocess data can provide insight into bioprocessing capabilities of CHO cell lines used in industry. The CHO bibliome can be accessed at http://lewislab.ucsd.edu/cho-bibliome/....

  15. Potentials and limitations of miniaturized calorimeters for bioprocess monitoring.

    Science.gov (United States)

    Maskow, Thomas; Schubert, Torsten; Wolf, Antje; Buchholz, Friederike; Regestein, Lars; Buechs, Jochen; Mertens, Florian; Harms, Hauke; Lerchner, Johannes

    2011-10-01

    In theory, heat production rates are very well suited for analysing and controlling bioprocesses on different scales from a few nanolitres up to many cubic metres. Any bioconversion is accompanied by a production (exothermic) or consumption (endothermic) of heat. The heat is tightly connected with the stoichiometry of the bioprocess via the law of Hess, and its rate is connected to the kinetics of the process. Heat signals provide real-time information of bioprocesses. The combination of heat measurements with respirometry is theoretically suited for the quantification of the coupling between catabolic and anabolic reactions. Heat measurements have also practical advantages. Unlike most other biochemical sensors, thermal transducers can be mounted in a protected way that prevents fouling, thereby minimizing response drifts. Finally, calorimetry works in optically opaque solutions and does not require labelling or reactants. It is surprising to see that despite all these advantages, calorimetry has rarely been applied to monitor and control bioprocesses with intact cells in the laboratory, industrial bioreactors or ecosystems. This review article analyses the reasons for this omission, discusses the additional information calorimetry can provide in comparison with respirometry and presents miniaturization as a potential way to overcome some inherent weaknesses of conventional calorimetry. It will be discussed for which sample types and scientific question miniaturized calorimeter can be advantageously applied. A few examples from different fields of microbiological and biotechnological research will illustrate the potentials and limitations of chip calorimetry. Finally, the future of chip calorimetry is addressed in an outlook.
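
    The tight coupling between heat and respiration mentioned above is often summarized by the oxycalorific relationship, roughly 450-460 kJ of heat released per mole of O2 consumed for fully aerobic metabolism. A small example with hypothetical numbers:

      # Rough link between respirometry and calorimetry via the oxycalorific
      # relationship (~455 kJ of heat per mole of O2 for aerobic metabolism).
      Q_OX = 455e3            # J per mol O2 (approximate oxycalorific equivalent)

      def heat_rate_from_our(our_mmol_l_h, volume_l):
          """Estimate total metabolic heat production rate (W) from the oxygen
          uptake rate (OUR, mmol O2/L/h) of an aerobic culture."""
          mol_o2_per_s = our_mmol_l_h * 1e-3 * volume_l / 3600.0
          return Q_OX * mol_o2_per_s

      # Hypothetical example: 50 mmol O2/L/h in a 10,000 L reactor
      print(f"{heat_rate_from_our(50, 10_000) / 1000:.0f} kW of metabolic heat")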

  16. Fluorescence Spectroscopy and Chemometric Modeling for Bioprocess Monitoring

    Directory of Open Access Journals (Sweden)

    Saskia M. Faassen

    2015-04-01

    Full Text Available On-line sensors for the detection of crucial process parameters are desirable for the monitoring, control and automation of processes in the biotechnology, food and pharma industries. Fluorescence spectroscopy is a highly developed and non-invasive technique that enables on-line measurement of substrate and product concentrations or the identification of characteristic process states. During a cultivation process, significant changes occur in the fluorescence spectra. By means of chemometric modeling, prediction models can be calculated and applied for process supervision and control to increase the quality and productivity of bioprocesses. A range of applications for different microorganisms and analytes has been proposed during the last years. This contribution provides an overview of different analysis methods for the measured fluorescence spectra and the model-building chemometric methods used for various microbial cultivations. Most of these processes are observed using the BioView® Sensor, thanks to its robustness and insensitivity to adverse process conditions. Beyond that, the PLS method is the most frequently used chemometric method for the calculation of process models and prediction of process variables.
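
    As a minimal sketch of the chemometric (PLS) calibration step described above, the code below fits scikit-learn's PLSRegression to synthetic spectra standing in for flattened 2D fluorescence data; all values are simulated.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split

      # Synthetic "spectra": a single spectral signature scaled by concentration plus noise.
      rng = np.random.default_rng(0)
      n_samples, n_channels = 60, 150
      concentration = rng.uniform(0, 20, n_samples)              # e.g. glucose, g/L (simulated)
      basis = rng.normal(size=n_channels)                        # spectral signature
      spectra = np.outer(concentration, basis) + rng.normal(scale=0.5, size=(n_samples, n_channels))

      X_train, X_test, y_train, y_test = train_test_split(spectra, concentration, random_state=0)
      pls = PLSRegression(n_components=3)
      pls.fit(X_train, y_train)
      print("R^2 on held-out samples:", round(pls.score(X_test, y_test), 3))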

  17. Fluorescence Spectroscopy and Chemometric Modeling for Bioprocess Monitoring

    Science.gov (United States)

    Faassen, Saskia M.; Hitzmann, Bernd

    2015-01-01

    On-line sensors for the detection of crucial process parameters are desirable for the monitoring, control and automation of processes in the biotechnology, food and pharma industries. Fluorescence spectroscopy is a highly developed and non-invasive technique that enables on-line measurement of substrate and product concentrations or the identification of characteristic process states. During a cultivation process, significant changes occur in the fluorescence spectra. By means of chemometric modeling, prediction models can be calculated and applied for process supervision and control to increase the quality and productivity of bioprocesses. A range of applications for different microorganisms and analytes has been proposed during the last years. This contribution provides an overview of different analysis methods for the measured fluorescence spectra and the model-building chemometric methods used for various microbial cultivations. Most of these processes are observed using the BioView® Sensor, thanks to its robustness and insensitivity to adverse process conditions. Beyond that, the PLS method is the most frequently used chemometric method for the calculation of process models and prediction of process variables. PMID:25942644

  18. Design and analysis of heat recovery system in bioprocess plant

    International Nuclear Information System (INIS)

    Anastasovski, Aleksandar; Rašković, Predrag; Guzović, Zvonimir

    2015-01-01

    Highlights:
    • Heat integration of a bioprocess plant is studied.
    • The bioprocess plant produces yeast and ethyl alcohol.
    • The design of a heat recovery system is performed by batch pinch analysis.
    • Direct and indirect heat integration approaches are used in process design.
    • The heat recovery system without a heat storage opportunity is more profitable.

    Abstract: The paper deals with the heat integration of a bioprocess plant which produces yeast and ethyl alcohol. The referent plant is considered to be a multiproduct batch plant which operates in a semi-continuous mode. The design of a heat recovery system is performed by batch pinch analysis and by the use of the time slice model. The results obtained by direct and indirect heat integration approaches are presented in the form of cost-optimal heat exchanger networks and evaluated by different thermodynamic and economic indicators. They signify that the heat recovery system without a heat storage opportunity can be considered a more profitable solution for increasing the energy efficiency of the plant.
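
    For orientation, the sketch below performs classical pinch-analysis energy targeting (the problem table algorithm) on a small set of hypothetical continuous streams; the batch time-slice approach used in the paper would repeat such targeting within each time interval.

      # Minimal pinch-analysis energy-targeting sketch (problem table algorithm).
      # Stream data are hypothetical, not from the paper.
      DT_MIN = 10.0   # minimum approach temperature, K

      # (kind, supply T in C, target T in C, CP = m*cp in kW/K)
      streams = [("hot", 150.0, 40.0, 2.0),
                 ("hot",  90.0, 60.0, 8.0),
                 ("cold", 25.0, 110.0, 2.5),
                 ("cold", 50.0, 100.0, 3.0)]

      def shift(kind, temperature):
          """Shift hot streams down and cold streams up by DT_MIN/2."""
          return temperature - DT_MIN / 2 if kind == "hot" else temperature + DT_MIN / 2

      boundaries = sorted({shift(kind, t) for kind, ts, tt, cp in streams for t in (ts, tt)},
                          reverse=True)
      cascade, heat_flows = 0.0, [0.0]           # cascade heat surplus from the top interval down
      for upper, lower in zip(boundaries, boundaries[1:]):
          net_cp = 0.0
          for kind, ts, tt, cp in streams:
              lo, hi = sorted((shift(kind, ts), shift(kind, tt)))
              if lo <= lower and hi >= upper:    # stream spans this temperature interval
                  net_cp += cp if kind == "hot" else -cp
          cascade += net_cp * (upper - lower)
          heat_flows.append(cascade)

      hot_utility = max(0.0, -min(heat_flows))
      cold_utility = heat_flows[-1] + hot_utility
      print(f"minimum hot utility:  {hot_utility:.1f} kW")
      print(f"minimum cold utility: {cold_utility:.1f} kW")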

  19. Virtual parameter-estimation experiments in Bioprocess-Engineering education

    NARCIS (Netherlands)

    Sessink, O.D.T.; Beeftink, H.H.; Hartog, R.J.M.; Tramper, J.

    2006-01-01

    Cell growth kinetics and reactor concepts constitute essential knowledge for Bioprocess-Engineering students. Traditional learning of these concepts is supported by lectures, tutorials, and practicals: ICT offers opportunities for improvement. A virtual-experiment environment was developed that

  20. Application of Hydrodynamic Cavitation for Food and Bioprocessing

    Science.gov (United States)

    Gogate, Parag R.

    Hydrodynamic cavitation can be simply generated by the alterations in the flow field in high speed/high pressure devices and also by passage of the liquid through a constriction such as orifice plate, venturi, or throttling valve. Hydrodynamic cavitation results in the formation of local hot spots, release of highly reactive free radicals, and enhanced mass transfer rates due to turbulence generated as a result of liquid circulation currents. These conditions can be suitably applied for intensification of different bioprocessing applications in an energy-efficient manner as compared to conventionally used ultrasound-based reactors. The current chapter aims at highlighting different aspects related to hydrodynamic cavitation, including the theoretical aspects for optimization of operating parameters, reactor designs, and overview of applications relevant to food and bioprocessing. Some case studies highlighting the comparison of hydrodynamic cavitation and acoustic cavitation reactors will also be discussed.

  1. Multiple reaction monitoring targeted LC-MS analysis of potential cell death marker proteins for increased bioprocess control.

    Science.gov (United States)

    Albrecht, Simone; Kaisermayer, Christian; Reinhart, David; Ambrose, Monica; Kunert, Renate; Lindeberg, Anna; Bones, Jonathan

    2018-05-01

    The monitoring of protein biomarkers for the early prediction of cell stress and death is a valuable tool for process characterization and efficient biomanufacturing control. A representative set of six proteins, namely GPDH, PRDX1, LGALS1, CFL1, TAGLN2 and MDH, which were identified in a previous CHO-K1 cell death model using discovery LC-MSE, was translated into a targeted liquid chromatography multiple reaction monitoring mass spectrometry (LC-MRM-MS) platform and verified. The universality of the markers was confirmed in a cell growth model for which three Chinese hamster ovary host cell lines (CHO-K1, CHO-S, CHO-DG44) were grown in batch culture in two different types of basal media. LC-MRM-MS was also applied to spent media (n = 39) from four perfusion biomanufacturing series. Stable isotope-labelled peptide analogues and a stable isotope-labelled monoclonal antibody were used for improved protein quantitation and simultaneous monitoring of the workflow reproducibility. Significant increases in protein concentrations were observed for all viability marker proteins upon increased dead cell numbers and allowed for discrimination of spent media with dead cell densities below and above 1 × 10⁶ dead cells/mL which highlights the potential of the selected viability marker proteins in bioprocess control. Graphical abstract Overview of the LC-MRM-MS workflow for the determination of proteomic markers in conditioned media from the bioreactor that correlate with CHO cell death.
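
    A hedged sketch of the stable-isotope-dilution calculation implied by the workflow above: the endogenous ("light") peptide is quantified against the spiked, isotope-labelled ("heavy") standard; the peak areas and spike level are invented:

      # Isotope-dilution quantitation from MRM peak areas (hypothetical numbers).
      def light_concentration(light_area, heavy_area, heavy_spike_fmol_per_ml):
          """Endogenous peptide concentration from the light/heavy area ratio."""
          return (light_area / heavy_area) * heavy_spike_fmol_per_ml

      # One MRM transition per peptide; in practice several transitions would be combined
      samples = {
          "day 3 spent medium": {"light": 4.2e5, "heavy": 8.1e5},
          "day 7 spent medium": {"light": 2.6e6, "heavy": 7.9e5},
      }
      SPIKE = 50.0  # fmol/mL of heavy standard added to every sample (assumed)

      for name, areas in samples.items():
          conc = light_concentration(areas["light"], areas["heavy"], SPIKE)
          print(f"{name}: {conc:.1f} fmol/mL")  # rising marker levels point to increasing cell death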

  2. Soft sensors in bioprocessing: A status report and recommendations

    DEFF Research Database (Denmark)

    Luttmann, Reiner; Bracewell, Daniel G.; Cornelissen, Gesine

    2012-01-01

    The following report with recommendations is the result of an expert panel meeting on soft sensor applications in bioprocess engineering that was organized by the Measurement, Monitoring, Modelling and Control (M3C) Working Group of the European Federation of Biotechnology - Section of Biochemical...... Engineering Science (ESBES). The aim of the panel was to provide an update on the present status of the subject and to identify critical needs and issues for the furthering of the successful development of soft sensor methods in bioprocess engineering research and for industrial applications, in particular...... with focus on biopharmaceutical applications. It concludes with a set of recommendations, which highlight current prospects for the extended use of soft sensors and those areas requiring development....

  3. White paper on continuous bioprocessing. May 20-21, 2014 Continuous Manufacturing Symposium.

    Science.gov (United States)

    Konstantinov, Konstantin B; Cooney, Charles L

    2015-03-01

    There is a growing interest in realizing the benefits of continuous processing in biologics manufacturing, which is reflected by the significant number of industrial and academic researchers who are actively involved in the development of continuous bioprocessing systems. These efforts are further encouraged by guidance expressed in recent US FDA conference presentations. The advantages of continuous manufacturing include sustained operation with consistent product quality, reduced equipment size, high volumetric productivity, streamlined process flow, short process cycle times, and reduced capital and operating costs. This technology, however, poses challenges, which need to be addressed before routine implementation is considered. This paper, which is based on the available literature and input from a large number of reviewers, is intended to provide a consensus of the opportunities, technical needs, and strategic directions for continuous bioprocessing. The discussion is supported by several examples illustrating various architectures of continuous bioprocessing systems. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  4. Efficient and reproducible mammalian cell bioprocesses without probes and controllers?

    Science.gov (United States)

    Tissot, Stéphanie; Oberbek, Agata; Reclari, Martino; Dreyer, Matthieu; Hacker, David L; Baldi, Lucia; Farhat, Mohamed; Wurm, Florian M

    2011-07-01

    Bioprocesses for recombinant protein production with mammalian cells are typically controlled for several physicochemical parameters including the pH and dissolved oxygen concentration (DO) of the culture medium. Here we studied whether these controls are necessary for efficient and reproducible bioprocesses in an orbitally shaken bioreactor (OSR). Mixing, gas transfer, and volumetric power consumption (PV) were determined in both a 5-L OSR and a 3-L stirred-tank bioreactor (STR). The two cultivation systems had a similar mixing intensity, but the STR had a lower volumetric mass transfer coefficient of oxygen (kLa) and a higher PV than the OSR. Recombinant CHO cell lines expressing either tumor necrosis factor receptor as an Fc fusion protein (TNFR:Fc) or an anti-RhesusD monoclonal antibody were cultivated in the two systems. The 5-L OSR was operated in an incubator shaker with 5% CO2 in the gas environment but without pH and DO control whereas the STR was operated with or without pH and DO control. Higher cell densities and recombinant protein titers were obtained in the OSR as compared to both the controlled and the non-controlled STRs. To test the reproducibility of a bioprocess in a non-controlled OSR, the two CHO cell lines were each cultivated in parallel in six 5-L OSRs. Similar cell densities, cell viabilities, and recombinant protein titers along with similar pH and DO profiles were achieved in each group of replicates. Our study demonstrated that bioprocesses can be performed in OSRs without pH or DO control in a highly reproducible manner, at least at the scale of operation studied here. Copyright © 2011 Elsevier B.V. All rights reserved.
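
    Because the comparison above hinges on kLa, a small sketch of the dynamic gassing-out method commonly used to estimate it from dissolved oxygen data may be helpful; the DO trace below is simulated, and the method is a standard one rather than the specific procedure reported in the paper:

      # Dynamic gassing-out estimate of kLa: after nitrogen stripping, re-oxygenation follows
      # C(t) = C* (1 - exp(-kLa t)), so kLa is the slope of -ln(1 - C/C*) versus time.
      import numpy as np

      C_STAR = 100.0                   # % air saturation
      TRUE_KLA = 12.0 / 3600.0         # 12 1/h expressed in 1/s (invented value)

      t = np.arange(0.0, 600.0, 10.0)  # s
      noise = np.random.default_rng(1).normal(0.0, 0.5, t.size)
      do_signal = np.clip(C_STAR * (1.0 - np.exp(-TRUE_KLA * t)) + noise, 0.0, 99.5)

      y = -np.log(1.0 - do_signal / C_STAR)                # linearised form
      kla_per_s = np.polyfit(t, y, 1)[0]                   # slope = kLa
      print(f"Estimated kLa: {kla_per_s * 3600:.1f} 1/h")  # should be close to 12 1/h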

  5. Bioprocess intensification for the effective production of chemical products

    DEFF Research Database (Denmark)

    Woodley, John

    2017-01-01

    The further implementation of new bioprocesses, using biocatalysts in various formats, for the synthesis of chemicals is highly dependent upon effective process intensification. The need for process intensification reflects the fact that the conditions under which a biocatalyst carries out...... a reaction in nature are far from those which are optimal for industrial processes. In this paper the rationale for intensification will be discussed, as well as the four complementary approaches used today to achieve bioprocess intensification. Two of these four approaches are based on alteration...... of the biocatalyst (either by protein engineering or metabolic engineering), resulting in an extra degree of freedom in the process design. To date, biocatalyst engineering has been developed independently from the conventional process engineering methodology to intensification. Although the integration of these two...

  6. Bioprocesses: Modelling needs for process evaluation and sustainability assessment

    DEFF Research Database (Denmark)

    Jiménez-Gonzaléz, Concepcion; Woodley, John

    2010-01-01

    development such that they can also be used to evaluate processes against sustainability metrics, as well as economics as an integral part of assessments. Finally, property models will also be required based on compounds not currently present in existing databases. It is clear that many new opportunities......The next generation of process engineers will face a new set of challenges, with the need to devise new bioprocesses, with high selectivity for pharmaceutical manufacture, and for lower value chemicals manufacture based on renewable feedstocks. In this paper the current and predicted future roles...... of process system engineering and life cycle inventory and assessment in the design, development and improvement of sustainable bioprocesses are explored. The existing process systems engineering software tools will prove essential to assist this work. However, the existing tools will also require further...

  7. Membrane Bioprocesses for Pharmaceutical Micropollutant Removal from Waters

    Directory of Open Access Journals (Sweden)

    Matthias de Cazes

    2014-10-01

    Full Text Available The purpose of this review work is to give an overview of the research reported on bioprocesses for the treatment of domestic or industrial wastewaters (WW) containing pharmaceuticals. Conventional WW treatment technologies are not efficient enough to completely remove all pharmaceuticals from water. Indeed, these compounds are becoming a genuine public health problem, because they are increasingly present in groundwater and even in potable waters. Different types of bioprocesses are described in this work: from classical activated sludge systems, which allow the depletion of pharmaceuticals by bio-degradation and adsorption, to enzymatic reactions, which are more focused on the treatment of WW containing a relatively high content of pharmaceuticals and less organic carbon pollution than classical WW. Different aspects concerning the advantages of membrane bioreactors for pharmaceuticals removal are discussed, as well as the more recent studies on enzymatic membrane reactors for the depletion of these recalcitrant compounds.

  8. Synthesis and characterization of robust magnetic carriers for bioprocess applications

    Energy Technology Data Exchange (ETDEWEB)

    Kopp, Willian, E-mail: willkopp@gmail.com [Federal University of São Carlos-UFSCar, Graduate Program in Chemical Engineering, Rodovia Washington Luiz, km 235, São Carlos, São Paulo 13565-905 (Brazil); Silva, Felipe A., E-mail: eq.felipe.silva@gmail.com [Federal University of São Carlos-UFSCar, Graduate Program in Chemical Engineering, Rodovia Washington Luiz, km 235, São Carlos, São Paulo 13565-905 (Brazil); Lima, Lionete N., E-mail: lionetenunes@yahoo.com.br [Federal University of São Carlos-UFSCar, Graduate Program in Chemical Engineering, Rodovia Washington Luiz, km 235, São Carlos, São Paulo 13565-905 (Brazil); Masunaga, Sueli H., E-mail: sueli.masunaga@gmail.com [Department of Physics, Montana State University-MSU, 173840, Bozeman, MT 59717-3840 (United States); Tardioli, Paulo W., E-mail: pwtardioli@ufscar.br [Department of Chemical Engineering, Federal University of São Carlos-UFSCar, Rodovia Washington Luiz, km 235, São Carlos, São Paulo 13565-905 (Brazil); Giordano, Roberto C., E-mail: roberto@ufscar.br [Department of Chemical Engineering, Federal University of São Carlos-UFSCar, Rodovia Washington Luiz, km 235, São Carlos, São Paulo 13565-905 (Brazil); Araújo-Moreira, Fernando M., E-mail: faraujo@df.ufscar.br [Department of Physics, Federal University of São Carlos-UFSCar, Rodovia Washington Luiz, km 235, São Carlos, São Paulo 13565-905 (Brazil); and others

    2015-03-15

    Highlights: • Silica magnetic microparticles were synthesized for applications in bioprocesses. • The process to produce magnetic microparticles is inexpensive and easily scalable. • Microparticles with very high saturation magnetization were obtained. • The structure of the silica magnetic microparticles could be controlled. - Abstract: Magnetic carriers are an effective option to withdraw selected target molecules from complex mixtures or to immobilize enzymes. This paper describes the synthesis of robust silica magnetic microparticles (SMMps), particularly designed for applications in bioprocesses. SMMps were synthesized in a micro-emulsion, using sodium silicate as the silica source and superparamagnetic iron oxide nanoparticles as the magnetic core. Thermally resistant particles, with high and accessible surface area, narrow particle size distribution, high saturation magnetization, and with superparamagnetic properties were obtained. Several reaction conditions were tested, yielding materials with saturation magnetization between 45 and 63 emu g⁻¹, particle size between 2 and 200 μm and average diameter between 11.2 and 15.9 μm, surface area between 49 and 103 m² g⁻¹ and pore diameter between 2 and 60 nm. The performance of SMMps in a bioprocess was evaluated by the immobilization of Pseudomonas fluorescens lipase onto octyl-modified SMMp; the biocatalyst obtained was used in the production of butyl butyrate with good results.

  9. Design of digital learning material for bioprocess-engineering-education

    NARCIS (Netherlands)

    Schaaf, van der H.

    2007-01-01

    With the advance of computers and the internet, new types of learning material can be developed: web-based digital learning material. Because many complex learning objectives in the food- and bioprocess technology domain are difficult to achieve in a traditional learning environment, a project was

  10. Therapeutic antibodies: market considerations, disease targets and bioprocessing.

    Science.gov (United States)

    Elvin, John G; Couston, Ruairidh G; van der Walle, Christopher F

    2013-01-02

    Antibodies are well established in mainstream clinical practice and present an exciting area for collaborative research and development in industry and academia alike. In this review, we will provide an overview of the current market and an outlook to 2015, focussing on whole antibody molecules while acknowledging the next generation scaffolds containing variable fragments. The market will be discussed in the context of disease targets, particularly in the areas of oncology and immune disorders which generate the greatest revenue by a wide margin. Emerging targets include central nervous system disorders which will also stimulate new delivery strategies. It is becoming increasingly apparent that a better understanding of bioprocessing is required in order to optimize the steps involved in the preparation of a protein prior to formulation. The latter is outside the scope of this review and nor is it our intention to discuss protein delivery and pharmacokinetics. The challenges that lie ahead include the discovery of new disease targets and the development of robust bioprocessing operations. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.

  11. Quality by control: Towards model predictive control of mammalian cell culture bioprocesses.

    Science.gov (United States)

    Sommeregger, Wolfgang; Sissolak, Bernhard; Kandra, Kulwant; von Stosch, Moritz; Mayer, Martin; Striedner, Gerald

    2017-07-01

    The industrial production of complex biopharmaceuticals using recombinant mammalian cell lines is still mainly built on a quality by testing approach, which is represented by fixed process conditions and extensive testing of the end-product. In 2004 the FDA launched the process analytical technology initiative, aiming to guide the industry towards advanced process monitoring and better understanding of how critical process parameters affect the critical quality attributes. Implementation of process analytical technology into the bio-production process enables moving from the quality by testing to a more flexible quality by design approach. The application of advanced sensor systems in combination with mathematical modelling techniques offers enhanced process understanding, allows on-line prediction of critical quality attributes and subsequently real-time product quality control. In this review opportunities and unsolved issues on the road to a successful quality by design and dynamic control implementation are discussed. A major focus is directed on the preconditions for the application of model predictive control for mammalian cell culture bioprocesses. Design of experiments providing information about the process dynamics upon parameter change, dynamic process models, on-line process state predictions and powerful software environments seem to be a prerequisite for quality by control realization. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
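
    A deliberately simplified sketch of the receding-horizon (model predictive) control loop discussed above, applied to the feed rate of a fed-batch culture; the Monod-type model, its parameters and the bounds are all invented, and a real implementation would rest on a validated dynamic model and on-line state estimates:

      # Toy MPC of a fed-batch feed rate: at each hour, optimise a short feed-rate horizon
      # against a simple substrate set-point objective and apply only the first move.
      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.optimize import minimize

      MU_MAX, KS, YXS, S_FEED = 0.04, 0.5, 0.6, 100.0      # 1/h, g/L, g/g, g/L (assumed)

      def fed_batch_rhs(t, y, feed_l_h):
          X, S, V = y                                      # biomass g/L, substrate g/L, volume L
          mu = MU_MAX * S / (KS + S)
          dX = mu * X - feed_l_h / V * X
          dS = -mu * X / YXS + feed_l_h / V * (S_FEED - S)
          return [dX, dS, feed_l_h]

      def step(y, feed_l_h, dt=1.0):
          """Integrate one interval at a constant feed rate."""
          return solve_ivp(fed_batch_rhs, (0.0, dt), y, args=(feed_l_h,), rtol=1e-6).y[:, -1]

      def mpc_move(state, horizon=4, setpoint_s=1.0):
          def cost(feeds):
              y, penalty = np.array(state, dtype=float), 0.0
              for f in feeds:
                  y = step(y, f)
                  penalty += (y[1] - setpoint_s) ** 2 + 1e-3 * f ** 2
              return penalty
          res = minimize(cost, x0=np.full(horizon, 0.01),
                         bounds=[(0.0, 0.1)] * horizon, method="L-BFGS-B")
          return res.x[0]                                  # receding horizon: apply the first move only

      state = [0.3, 2.0, 1.0]                              # initial X, S, V (invented)
      for hour in range(12):
          feed = mpc_move(state)
          state = step(state, feed)
          print(f"h={hour + 1:2d}  feed={feed:.3f} L/h  S={state[1]:.2f} g/L  X={state[0]:.2f} g/L")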

  12. Application of agent-based system for bioprocess description and process improvement.

    Science.gov (United States)

    Gao, Ying; Kipling, Katie; Glassey, Jarka; Willis, Mark; Montague, Gary; Zhou, Yuhong; Titchener-Hooker, Nigel J

    2010-01-01

    Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making either to maintain process consistency or to identify optimal operating conditions. To predict the whole bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze the bioprocesses based on a whole process understanding and considering the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases leading to the capacity for fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components co-operate with each other in performing their tasks. These include the description of the whole process behavior, evaluating process operating conditions, monitoring of the operating processes, predicting critical process performance, and providing guidance to decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, the system can be applied to identify abnormal process operation events and then to provide suggestions as to how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. The implementation of the agent-based approach is illustrated via selected application scenarios, which

  13. Systematic Development of Miniaturized (Bio)Processes using Process Systems Engineering (PSE) Methods and Tools

    DEFF Research Database (Denmark)

    Krühne, Ulrich; Larsson, Hilde; Heintz, Søren

    2014-01-01

    The focus of this work is on process systems engineering (PSE) methods and tools, and especially on how such PSE methods and tools can be used to accelerate and support systematic bioprocess development at a miniature scale. After a short presentation of the PSE methods and the bioprocess...... development drivers, three case studies are presented. In the first example it is demonstrated how experimental investigations of the bi-enzymatic production of lactobionic acid can be modeled with help of a new mechanistic mathematical model. The reaction was performed at lab scale and the prediction quality...

  14. Monoliths in Bioprocess Technology

    Directory of Open Access Journals (Sweden)

    Vignesh Rajamanickam

    2015-04-01

    Full Text Available Monolithic columns are a special type of chromatography column, which can be used for the purification of different biomolecules. They have become popular due to their high mass transfer properties and short purification times. Several articles have already discussed monolith manufacturing, as well as monolith characteristics. In contrast, this review focuses on the applied aspect of monoliths and discusses the most relevant biomolecules that can be successfully purified by them. We describe success stories for viruses, nucleic acids and proteins and compare them to conventional purification methods. Furthermore, the advantages of monolithic columns over particle-based resins, as well as the limitations of monoliths are discussed. With a compilation of commercially available monolithic columns, this review aims at serving as a ‘yellow pages’ for bioprocess engineers who face the challenge of purifying a certain biomolecule using monoliths.

  15. Production-process optimization algorithm: Application to fed-batch bioprocess

    Czech Academy of Sciences Publication Activity Database

    Pčolka, M.; Čelikovský, Sergej

    2017-01-01

    Roč. 354, č. 18 (2017), s. 8529-8551 ISSN 0016-0032 R&D Projects: GA ČR(CZ) GA17-04682S Institutional support: RVO:67985556 Keywords : Optimal control * Bioprocess * Optimization Subject RIV: BC - Control Systems Theory OBOR OECD: Automation and control systems Impact factor: 3.139, year: 2016 https://doi.org/10.1016/j.jfranklin.2017.10.012

  16. Experimental design and multiple response optimization. Using the desirability function in analytical methods development.

    Science.gov (United States)

    Candioti, Luciana Vera; De Zan, María M; Cámara, María S; Goicoechea, Héctor C

    2014-06-01

    A review of the application of response surface methodology (RSM) when several responses have to be simultaneously optimized in the field of analytical methods development is presented. Several critical issues like response transformation, multiple response optimization and modeling with least squares and artificial neural networks are discussed. Most recent analytical applications are presented in the context of analytical methods development, especially in multiple response optimization procedures using the desirability function. Copyright © 2014 Elsevier B.V. All rights reserved.
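
    A short sketch of the Derringer-Suich desirability functions referred to above, applied to an invented two-response example (these are the standard textbook definitions, not code from the review):

      # Individual desirabilities are combined into an overall desirability D by a
      # geometric mean; the response values and acceptance limits below are invented.
      import numpy as np

      def d_maximize(y, low, high, weight=1.0):
          """Desirability of a response to be maximized (0 below `low`, 1 above `high`)."""
          return float(np.clip((y - low) / (high - low), 0.0, 1.0) ** weight)

      def d_minimize(y, low, high, weight=1.0):
          """Desirability of a response to be minimized (1 below `low`, 0 above `high`)."""
          return float(np.clip((high - y) / (high - low), 0.0, 1.0) ** weight)

      def overall_desirability(ds):
          ds = np.asarray(ds, dtype=float)
          return float(np.prod(ds) ** (1.0 / ds.size))

      # Hypothetical chromatographic method: maximize resolution, minimize run time
      resolution, run_time = 1.8, 14.0
      D = overall_desirability([
          d_maximize(resolution, low=1.5, high=2.5),
          d_minimize(run_time, low=8.0, high=20.0),
      ])
      print(f"Overall desirability D = {D:.2f}")  # D approaches 1 only when every response is acceptable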

  17. Bioprocessing of ores: Application to space resources

    Science.gov (United States)

    Johansson, Karl R.

    1992-01-01

    The role of microorganisms in the oxidation and leaching of various ores (especially those of copper, iron, and uranium) is well known. This role is increasingly being applied by the mining, metallurgy, and sewage industries in the bioconcentration of metal ions from natural receiving waters and from waste waters. It is concluded that bioprocessing using bacteria in closed reactors may be a viable option for the recovery of metals from the lunar regolith. Obviously, considerable research must be done to define the process, specify the appropriate bacteria, determine the necessary conditions and limitations, and evaluate the overall feasibility.

  18. Bioprocessing of wheat bran in whole wheat bread increases the bioavailability of phenolic acids in men and exerts antiinflammatory effects ex vivo.

    Science.gov (United States)

    Mateo Anson, Nuria; Aura, Anna-Marja; Selinheimo, Emilia; Mattila, Ismo; Poutanen, Kaisa; van den Berg, Robin; Havenaar, Robert; Bast, Aalt; Haenen, Guido R M M

    2011-01-01

    Whole grain consumption has been linked to a lower risk of metabolic syndrome, which is normally associated with a low-grade chronic inflammation. The benefits of whole grain are in part related to the inclusion of the bran, rich in phenolic acids and fiber. However, the phenols are poorly bioaccessible from the cereal matrix. The aim of the present study was to investigate the effect of bioprocessing of the bran in whole wheat bread on the bioavailability of phenolic acids, the postprandial plasma antioxidant capacity, and ex vivo antiinflammatory properties. After consumption of a low phenolic acid diet for 3 d and overnight fasting, 8 healthy men consumed 300 g of whole wheat bread containing native bran (control bread) or bioprocessed bran (bioprocessed bread) in a cross-over design. Urine and blood samples were collected for 24 h to analyze the phenolic acids and metabolites. Trolox equivalent antioxidant capacity was measured in plasma. Cytokines were measured in blood after ex vivo stimulation with LPS. The bioavailabilities of ferulic acid, vanillic acid, sinapic acid, and 3,4-dimethoxybenzoic acid from the bioprocessed bread were 2- to 3-fold those from the control bread. Phenylpropionic acid and 3-hydroxyphenylpropionic acid were the main colonic metabolites of the nonbioaccessible phenols. The ratios of pro-:antiinflammatory cytokines were significantly lower in LPS-stimulated blood after the consumption of the bioprocessed bread. In conclusion, bioprocessing can remarkably increase the bioavailability of phenolic acids and their circulating metabolites, compounds which have immunomodulatory effects ex vivo.

  19. A one-step bioprocess for production of high-content fructo-oligosaccharides from inulin by yeast.

    Science.gov (United States)

    Wang, Da; Li, Fu-Li; Wang, Shi-An

    2016-10-20

    Commercial fructo-oligosaccharides (FOS) are predominantly produced from sucrose by a transfructosylation process that presents a maximum theoretical yield below 0.60 g FOS g sucrose⁻¹. To obtain high-content FOS, costly purification is generally employed. Additionally, high-content FOS can be produced from inulin by using endo-inulinases. However, commercial endo-inulinases have not been extensively used in scale-up production of FOS. In the present study, a one-step bioprocess that integrated endo-inulinase production, FOS fermentation, and non-FOS sugar removal into one reactor was proposed to produce high-content FOS from inulin. The bioprocess was implemented by a recombinant yeast strain JZHΔS-TSC, in which a heterologous endo-inulinase gene was expressed and the inherent invertase gene SUC2 was disrupted. FOS fermentation at 40 °C from 200 g/L chicory inulin presented a maximum titer, yield, and productivity of 180.2 ± 0.8 g/L, 0.9 g FOS g inulin⁻¹, and 7.51 ± 0.03 g/L/h, respectively. This study demonstrated that the one-step bioprocess was simple and highly efficient. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Hybrid elementary flux analysis/nonparametric modeling: application for bioprocess control

    Directory of Open Access Journals (Sweden)

    Alves Paula M

    2007-01-01

    Full Text Available Abstract Background The progress in the "-omic" sciences has allowed a deeper knowledge of many biological systems with industrial interest. This knowledge is still rarely used for advanced bioprocess monitoring and control at the bioreactor level. In this work, a bioprocess control method is presented, which is designed on the basis of the metabolic network of the organism under consideration. The bioprocess dynamics are formulated using hybrid rigorous/data driven systems and their inherent structure is defined by the elementary modes of the metabolism. Results The metabolic network of the system under study is decomposed into elementary modes (EMs), which are the simplest paths able to operate coherently in steady-state. A reduced reaction mechanism in the form of simplified reactions connecting substrates with end-products is obtained. A dynamical hybrid system integrating material balance equations, EMs reactions stoichiometry and kinetics was formulated. EMs kinetics were defined as the product of two terms: a mechanistic/empirical known term and an unknown term that must be identified from data, from a process optimisation perspective. This approach allows the quantification of fluxes carried by individual elementary modes which is of great help to identify dominant pathways as a function of environmental conditions. The methodology was employed to analyse experimental data of recombinant Baby Hamster Kidney (BHK-21A) cultures producing a recombinant fusion glycoprotein. The identified EMs kinetics demonstrated typical glucose and glutamine metabolic responses during cell growth and IgG1-IL2 synthesis. Finally, an online optimisation study was conducted in which the optimal feeding strategies of glucose and glutamine were calculated after re-estimation of model parameters at each sampling time. An improvement in the final product concentration was obtained as a result of this online optimisation. Conclusion The main contribution of this work is a
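
    A schematic sketch of the hybrid structure described above: material balances whose rate vector is assembled from fluxes carried by elementary modes, each flux being a known kinetic term multiplied by an empirical correction. The toy stoichiometry, kinetics and corrections are invented, not those identified for the BHK-21A cultures:

      # Hybrid "EM stoichiometry x (mechanistic kinetics x empirical correction)" model.
      import numpy as np
      from scipy.integrate import solve_ivp

      # Columns = elementary modes, rows = [glucose, glutamine, biomass, product] (toy network)
      EM_STOICHIOMETRY = np.array([
          [-1.0, -0.2],
          [-0.1, -1.0],
          [ 0.4,  0.1],
          [ 0.0,  0.3],
      ])

      def em_fluxes(c):
          glc, gln, X, _ = c
          mechanistic = np.array([glc / (0.5 + glc), gln / (0.3 + gln)]) * X   # Monod-type known terms
          empirical_correction = np.array([0.05, 0.02])   # would be identified from data in the paper's framework
          return mechanistic * empirical_correction       # flux carried by each EM (per hour)

      def balances(t, c):
          return EM_STOICHIOMETRY @ em_fluxes(c)

      sol = solve_ivp(balances, (0.0, 120.0), [20.0, 4.0, 0.3, 0.0], rtol=1e-6,
                      t_eval=np.linspace(0.0, 120.0, 7))
      for t, (glc, gln, X, prod) in zip(sol.t, sol.y.T):
          print(f"t={t:5.1f} h  glc={glc:5.2f}  gln={gln:4.2f}  X={X:4.2f}  product={prod:4.2f}")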

  1. Bioprocessing of lignite coals using reductive microorganisms

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, D.L.

    1992-03-29

    In order to convert lignite coals into liquid fuels, gases or chemical feedstock, the macromolecular structure of the coal must be broken down into low molecular weight fractions prior to further modification. Our research focused on this aspect of coal bioprocessing. We isolated, characterized and studied the lignite coal-depolymerizing organisms Streptomyces viridosporus T7A, Pseudomonas sp. DLC-62, unidentified bacterial strain DLC-BB2 and Gram-positive Bacillus megaterium strain DLC-21. In this research we showed that these bacteria are able to solubilize and depolymerize lignite coals using a combination of biological mechanisms including the excretion of coal-solubilizing basic chemical metabolites and extracellular coal-depolymerizing enzymes.

  2. Analytic Methods for Evaluating Patterns of Multiple Congenital Anomalies in Birth Defect Registries.

    Science.gov (United States)

    Agopian, A J; Evans, Jane A; Lupo, Philip J

    2018-01-15

    It is estimated that 20 to 30% of infants with birth defects have two or more birth defects. Among these infants with multiple congenital anomalies (MCA), co-occurring anomalies may represent either chance (i.e., unrelated etiologies) or pathogenically associated patterns of anomalies. While some MCA patterns have been recognized and described (e.g., known syndromes), others have not been identified or characterized. Elucidating these patterns may result in a better understanding of the etiologies of these MCAs. This article reviews the literature with regard to analytic methods that have been used to evaluate patterns of MCAs, in particular those using birth defect registry data. A popular method for MCA assessment involves a comparison of the observed to expected ratio for a given combination of MCAs, or one of several modified versions of this comparison. Other methods include use of numerical taxonomy or other clustering techniques, multiple regression analysis, and log-linear analysis. Advantages and disadvantages of these approaches, as well as specific applications, were outlined. Despite the availability of multiple analytic approaches, relatively few MCA combinations have been assessed. The availability of large birth defects registries and computing resources that allow for automated, big data strategies for prioritizing MCA patterns may provide for new avenues for better understanding co-occurrence of birth defects. Thus, the selection of an analytic approach may depend on several considerations. Birth Defects Research 110:5-11, 2018. © 2017 Wiley Periodicals, Inc.
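
    A hedged sketch of the observed-to-expected ratio comparison mentioned above, for a single pair of co-occurring defects; the registry counts are invented and a one-sided Poisson test is used for the excess:

      # O/E ratio for the co-occurrence of two defects, with the expected count computed
      # under independence. Real analyses adjust for ascertainment and multiple testing.
      from scipy.stats import poisson

      n_infants_with_defects = 50_000     # registry denominator (hypothetical)
      n_defect_a = 1_200                  # infants with defect A
      n_defect_b = 800                    # infants with defect B
      observed_ab = 45                    # infants with both A and B

      expected_ab = n_defect_a * n_defect_b / n_infants_with_defects
      oe_ratio = observed_ab / expected_ab
      p_value = poisson.sf(observed_ab - 1, expected_ab)   # P(X >= observed) under independence
      print(f"Expected = {expected_ab:.1f}, O/E = {oe_ratio:.2f}, one-sided p = {p_value:.2e}")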

  3. Design-for-Six-Sigma To Develop a Bioprocess Knowledge Management Framework.

    Science.gov (United States)

    Junker, Beth; Maheshwari, Gargi; Ranheim, Todd; Altaras, Nedim; Stankevicz, Michael; Harmon, Lori; Rios, Sandra; D'anjou, Marc

    2011-01-01

    Owing to the high costs associated with biopharmaceutical development, considerable pressure has developed for the biopharmaceutical industry to increase productivity by becoming more lean and flexible. The ability to reuse knowledge was identified as one key advantage to streamline productivity, efficiently use resources, and ultimately perform better than the competition. A knowledge management (KM) strategy was assembled for bioprocess-related information using the technique of Design-for-Six-Sigma (DFSS). This strategy supported quality-by-design and process validation efforts for pipeline as well as licensed products. The DFSS technique was selected because it was both streamlined and efficient. These characteristics permitted development of a KM strategy with minimized team leader and team member resources. DFSS also placed a high emphasis on the voice of the customer, information considered crucial to the selection of solutions most appropriate for the current knowledge-based challenges of the organization. The KM strategy developed was comprised of nine workstreams, constructed from related solution buckets which in turn were assembled from the individual solution tasks that were identified. Each workstream's detailed design was evaluated against published and established best practices, as well as the KM strategy project charter and design inputs. Gaps and risks were identified and mitigated as necessary to improve the robustness of the proposed strategy. Aggregated resources (specifically expense/capital funds and staff) and timing were estimated to obtain vital management sponsorship for implementation. Where possible, existing governance and divisional/corporate information technology efforts were leveraged to minimize the additional bioprocess resources required for implementation. Finally, leading and lagging indicator metrics were selected to track the success of pilots and eventual implementation. A knowledge management framework was assembled for

  4. Mannheimia haemolytica growth and leukotoxin production for vaccine manufacturing — A bioprocess review

    Directory of Open Access Journals (Sweden)

    Tobias Oppermann

    2017-07-01

    Full Text Available Mannheimia haemolytica leukotoxin (LKT) is a known cause of bovine respiratory disease (BRD) which results in severe economic losses in the cattle industry (up to USD 1 billion per year in the USA). Vaccines based on LKT offer the most promising measure to contain BRD outbreaks and are already commercially available. However, insufficient LKT yields, predominantly reflecting a lack of knowledge about the LKT expression process, remain a significant engineering problem and further bioprocess optimization is required to increase process efficiency. Most previous investigations have focused on LKT activity and cell growth, but neither of these parameters defines reliable criteria for the improvement of LKT yields. In this article, we review the most important process conditions and operational parameters (temperature, pH, substrate concentration, dissolved oxygen level, medium composition and the presence of metabolites) from a bioprocess engineering perspective, in order to maximize LKT yields.

  5. To Stretch the Boundary of Secondary Metabolite Production in Plant Cell-Based Bioprocessing: Anthocyanin as a Case Study

    Directory of Open Access Journals (Sweden)

    Wei Zhang

    2004-01-01

    Full Text Available Plant cells and tissue cultures hold great promise for controlled production of a myriad of useful secondary metabolites on demand. The current yield and productivity cannot fulfill the commercial goal of a plant cell-based bioprocess for the production of most secondary metabolites. In order to stretch the boundary, recent advances, new directions and opportunities in plant cell-based bioprocessing have been critically examined for the 10 years from 1992 to 2002. A review of the literature indicated that most of the R&D work was devoted predominantly to studies at an empirical level. A rational approach to molecular plant cell bioprocessing based on the fundamental understanding of metabolic pathways and their regulation is urgently required to stimulate further advances; however, the strategies and technical framework are still being developed. It is the aim of this review to take a step forward in framing workable strategies and technologies for molecular plant cell-based bioprocessing. Using anthocyanin biosynthesis as a case study, an integrated postgenomic approach has been proposed. This combines the functional analysis of metabolic pathways for biosynthesis of a particular metabolite from profiling of gene expression and protein expression to metabolic profiling. A global correlation can thus not only be established at the three molecular levels, but emphasis is also placed on the interactions between primary metabolism and secondary metabolism; between competing and/or complementary pathways; and between biosynthetic and post-biosynthetic events.

  6. ADAPTIVE HIGH GAIN OBSERVER EXTENSION AND ITS APPLICATION TO BIOPROCESS MONITORING

    Czech Academy of Sciences Publication Activity Database

    Čelikovský, Sergej; Torres-Munoz, J. A.; Dominguez-Bocanegra, A. R.

    2018-01-01

    Roč. 54, č. 1 (2018), s. 155-174 ISSN 0023-5954 Institutional support: RVO:67985556 Keywords : Adaptive observers * nonlinear systems * bioprocess Subject RIV: BC - Control Systems Theory OBOR OECD: Computer sciences, information science, bioinformathics (hardware development to be 2.2, social aspect to be 5.8) Impact factor: 0.379, year: 2016 http://doi.org/10.14736/kyb-2018-1-0155

  7. Effect of Bioprocessing on the In Vitro Colonic Microbial Metabolism of Phenolic Acids from Rye Bran Fortified Breads

    DEFF Research Database (Denmark)

    Koistinen, Ville M; Nordlund, Emilia; Katina, Kati

    2017-01-01

    in an in vitro colon model, the metabolites were analyzed using two different methods applying mass spectrometry. While phenolic acids were released more extensively from the bioprocessed bran bread and ferulic acid had consistently higher concentrations in the bread type during fermentation, there were only......Cereal bran is an important source of dietary fiber and bioactive compounds, such as phenolic acids. We aimed to study the phenolic acid metabolism of native and bioprocessed rye bran fortified refined wheat bread and to elucidate the microbial metabolic route of phenolic acids. After incubation...

  8. Development of a new bioprocess scheme using frozen seed train intermediates to initiate CHO cell culture manufacturing campaigns.

    Science.gov (United States)

    Seth, Gargi; Hamilton, Robert W; Stapp, Thomas R; Zheng, Lisa; Meier, Angela; Petty, Krista; Leung, Stephenie; Chary, Srikanth

    2013-05-01

    Agility to schedule and execute cell culture manufacturing campaigns quickly in a multi-product facility will play a key role in meeting the growing demand for therapeutic proteins. In an effort to shorten campaign timelines and maximize plant flexibility and resource utilization, we investigated the initiation of cell culture manufacturing campaigns using CHO cells cryopreserved in large volume bags in place of the seed train process flows that are conventionally used in cell culture manufacturing. This approach, termed FASTEC (Frozen Accelerated Seed Train for Execution of a Campaign), involves cultivating cells to high density in a perfusion bioreactor, and cryopreserving cells in multiple disposable bags. Each run for a manufacturing campaign would then come from a thaw of one or more of these cryopreserved bags. This article reviews the development and optimization of individual steps of the FASTEC bioprocess scheme: scaling up cells to greater than 70 × 10⁶ cells/mL and freezing in bags with an optimized controlled-rate freezing protocol and a customized rack configuration. Flow cytometry analysis was also employed to understand the recovery of CHO cells following cryopreservation. Extensive development data were gathered to ensure that the quantity and quality of the drug manufactured using the FASTEC bioprocess scheme was acceptable compared to the conventional seed train process flow. Offering comparable manufacturing options provides flexibility to the cell culture manufacturing network. Copyright © 2012 Wiley Periodicals, Inc.

  9. Teaching bioprocess engineering to undergraduates: Multidisciplinary hands-on training in a one-week practical course.

    Science.gov (United States)

    Henkel, Marius; Zwick, Michaela; Beuker, Janina; Willenbacher, Judit; Baumann, Sandra; Oswald, Florian; Neumann, Anke; Siemann-Herzberg, Martin; Syldatk, Christoph; Hausmann, Rudolf

    2015-01-01

    Bioprocess engineering is a highly interdisciplinary field of study which is strongly benefited by practical courses where students can actively experience the interconnection between biology, engineering, and physical sciences. This work describes a lab course developed for 2nd year undergraduate students of bioprocess engineering and related disciplines, where students are challenged with a real-life bioprocess-engineering application, the production of recombinant protein in a fed-batch process. The lab course was designed to introduce students to the subject of operating and supervising an experiment in a bioreactor, along with the analysis of collected data and a final critical evaluation of the experiment. To provide visual feedback of the experimental outcome, the organism used during class was Escherichia coli which carried a plasmid to recombinantly produce enhanced green fluorescent protein (eGFP) upon induction. This can easily be visualized in both the bioreactor and samples by using ultraviolet light. The lab course is performed with bioreactors of the simplest design, and is therefore highly flexible, robust and easy to reproduce. As part of this work the implementation and framework, the results, the evaluation and assessment of student learning combined with opinion surveys are presented, which provides a basis for instructors intending to implement a similar lab course at their respective institution. © 2015 by the International Union of Biochemistry and Molecular Biology.

  10. Analytical characterization of high-level mixed wastes using multiple sample preparation treatments

    International Nuclear Information System (INIS)

    King, A.G.; Baldwin, D.L.; Urie, M.W.; McKinley, S.G.

    1994-01-01

    The Analytical Chemistry Laboratory at the Pacific Northwest Laboratory in Richland, Washington, is actively involved in performing analytical characterization of high-level mixed waste from Hanford's single shell and double shell tank characterization programs. A full suite of analyses is typically performed on homogenized tank core samples. These analytical techniques include inductively-coupled plasma-atomic emission spectroscopy, total organic carbon methods and radiochemistry methods, as well as many others, all requiring some type of remote sample-preparation treatment to solubilize the tank sludge material for analysis. Most of these analytical methods typically use a single sample-preparation treatment, inherently providing elemental information only. To better understand and interpret tank chemistry and assist in identifying chemical compounds, selected analytical methods are performed using multiple sample-preparation treatments. The sample preparation treatments used at Pacific Northwest Laboratory for this work with high-level mixed waste include caustic fusion, acid digestion, and water leach. The type of information available by comparing results from different sample-prep treatments includes evidence for the presence of refractory compounds, acid-soluble compounds, or water-soluble compounds. Problems unique to the analysis of Hanford tank wastes are discussed. Selected results from the Hanford single shell ferrocyanide tank, 241-C-109, are presented, and the resulting conclusions are discussed

  11. BIOPROCESS DEVELOPMENTS FOR CELLULASE PRODUCTION BY Aspergillus oryzae CULTIVATED UNDER SOLID-STATE FERMENTATION

    Directory of Open Access Journals (Sweden)

    R. D. P. B. Pirota

    Full Text Available Abstract Bioprocess development studies concerning the production of cellulases are of crucial importance due to the significant impact of these enzymes on the economics of biomass conversion into fuels and chemicals. This work evaluates the effects of solid-state fermentation (SSF) operational conditions on cellulase production by a novel strain of Aspergillus oryzae using an instrumented lab-scale bioreactor equipped with an on-line automated monitoring and control system. The use of SSF cultivation under controlled conditions substantially improved cellulase production. Highest production of FPase (0.40 IU g-1), endoglucanase (123.64 IU g-1), and β-glucosidase (18.32 IU g-1) was achieved at 28 °C, using an initial substrate moisture content of 70%, with an inlet air humidity of 80% and an airflow rate of 20 mL min-1. Further studies of kinetic profiles and respirometric analyses were performed. The results showed that these data could be very useful for bioprocess development of cellulase production and scale-up.

  12. Electromagnetic imaging of multiple-scattering small objects: non-iterative analytical approach

    International Nuclear Information System (INIS)

    Chen, X; Zhong, Y

    2008-01-01

    Multiple signal classification (MUSIC) imaging method and the least squares method are applied to solve the electromagnetic inverse scattering problem of determining the locations and polarization tensors of a collection of small objects embedded in a known background medium. Based on the analysis of induced electric and magnetic dipoles, the proposed MUSIC method is able to deal with some special scenarios, due to the shapes and materials of objects, to which the standard MUSIC doesn't apply. After the locations of objects are obtained, the nonlinear inverse problem of determining the polarization tensors of objects accounting for multiple scattering between objects is solved by a non-iterative analytical approach based on the least squares method
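
    A compact sketch of the standard MUSIC localization scheme for point scatterers under the Born approximation (not the paper's extended variant for the special shapes and materials it addresses); the array geometry, wavenumber and scatterer strengths are invented:

      # MUSIC from a multistatic response matrix: SVD, noise subspace, pseudospectrum peaks
      # at the scatterer locations. 2-D free-space Green's functions are used throughout.
      import numpy as np
      from scipy.special import hankel1

      K = 2.0 * np.pi                                      # wavenumber (wavelength = 1)
      array_x = np.linspace(-5.0, 5.0, 21)                 # 21 transceivers on the line y = 0
      array_pts = np.column_stack([array_x, np.zeros_like(array_x)])
      scatterers = np.array([[-1.2, 3.0], [1.5, 4.0]])     # true (unknown) positions

      def green(p, q):
          """2-D Helmholtz Green's function between points p and q."""
          r = np.linalg.norm(np.asarray(p) - np.asarray(q), axis=-1)
          return 0.25j * hankel1(0, K * r)

      # Multistatic response matrix under the Born approximation (two point scatterers)
      G = np.array([[green(a, s) for s in scatterers] for a in array_pts])   # 21 x 2
      response = G @ np.diag([1.0, 0.8]) @ G.T

      U, _, _ = np.linalg.svd(response)
      noise_subspace = U[:, 2:]                            # signal subspace rank = number of scatterers

      xs, ys = np.meshgrid(np.linspace(-3, 3, 61), np.linspace(1, 6, 51))
      pseudo = np.zeros_like(xs)
      for i in range(xs.shape[0]):
          for j in range(xs.shape[1]):
              steering = green(array_pts, [xs[i, j], ys[i, j]])
              steering = steering / np.linalg.norm(steering)
              pseudo[i, j] = 1.0 / np.linalg.norm(noise_subspace.conj().T @ steering)

      peak = np.unravel_index(np.argmax(pseudo), pseudo.shape)
      print(f"Strongest MUSIC peak at x = {xs[peak]:.1f}, y = {ys[peak]:.1f} (true: -1.2, 3.0 and 1.5, 4.0)")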

  13. Bioprocessing of wheat bran improves in vitro bioaccessibility and colonic metabolism of phenolic compounds

    NARCIS (Netherlands)

    Mateo Anson, N.; Selinheimo, E.; Havenaar, R.; Aura, A.-M.; Mattila, I.; Lehtinen, P.; Bast, A.; Poutanen, K.; Haenen, G.R.M.M.

    2009-01-01

    Ferulic acid (FA) is the most abundant phenolic compound in wheat grain, mainly located in the bran. However, its bioaccessibility from the bran matrix is extremely low. Different bioprocessing techniques involving fermentation or enzymatic and fermentation treatments of wheat bran were developed

  14. A Monte Carlo evaluation of analytical multiple scattering corrections for unpolarised neutron scattering and polarisation analysis data

    International Nuclear Information System (INIS)

    Mayers, J.; Cywinski, R.

    1985-03-01

    Some of the approximations commonly used for the analytical estimation of multiple scattering corrections to thermal neutron elastic scattering data from cylindrical and plane slab samples have been tested using a Monte Carlo program. It is shown that the approximations are accurate for a wide range of sample geometries and scattering cross-sections. Neutron polarisation analysis provides the most stringent test of multiple scattering calculations as multiply scattered neutrons may be redistributed not only geometrically but also between the spin flip and non spin flip scattering channels. A very simple analytical technique for correcting for multiple scattering in neutron polarisation analysis has been tested using the Monte Carlo program and has been shown to work remarkably well in most circumstances. (author)

  15. On the multiple zeros of a real analytic function with applications to the averaging theory of differential equations

    Science.gov (United States)

    García, Isaac A.; Llibre, Jaume; Maza, Susanna

    2018-06-01

    In this work we consider real analytic functions , where , Ω is a bounded open subset of , is an interval containing the origin, are parameters, and ε is a small parameter. We study the branching of the zero-set of at multiple points when the parameter ε varies. We apply the obtained results to improve the classical averaging theory for computing T-periodic solutions of λ-families of analytic T-periodic ordinary differential equations defined on , using the displacement functions defined by these equations. We call the coefficients in the Taylor expansion of in powers of ε the averaged functions. The main contribution consists in analyzing the role that have the multiple zeros of the first non-zero averaged function. The outcome is that these multiple zeros can be of two different classes depending on whether the zeros belong or not to the analytic set defined by the real variety associated to the ideal generated by the averaged functions in the Noetheriang ring of all the real analytic functions at . We bound the maximum number of branches of isolated zeros that can bifurcate from each multiple zero z 0. Sometimes these bounds depend on the cardinalities of minimal bases of the former ideal. Several examples illustrate our results and they are compared with the classical theory, branching theory and also under the light of singularity theory of smooth maps. The examples range from polynomial vector fields to Abel differential equations and perturbed linear centers.

  16. Aptamer- and nucleic acid enzyme-based systems for simultaneous detection of multiple analytes

    Science.gov (United States)

    Lu, Yi [Champaign, IL]; Liu, Juewen [Albuquerque, NM]

    2011-11-15

    The present invention provides aptamer- and nucleic acid enzyme-based systems for simultaneously determining the presence and optionally the concentration of multiple analytes in a sample. Methods of utilizing the system and kits that include the sensor components are also provided. The system includes a first reactive polynucleotide that reacts to a first analyte; a second reactive polynucleotide that reacts to a second analyte; a third polynucleotide; a fourth polynucleotide; a first particle, coupled to the third polynucleotide; a second particle, coupled to the fourth polynucleotide; and at least one quencher, for quenching emissions of the first and second quantum dots, coupled to the first and second reactive polynucleotides. The first particle includes a quantum dot having a first emission wavelength. The second particle includes a second quantum dot having a second emission wavelength different from the first emission wavelength. The third polynucleotide and the fourth polynucleotide are different.

  17. Bioprocessing of concentrated mixed hazardous industrial waste

    International Nuclear Information System (INIS)

    Wolfram, J.H.; Rogers, R.D.; Silver, G.; Attalla, A.; Prisc, M.

    1994-01-01

    The use of selected microorganisms for the degradation and/or the detoxification of hazardous organic compounds is gaining wide acceptance as an alternative waste treatment technology. This work describes the unique capabilities of an isolated strain of Pseudomonas for metabolizing methylated aromatic compounds. This strain of Pseudomonas putida Idaho is unique in that it can tolerate and grow under a layer of neat p-xylene. A bioprocess has been developed to degrade LLW and mixed wastes containing methylated aromatic compounds, i.e., pseudocumene, toluene and p-xylene. The process is now in the demonstration phase at a DOE facility and has been running for one year. Feed concentrations of 21200 ppm of the toxic organic substrate have been fed to the bioreactor. This report describes the results obtained thus far

  18. XCluSim: a visual analytics tool for interactively comparing multiple clustering results of bioinformatics data

    Science.gov (United States)

    2015-01-01

    Background Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. Results In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Conclusions Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results. PMID:26328893

  19. Cleaner bioprocesses for promoting zero-emission biofuels production in Vojvodina

    Energy Technology Data Exchange (ETDEWEB)

    Dodic, Sinisa N.; Vucurovic, Damjan G.; Popov, Stevan D.; Dodic, Jelena M.; Rankovic, Jovana A. [Department of Biotechnology and Pharmaceutical Engineering, Faculty of Technology, University of Novi Sad, Bul. cara Lazara 1, Novi Sad 21000, Vojvodina (RS)

    2010-12-15

    In this study, the policy, market conditions and food security of biomass energy sources are assessed for supplying the future needs of Vojvodina. The Autonomous Province of Vojvodina is an autonomous province in Serbia, containing about 27% of its total population according to the 2002 Census. It is located in the northern part of the country, in the Pannonian Plain, in southeastern Europe. Vojvodina is an energy-deficient province. The incentives to invest human and financial resources in the research and development of cleaner bioprocesses are high, considering the benefits which might be achieved in terms of environmental protection and manufacturing costs. In the near and medium term, the development of bioprocesses for waste recycling and resource recovery might be one of the most viable options, considering that much research work has already been done. In Vojvodina, there are technological solutions in which biofuels are produced in a closed cycle, so that the quantity of waste is reduced to a minimum. These solutions include using the stillage (the remainder after distillation) for fattening cattle, and cattle excrement to produce biogas and manure as fertilizer. The energy required for the production of bioethanol is obtained by combustion of the lignocellulosic residual waste from the production of the basic raw material, starch, or from biogas. Ash from the burned biomass is returned to the soil as a source of minerals for plants and a replacement for mineral fertilizer. Such a closed cycle is economical for small farms in Vojvodina. (author)

  20. Cleaner bioprocesses for promoting zero-emission biofuels production in Vojvodina

    International Nuclear Information System (INIS)

    Dodic, Sinisa N.; Vucurovic, Damjan G.; Popov, Stevan D.; Dodic, Jelena M.; Rankovic, Jovana A.

    2010-01-01

    In this study, the policy, market conditions and food security of biomass energy sources are assessed for supplying the future needs of Vojvodina. The Autonomous Province of Vojvodina is an autonomous province in Serbia, containing about 27% of its total population according to the 2002 Census. It is located in the northern part of the country, in the Pannonian Plain, in southeastern Europe. Vojvodina is an energy-deficient province. The incentives to invest human and financial resources in the research and development of cleaner bioprocesses are high, considering the benefits which might be achieved in terms of environmental protection and manufacturing costs. In the near and medium term, the development of bioprocesses for waste recycling and resource recovery might be one of the most viable options, considering that much research work has already been done. In Vojvodina, there are technological solutions in which biofuels are produced in a closed cycle, so that the quantity of waste is reduced to a minimum. These solutions include using the stillage (the remainder after distillation) for fattening cattle, and cattle excrement to produce biogas and manure as fertilizer. The energy required for the production of bioethanol is obtained by combustion of the lignocellulosic residual waste from the production of the basic raw material, starch, or from biogas. Ash from the burned biomass is returned to the soil as a source of minerals for plants and a replacement for mineral fertilizer. Such a closed cycle is economical for small farms in Vojvodina. (author)

  1. Developing semi-analytical solution for multiple-zone transient storage model with spatially non-uniform storage

    Science.gov (United States)

    Deng, Baoqing; Si, Yinbing; Wang, Jia

    2017-12-01

    Transient storage may vary along a stream due to stream hydraulic conditions and the characteristics of the storage zones. Analytical solutions of transient storage models in the literature do not cover spatially non-uniform storage. A novel integral transform strategy is presented that simultaneously performs integral transforms on the concentrations in the stream and in the storage zones by using a single set of eigenfunctions derived from the advection-diffusion equation of the stream. The semi-analytical solution of the multiple-zone transient storage model with spatially non-uniform storage is obtained by applying the generalized integral transform technique to all partial differential equations in the multiple-zone transient storage model. The derived semi-analytical solution is validated against field data from the literature. Good agreement between the computed data and the field data is obtained. Some illustrative examples are formulated to demonstrate the applications of the present solution. It is shown that solute transport can be greatly affected by the variation of the mass exchange coefficient and the ratio of cross-sectional areas. When the ratio of cross-sectional areas is large or the mass exchange coefficient is small, more reaches are recommended to calibrate the parameters.
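
    For readers unfamiliar with the underlying model, the following is a minimal Python sketch of the classical one-zone transient storage equations solved with a crude explicit finite-difference scheme. It is not the record's multiple-zone semi-analytical (integral transform) solution, and every parameter value is an illustrative assumption.

        import numpy as np

        # One-zone transient storage model: advection-dispersion in the stream plus
        # first-order exchange with a single storage zone (illustrative parameters).
        L, nx = 1000.0, 200                  # reach length (m), number of grid cells
        dx = L / nx
        u, D = 0.3, 1.0                      # velocity (m/s), dispersion (m^2/s)
        alpha = 1.0e-4                       # stream-storage exchange coefficient (1/s)
        area_ratio = 0.2                     # storage cross-section / stream cross-section
        dt = 0.4 * min(dx / u, dx**2 / (2 * D))   # conservative explicit time step

        c = np.zeros(nx)                     # stream concentration
        cs = np.zeros(nx)                    # storage-zone concentration
        c[:5] = 1.0                          # initial solute slug near the upstream end

        for _ in range(int(2000 / dt)):      # simulate roughly 2000 s
            adv = -u * (c - np.roll(c, 1)) / dx                     # upwind advection
            disp = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
            exch = alpha * (cs - c)                                  # exchange with storage
            c_new = c + dt * (adv + disp + exch)
            cs = cs + dt * alpha / area_ratio * (c - cs)             # storage-zone balance
            c_new[0], c_new[-1] = 0.0, c_new[-2]                     # simple boundaries
            c = c_new

        print("peak stream concentration:", round(c.max(), 4),
              "at x =", round(c.argmax() * dx, 1), "m")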

  2. Analytical multiple scattering correction to the Mie theory: Application to the analysis of the lidar signal

    Science.gov (United States)

    Flesia, C.; Schwendimann, P.

    1992-01-01

    The contribution of multiple scattering to the lidar signal depends on the optical depth tau. Therefore, the lidar analysis based on the assumption that multiple scattering can be neglected is limited to cases characterized by low values of the optical depth (tau less than or equal to 0.1), and hence it excludes scattering from most clouds. Moreover, all inversion methods relating the lidar signal to number densities and particle sizes must be modified, since multiple scattering affects the direct analysis. The essential requirements of a realistic model for lidar measurements which includes multiple scattering and which can be applied to practical situations follow. (1) Required is not only a correction term or a rough approximation describing the results of a certain experiment, but a general theory of multiple scattering tying together the relevant physical parameters we seek to measure. (2) An analytical generalization of the lidar equation which can be applied in the case of a realistic aerosol is required. A purely analytical formulation is important in order to avoid the convergence and stability problems which, in the case of a numerical approach, are due to the large number of events that have to be taken into account in the presence of large optical depth and/or strong experimental noise.

  3. Analytical properties of multiple production amplitudes

    Energy Technology Data Exchange (ETDEWEB)

    Medvedev, B V; Pavlov, V P; Polivanov, M K; Sukhanov, A D [Gosudarstvennyj Komitet po Ispol'zovaniyu Atomnoj Ehnergii SSSR, Moscow. Inst. Teoreticheskoj i Ehksperimental'noj Fiziki; AN SSSR, Moscow. Matematicheskij Inst.]

    1984-05-01

    Local analytical properties of the 2→3 and 2→4 amplitudes are studied. The amplitudes are shown to be analytical functions of the total and partial energies at fixed momentum transfers in the neighbourhood of any physical point on the energy shell. 14 (for the 2→3 case) and 242 (for the 2→4 case) boundary values are expressed through the amplitudes of real processes.

  4. Origins of Cell-to-Cell Bioprocessing Diversity and Implications of the Extracellular Environment Revealed at the Single-Cell Level.

    Science.gov (United States)

    Vasdekis, A E; Silverman, A M; Stephanopoulos, G

    2015-12-14

    Bioprocess limitations imposed by microbial cell-to-cell phenotypic diversity remain poorly understood. To address this, we investigated the origins of such culture diversity during lipid production and assessed the impact of the fermentation microenvironment. We measured the single-cell lipid production dynamics in a time-invariant microfluidic environment and discovered that production is not monotonic, but rather sporadic with time. To characterize this, we introduce bioprocessing noise and identify its epigenetic origins. We linked such intracellular production fluctuations with cell-to-cell productivity diversity in culture. This unmasked the phenotypic diversity amplification by the culture microenvironment, a critical parameter in strain engineering as well as metabolic disease treatment.

  5. Turmeric Bioprocessed with Mycelia from the Shiitake Culinary-Medicinal Mushroom Lentinus edodes (Agaricomycetes) Protects Mice Against Salmonellosis.

    Science.gov (United States)

    Kim, Sung Phil; Lee, Sang Jong; Nam, Seok Hyun; Friedman, Mendel

    2017-01-01

    This study investigated the suppressive mechanisms of an extract from bioprocessed Lentinus edodes mycelial liquid culture supplemented with turmeric (bioprocessed Curcuma longa extract [BPCLE]) against murine salmonellosis. The BPCLE extract promoted the uptake of Salmonella Typhimurium into murine RAW 264.7 macrophage cells, the elimination of intracellular bacteria, and the elevation of inducible nitric oxide synthase expression. Dietary administration of BPCLE activated leukocytes from the mice infected with Salmonella through the intraperitoneal route. The enzyme-linked immunosorbent assay of the cytokines produced by splenocytes from infected mice showed significant increases in the levels of Th1 cytokines, including interleukin (IL)-1β, IL-2, IL-6, and IL-12. Histology showed that dietary administration of BPCLE protected against necrosis of the liver resulting from a sublethal dose of Salmonella. In addition, the treatment (1) extended the lifespan of lethally infected mice, (2) suppressed the invasion of Salmonella into human Caco-2 colorectal adenocarcinoma cells, (3) increased excretion of the bacterium in the feces, (4) suppressed the translocation of Salmonella to internal organs, and (5) increased total immunoglobulin A in both serum and intestinal fluids. BPCLE protected the mice against salmonellosis via cooperative effects that include the upregulation of the Th1 immune reaction, prevention of translocation of bacteria across the intestinal epithelial cells, and increased immunoglobulin A production in serum and intestinal fluids.

  6. A framework for model-based optimization of bioprocesses under uncertainty: Identifying critical parameters and operating variables

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Meyer, Anne S.; Gernaey, Krist

    2011-01-01

    This study presents the development and application of a systematic model-based framework for bioprocess optimization, evaluated on a cellulosic ethanol production case study. The implementation of the framework involves the use of dynamic simulations, sophisticated uncertainty analysis (Monte...

  7. Bioprocessing strategies for the large-scale production of human mesenchymal stem cells: a review.

    Science.gov (United States)

    Panchalingam, Krishna M; Jung, Sunghoon; Rosenberg, Lawrence; Behie, Leo A

    2015-11-23

    Human mesenchymal stem cells (hMSCs), also called mesenchymal stromal cells, have been of great interest in regenerative medicine applications because of not only their differentiation potential but also their ability to secrete bioactive factors that can modulate the immune system and promote tissue repair. This potential has initiated many early-phase clinical studies for the treatment of various diseases, disorders, and injuries by using either hMSCs themselves or their secreted products. Currently, hMSCs for clinical use are generated through conventional static adherent cultures in the presence of fetal bovine serum or human-sourced supplements. However, these methods suffer from variable culture conditions (i.e., ill-defined medium components and heterogeneous culture environment) and thus are not ideal procedures to meet the expected future demand of quality-assured hMSCs for human therapeutic use. Optimizing a bioprocess to generate hMSCs or their secreted products (or both) promises to improve the efficacy as well as safety of this stem cell therapy. In this review, current media and methods for hMSC culture are outlined and bioprocess development strategies discussed.

  8. Consolidated bioprocessing for butyric acid production from rice straw with undefined mixed culture

    Directory of Open Access Journals (Sweden)

    Binling Ai

    2016-10-01

    Full Text Available Lignocellulosic biomass is a renewable source with great potential for biofuels and bioproducts. However, the cost of cellulolytic enzymes limits the utilization of this low-cost bioresource. This study aimed to develop a consolidated bioprocess, without the need for supplementary cellulase, for butyric acid production from lignocellulosic biomass. A stirred-tank reactor with a working volume of 21 L was constructed and operated in batch and semi-continuous fermentation modes with a cellulolytic butyrate-producing microbial community. The semi-continuous fermentation with intermittent discharging of the culture broth and replenishment with fresh medium achieved the highest butyric acid productivity of 2.69 g/(L·d). In semi-continuous operation mode, butyric acid and total carboxylic acid concentrations of 16.2 and 28.9 g/L, respectively, were achieved. Over the 21-day fermentation period, their cumulative yields reached 1189 and 2048 g, respectively, corresponding to 41% and 74% of the maximum theoretical yields based on the amount of NaOH-pretreated rice straw fed in. This study demonstrated that an undefined mixed culture-based consolidated bioprocess for butyric acid production can completely eliminate the cost of supplementary cellulolytic enzymes.

  9. Consolidated Bioprocessing for Butyric Acid Production from Rice Straw with Undefined Mixed Culture.

    Science.gov (United States)

    Ai, Binling; Chi, Xue; Meng, Jia; Sheng, Zhanwu; Zheng, Lili; Zheng, Xiaoyan; Li, Jianzheng

    2016-01-01

    Lignocellulosic biomass is a renewable source with great potential for biofuels and bioproducts. However, the cost of cellulolytic enzymes limits the utilization of this low-cost bioresource. This study aimed to develop a consolidated bioprocess, without the need for supplementary cellulase, for butyric acid production from lignocellulosic biomass. A stirred-tank reactor with a working volume of 21 L was constructed and operated in batch and semi-continuous fermentation modes with a cellulolytic butyrate-producing microbial community. The semi-continuous fermentation with intermittent discharging of the culture broth and replenishment with fresh medium achieved the highest butyric acid productivity of 2.69 g/(L·d). In semi-continuous operation mode, butyric acid and total carboxylic acid concentrations of 16.2 and 28.9 g/L, respectively, were achieved. Over the 21-day fermentation period, their cumulative yields reached 1189 and 2048 g, respectively, corresponding to 41% and 74% of the maximum theoretical yields based on the amount of NaOH-pretreated rice straw fed in. This study demonstrated that an undefined mixed culture-based consolidated bioprocess for butyric acid production can completely eliminate the cost of supplementary cellulolytic enzymes.

  10. MULTIPLE CRITERIA METHODS WITH FOCUS ON ANALYTIC HIERARCHY PROCESS AND GROUP DECISION MAKING

    Directory of Open Access Journals (Sweden)

    Lidija Zadnik-Stirn

    2010-12-01

    Full Text Available Managing natural resources is a group multiple criteria decision making problem. In this paper the analytic hierarchy process is the chosen method for handling natural resource problems. The single decision maker problem is discussed, and three methods for deriving the priority vector are presented: the eigenvector method, the data envelopment analysis method, and the logarithmic least squares method. Further, the group analytic hierarchy process is discussed, and six methods for the aggregation of individual judgments or priorities are compared: the weighted arithmetic mean method, the weighted geometric mean method, and four methods based on data envelopment analysis. A case study on land use in Slovenia is presented. The conclusions review consistency, sensitivity analyses, and some future directions of research.
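
    As an illustration of the eigenvector method and the weighted geometric mean aggregation mentioned in the abstract, the following is a minimal Python sketch; the pairwise comparison matrix and decision-maker weights are invented for demonstration and are not taken from the Slovenian case study.

        import numpy as np

        def ahp_priorities(pairwise):
            """Principal right eigenvector of a pairwise comparison matrix, normalized to sum to 1."""
            values, vectors = np.linalg.eig(pairwise)
            k = np.argmax(values.real)                  # index of the principal eigenvalue
            w = np.abs(vectors[:, k].real)
            return w / w.sum(), values[k].real

        def consistency_ratio(lambda_max, n):
            """Saaty consistency ratio from tabulated random indices (n up to 7 here)."""
            ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}
            return ((lambda_max - n) / (n - 1)) / ri[n]

        # Illustrative reciprocal comparison matrix for three criteria (1-9 scale).
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])
        w, lam = ahp_priorities(A)
        print("priorities:", w.round(3), "CR:", round(consistency_ratio(lam, 3), 3))

        # Group aggregation: weighted geometric mean of two hypothetical decision makers' priorities.
        group = np.array([w, [0.5, 0.3, 0.2]])
        dm_weights = np.array([0.6, 0.4])
        agg = np.prod(group ** dm_weights[:, None], axis=0)
        print("group priorities:", (agg / agg.sum()).round(3))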

  11. Multiple Solutions of Nonlinear Boundary Value Problems of Fractional Order: A New Analytic Iterative Technique

    Directory of Open Access Journals (Sweden)

    Omar Abu Arqub

    2014-01-01

    Full Text Available The purpose of this paper is to present a new kind of analytical method, the so-called residual power series method, to predict and represent the multiplicity of solutions to nonlinear boundary value problems of fractional order. The present method is capable of calculating all branches of solutions simultaneously, even if these multiple solutions are very close and thus rather difficult to distinguish even by numerical techniques. To verify the computational efficiency of the proposed technique, two nonlinear models are solved, one of which arises in mixed convection flows and the other in heat transfer, both of which admit multiple solutions. The results reveal that the method is very effective, straightforward, and powerful for formulating these multiple solutions.

  12. Production of polyol oils from soybean oil by bioprocess and Philippines edible medicinal wild mushrooms

    Science.gov (United States)

    We have been trying to develop a bioprocess for the production of polyol oils directly from soybean oil. We reported earlier the polyol products produced from soybean oil by Acinetobacter haemolyticus A01-35 (NRRL B-59985) (Hou and Lin, 2013). The objective of this study is to identify the chemical ...

  13. The Solubility of Cr-Organic Produced by Hydrolysis, Bioprocess and Bioremediation and its Effect on Fermented Rate, Digestibility and Rumen Microbe Population (in vitro)

    Directory of Open Access Journals (Sweden)

    UH Tanuwiria

    2010-09-01

    Full Text Available The research was conducted to study the production of organic chromium from leather tanning waste and its effect on in vitro rumen fermentation activities. The research was divided into two phases. The first phase was the production of organic chromium by alkali hydrolysis, S. cerevisiae bioprocessing, and duckweed bioremediation, with the solubility of the products assessed in neutral and acid solutions. The second phase was the supplementation of organic Cr in the ration, evaluated in terms of in vitro fermentation rate, digestibility and rumen microbe population. The research was conducted experimentally using a 4x4 factorial pattern on the basis of a Completely Randomized Design (CRD) with three replications in each experimental unit. The first factor was the type of organic Cr and the second factor was the supplementation level in the ration, at 1, 2, 3 and 4 ppm. The results of this research indicated that organic chromium can be synthesized by alkali hydrolysis, S. cerevisiae bioprocessing and the activity of duckweed bioremediation. Among the three processes, the highest level of Cr was obtained from the S. cerevisiae bioprocess based on leather-tanning waste. The levels of organic Cr resulting from alkali hydrolysis, bioprocessing of CrCl3.6H2O, bioprocessing of Cr leather-tanning waste, and duckweed bioremediation were 354, 1011, 3833 and 310 mg/kg, respectively. The organic Cr products were relatively similar in fermentability, dry matter and organic matter digestibility, and rumen ecosystem. There is an indication that dry matter and organic matter digestibility and the rumen microbe population with the ration supplemented with organic Cr from alkali hydrolysis were higher than with the other supplements. (Animal Production 12(3): 175-183 (2010)) Key Words: organic-Cr, rumen fermentation activities, rumen microbe population

  14. Upgrading protein products using bioprocessing on agricultural crops

    DEFF Research Database (Denmark)

    Sulewska, Anna Maria; Sørensen, Jens Christian; Markedal, Keld Ejdrup

    Due to increasing world population, higher average income, and changes in food preferences, there is a growing demand for proteins, especially novel plant-based protein sources, that can substitute animal proteins and supplement currently used soya proteins. Increased customer awareness with regard to sustainability leads to a demand for plant protein products made from locally grown crops. Novel bioprocessing methods have been developed to generate protein products which are nutritious, readily available and do not generate hazardous waste. The processing focus has therefore been on developing protein-enriched products with minimized content of antinutritional compounds. For every crop it is a challenge to obtain protein fractions with sufficient added value to make processing economically feasible. In this work we present the characterization of protein products developed in pilot scale using the novel

  15. Turmeric bioprocessed with mycelia from the shiitake culinary-medicinal mushroom lentinus edodes (agaricomycetes) protects mice against salmonellosis

    Science.gov (United States)

    Extracts of the shiitake mushroom Lentinus edodes and the spice turmeric (Curcuma longa) have both been reported to have health-promoting properties. The present study investigated the suppressive mechanisms of a bioprocessed Lentinus edodes liquid mushroom mycelia culture supplemented with turmeric ...

  16. A method for determining the analytical form of a radionuclide depth distribution using multiple gamma spectrometry measurements

    Energy Technology Data Exchange (ETDEWEB)

    Dewey, Steven Clifford, E-mail: sdewey001@gmail.com [United States Air Force School of Aerospace Medicine, Occupational Environmental Health Division, Health Physics Branch, Radiation Analysis Laboratories, 2350 Gillingham Drive, Brooks City-Base, TX 78235 (United States); Whetstone, Zachary David, E-mail: zacwhets@umich.edu [Radiological Health Engineering Laboratory, Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2355 Bonisteel Boulevard, 1906 Cooley Building, Ann Arbor, MI 48109-2104 (United States); Kearfott, Kimberlee Jane, E-mail: kearfott@umich.edu [Radiological Health Engineering Laboratory, Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2355 Bonisteel Boulevard, 1906 Cooley Building, Ann Arbor, MI 48109-2104 (United States)

    2011-06-15

    When characterizing environmental radioactivity, whether in the soil or within concrete building structures undergoing remediation or decommissioning, it is highly desirable to know the radionuclide depth distribution. This is typically modeled using continuous analytical expressions, whose forms are believed to best represent the true source distributions. In situ gamma ray spectroscopic measurements are combined with these models to fully describe the source. Currently, the choice of analytical expressions is based upon prior experimental core sampling results at similar locations, any known site history, or radionuclide transport models. This paper presents a method, employing multiple in situ measurements at a single site, for determining the analytical form that best represents the true depth distribution present. The measurements can be made using a variety of geometries, each of which has a different sensitivity variation with source spatial distribution. Using non-linear least squares numerical optimization methods, the results can be fit to a collection of analytical models and the parameters of each model determined. The analytical expression that results in the fit with the lowest residual is selected as the most accurate representation. A cursory examination is made of the effects of measurement errors on the method. - Highlights: > A new method for determining radionuclide distribution as a function of depth is presented. > Multiple measurements are used, with enough measurements to determine the unknowns in analytical functions that might describe the distribution. > The measurements must be as independent as possible, which is achieved through special collimation of the detector. > Although the effects of measurement errors on the results may be significant, an improvement over other methods is anticipated.
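
    The core idea, fitting several candidate analytical depth-distribution forms and keeping the one with the smallest residual, can be sketched in Python as follows; the candidate forms and the synthetic "measurements" below are assumptions for illustration only, not the authors' data or detector response model.

        import numpy as np
        from scipy.optimize import curve_fit

        depths = np.array([0.0, 2.0, 5.0, 10.0, 20.0])        # cm, depths probed by the geometries
        measured = np.array([1.00, 0.72, 0.48, 0.22, 0.05])   # synthetic relative responses

        def exponential(z, a, alpha):
            """Activity decreasing exponentially with depth."""
            return a * np.exp(-alpha * z)

        def buried_gaussian(z, a, z0, sigma):
            """Activity concentrated around a buried depth z0."""
            return a * np.exp(-((z - z0) ** 2) / (2.0 * sigma ** 2))

        candidates = [("exponential", exponential, (1.0, 0.1)),
                      ("buried Gaussian", buried_gaussian, (1.0, 0.0, 10.0))]

        best = None
        for name, model, p0 in candidates:
            try:
                popt, _ = curve_fit(model, depths, measured, p0=p0, maxfev=10000)
            except RuntimeError:
                continue                                       # this form failed to converge
            residual = float(np.sum((measured - model(depths, *popt)) ** 2))
            if best is None or residual < best[2]:
                best = (name, popt, residual)

        print("best-fitting form:", best[0])
        print("parameters:", np.round(best[1], 4), "sum of squared residuals:", round(best[2], 6))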

  17. Studies on generalized kinetic model and Pareto optimization of a product-driven self-cycling bioprocess.

    Science.gov (United States)

    Sun, Kaibiao; Kasperski, Andrzej; Tian, Yuan

    2014-10-01

    The aim of this study is the optimization of a product-driven self-cycling bioprocess and the presentation of a way to determine the best possible decision variables out of a set of alternatives based on the designed model. Initially, a product-driven generalized kinetic model, which allows a flexible choice of the most appropriate kinetics, is designed and analysed. The optimization problem is posed as a bi-objective one, where maximization of biomass productivity and minimization of unproductive loss of substrate are the objective functions. Then, the Pareto fronts are calculated for exemplary kinetics. It is found that in the designed bioprocess, a decrease of the emptying/refilling fraction and an increase of the substrate feeding concentration cause an increase of the biomass productivity. An increase of the emptying/refilling fraction and a decrease of the substrate feeding concentration cause a decrease of the unproductive loss of substrate. The preferred solutions are calculated using the minimum distance from an ideal solution method, while giving proposals of their modifications derived from a decision maker's reactions to the generated solutions.
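
    The minimum-distance-from-an-ideal-solution rule used to pick a preferred point on a Pareto front can be illustrated with a short Python sketch; the front coordinates below are placeholders, not results of the kinetic model described in the record.

        import numpy as np

        # Columns: biomass productivity (to maximize), unproductive substrate loss (to minimize).
        front = np.array([[0.9, 0.40],
                          [1.2, 0.55],
                          [1.5, 0.80],
                          [1.8, 1.20]])

        # Normalize each objective to [0, 1] so distances are comparable across units.
        lo, hi = front.min(axis=0), front.max(axis=0)
        norm = (front - lo) / (hi - lo)

        # Ideal point after normalization: best value of each objective.
        ideal = np.array([1.0, 0.0])
        dist = np.linalg.norm(norm - ideal, axis=1)
        best = int(np.argmin(dist))
        print("preferred compromise:", front[best], "distance to ideal:", round(dist[best], 3))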

  18. Technical Aspects of Use of Ultrasound for Intensification of Enzymatic Bio-Processing: New Path to "Green Chemistry"

    Science.gov (United States)

    Use of enzymatic processing in the food, textile, and bio-fuel applications is becoming increasingly popular, primarily because of rapid introduction of a new variety of highly efficient enzymes. In general, an enzymatic bio-processing generates less toxic and readily biodegradable wastewater efflue...

  19. Nano-tubular cellulose for bioprocess technology development.

    Science.gov (United States)

    Koutinas, Athanasios A; Sypsas, Vasilios; Kandylis, Panagiotis; Michelis, Andreas; Bekatorou, Argyro; Kourkoutas, Yiannis; Kordulis, Christos; Lycourghiotis, Alexis; Banat, Ibrahim M; Nigam, Poonam; Marchant, Roger; Giannouli, Myrsini; Yianoulis, Panagiotis

    2012-01-01

    Delignified cellulosic material has shown a significant promotional effect on the alcoholic fermentation as yeast immobilization support. However, its potential for further biotechnological development is unexploited. This study reports the characterization of this tubular/porous cellulosic material, which was done by SEM, porosimetry and X-ray powder diffractometry. The results showed that the structure of nano-tubular cellulose (NC) justifies its suitability for use in "cold pasteurization" processes and its promoting activity in bioprocessing (fermentation). The latter was explained by a glucose pump theory. Also, it was demonstrated that crystallization of viscous invert sugar solutions during freeze drying could not be otherwise achieved unless NC was present. This effect as well as the feasibility of extremely low temperature fermentation are due to reduction of the activation energy, and have facilitated the development of technologies such as wine fermentations at home scale (in a domestic refrigerator). Moreover, NC may lead to new perspectives in research such as the development of new composites, templates for cylindrical nano-particles, etc.

  20. Comparative analysis of solid-state bioprocessing and enzymatic treatment of finger millet for mobilization of bound phenolics.

    Science.gov (United States)

    Yadav, Geetanjali; Singh, Anshu; Bhattacharya, Patrali; Yuvraj, Jude; Banerjee, Rintu

    2013-11-01

    The present work investigates suitable bioprocessing techniques to mobilize the bound phenolics naturally found in the finger millet cell wall, in order to enrich the grain with dietary antioxidants. A comparative study was performed between exogenous enzymatic treatment and solid-state fermentation (SSF) of the grain with the food-grade organism Rhizopus oryzae. SSF results indicated that by the 6th day of incubation, the total phenolic content (18.64 mg gallic acid equivalent/gds) and antioxidant properties (DPPH radical scavenging activity of 39.03%, metal chelating ability of 54%, and better reducing power) of finger millet were drastically enhanced when fermented with the GRAS filamentous fungus. During enzymatic bioprocessing, most of the phenolics released during hydrolysis leached out into the liquid portion rather than being retained within the millet grain, resulting in an overall loss of dietary antioxidants. The present study establishes the most effective strategy to enrich finger millet with phenolic antioxidants.

  1. An industrial perspective on bioreactor scale-down: what we can learn from combined large-scale bioprocess and model fluid studies.

    Science.gov (United States)

    Noorman, Henk

    2011-08-01

    For industrial bioreactor design, operation, control and optimization, the scale-down approach is often advocated to efficiently generate data on a small scale, and effectively apply suggested improvements to the industrial scale. In all cases it is important to ensure that the scale-down conditions are representative of the real large-scale bioprocess. Progress is hampered by limited detailed and local information from large-scale bioprocesses. Complementary to real fermentation studies, physical aspects of model fluids such as air-water in large bioreactors provide useful information with limited effort and cost. Still, in industrial practice, investments of time, capital and resources often prohibit systematic work, although, in the end, savings obtained in this way are trivial compared to the expenses that result from real process disturbances, batch failures, and non-flyers with loss of business opportunity. Here we try to highlight what can be learned from real large-scale bioprocesses in combination with model fluid studies, and to provide suitable computation tools to overcome data restrictions. Focus is on a specific well-documented case for a 30 m3 bioreactor. Areas for further research from an industrial perspective are also indicated. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. On the nonlinear dynamics of trolling-mode AFM: Analytical solution using multiple time scales method

    Science.gov (United States)

    Sajjadi, Mohammadreza; Pishkenari, Hossein Nejat; Vossoughi, Gholamreza

    2018-06-01

    Trolling mode atomic force microscopy (TR-AFM) has resolved many imaging problems by a considerable reduction of the liquid-resonator interaction forces in liquid environments. The present study develops a nonlinear model of the meniscus force exerted on the nanoneedle of TR-AFM and presents an analytical solution to the distributed-parameter model of the TR-AFM resonator utilizing the multiple time scales (MTS) method. Based on the developed analytical solution, the frequency-response curves of the resonator operation in air and liquid (for different penetration lengths of the nanoneedle) are obtained. The closed-form analytical solution and the frequency-response curves are validated by comparison with both the finite element solution of the main partial differential equations and the experimental observations. The effect of the excitation angle of the resonator on horizontal oscillation of the probe tip and the effect of different parameters on the frequency response of the system are investigated.

  3. Improved ethanol production at high temperature by consolidated bioprocessing using Saccharomyces cerevisiae strain engineered with artificial zinc finger protein.

    Science.gov (United States)

    Khatun, M Mahfuza; Yu, Xinshui; Kondo, Akihiko; Bai, Fengwu; Zhao, Xinqing

    2017-12-01

    In this work, the consolidated bioprocessing (CBP) yeast Saccharomyces cerevisiae MNII/cocδBEC3 was transformed with an artificial zinc finger protein (AZFP) library to improve its thermal tolerance, and the strain MNII-AZFP with superior growth at 42°C was selected. Degradation of acid-swollen cellulose improved by 45.9%, leading to an increase in ethanol production compared to the control strain. Moreover, the fermentation of Jerusalem artichoke stalk (JAS) by MNII-AZFP was shortened by 12 h at 42°C with a concomitant improvement in ethanol production. Comparative transcriptomic analysis suggested that the AZFP in the mutant exerted a beneficial effect by modulating the expression of multiple functional genes. These results provide a feasible strategy for efficient ethanol production from JAS and other cellulosic biomass through CBP-based fermentation at elevated temperatures. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Intelligent control of mixed-culture bioprocesses

    International Nuclear Information System (INIS)

    Stoner, D.L.; Larsen, E.D.; Miller, K.S.

    1995-01-01

    A hierarchical control system is being developed and applied to a mixed culture bioprocess in a continuous stirred tank reactor. A bioreactor, with its inherent complexity and non-linear behavior, was an interesting yet difficult application for control theory. The bottom level of the hierarchy was implemented as a number of integrated set point controls and data acquisition modules. Within the second level was a diagnostic system that used expert knowledge to determine the operational status of the sensors, actuators, and control modules. A diagnostic program was successfully implemented for the detection of stirrer malfunctions, and to monitor liquid delivery rates and recalibrate the pumps when deviations from desired flow rates occurred. The highest control level was a supervisory shell that was developed using expert knowledge and the history of the reactor operation to determine the set points required to meet a set of production criteria. At this stage the supervisory shell analyzed the data to determine the state of the system. In future implementations, this shell will determine the set points required to optimize a cost function using expert knowledge and adaptive learning techniques

  5. Intelligent control of mixed-culture bioprocesses

    Energy Technology Data Exchange (ETDEWEB)

    Stoner, D.L.; Larsen, E.D.; Miller, K.S. [Idaho National Engineering Lab., Idaho Falls, ID (United States)] [and others

    1995-12-31

    A hierarchical control system is being developed and applied to a mixed culture bioprocess in a continuous stirred tank reactor. A bioreactor, with its inherent complexity and non-linear behavior, was an interesting yet difficult application for control theory. The bottom level of the hierarchy was implemented as a number of integrated set point controls and data acquisition modules. Within the second level was a diagnostic system that used expert knowledge to determine the operational status of the sensors, actuators, and control modules. A diagnostic program was successfully implemented for the detection of stirrer malfunctions, and to monitor liquid delivery rates and recalibrate the pumps when deviations from desired flow rates occurred. The highest control level was a supervisory shell that was developed using expert knowledge and the history of the reactor operation to determine the set points required to meet a set of production criteria. At this stage the supervisory shell analyzed the data to determine the state of the system. In future implementations, this shell will determine the set points required to optimize a cost function using expert knowledge and adaptive learning techniques.

  6. Process Analytical Technology for Advanced Process Control in Biologics Manufacturing with the Aid of Macroscopic Kinetic Modeling

    Directory of Open Access Journals (Sweden)

    Martin Kornecki

    2018-03-01

    Full Text Available Productivity improvements of mammalian cell culture in the production of recombinant proteins have been made by optimizing cell lines, media, and process operation. This led to enhanced titers and process robustness without increasing the cost of the upstream processing (USP); however, a downstream bottleneck remains. In terms of process control improvement, the process analytical technology (PAT) initiative, initiated by the American Food and Drug Administration (FDA), aims to measure, analyze, monitor, and ultimately control all important attributes of a bioprocess. In particular, spectroscopic methods such as Raman or near-infrared spectroscopy enable one to meet these analytical requirements, preferably in-situ. In combination with chemometric techniques like partial least squares (PLS) or principal component analysis (PCA), it is possible to generate soft sensors, which estimate process variables based on process and measurement models for the enhanced control of bioprocesses. Macroscopic kinetic models can be used to simulate cell metabolism. These models are able to enhance the process understanding by predicting the dynamics of cells during cultivation. In this article, in-situ turbidity (transmission, 880 nm) and ex-situ Raman spectroscopy (785 nm) measurements are combined with an offline macroscopic Monod kinetic model in order to predict substrate concentrations. Experimental data of Chinese hamster ovary cultivations in bioreactors show a sufficiently linear correlation (R2 ≥ 0.97) between turbidity and total cell concentration. PLS regression of Raman spectra generates a prediction model, which was validated via offline viable cell concentration measurement (RMSE ≤ 13.82, R2 ≥ 0.92). Based on these measurements, the macroscopic Monod model can be used to determine different process attributes, e.g., glucose concentration. In consequence, it is possible to approximately calculate (R2 ≥ 0.96) glucose concentration based on online cell

  7. Process Analytical Technology for Advanced Process Control in Biologics Manufacturing with the Aid of Macroscopic Kinetic Modeling.

    Science.gov (United States)

    Kornecki, Martin; Strube, Jochen

    2018-03-16

    Productivity improvements of mammalian cell culture in the production of recombinant proteins have been made by optimizing cell lines, media, and process operation. This led to enhanced titers and process robustness without increasing the cost of the upstream processing (USP); however, a downstream bottleneck remains. In terms of process control improvement, the process analytical technology (PAT) initiative, initiated by the American Food and Drug Administration (FDA), aims to measure, analyze, monitor, and ultimately control all important attributes of a bioprocess. Especially, spectroscopic methods such as Raman or near-infrared spectroscopy enable one to meet these analytical requirements, preferably in-situ. In combination with chemometric techniques like partial least square (PLS) or principal component analysis (PCA), it is possible to generate soft sensors, which estimate process variables based on process and measurement models for the enhanced control of bioprocesses. Macroscopic kinetic models can be used to simulate cell metabolism. These models are able to enhance the process understanding by predicting the dynamic of cells during cultivation. In this article, in-situ turbidity (transmission, 880 nm) and ex-situ Raman spectroscopy (785 nm) measurements are combined with an offline macroscopic Monod kinetic model in order to predict substrate concentrations. Experimental data of Chinese hamster ovary cultivations in bioreactors show a sufficiently linear correlation (R² ≥ 0.97) between turbidity and total cell concentration. PLS regression of Raman spectra generates a prediction model, which was validated via offline viable cell concentration measurement (RMSE ≤ 13.82, R² ≥ 0.92). Based on these measurements, the macroscopic Monod model can be used to determine different process attributes, e.g., glucose concentration. In consequence, it is possible to approximately calculate (R² ≥ 0.96) glucose concentration based on online cell
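
    A hedged sketch of the PLS soft-sensor step described in both of the preceding records follows; the spectra and reference concentrations are simulated stand-ins (not the authors' CHO culture or Raman data), and the model settings are assumptions chosen only for illustration.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import r2_score, mean_squared_error

        rng = np.random.default_rng(0)
        n_samples, n_wavenumbers = 60, 300
        concentration = rng.uniform(0.5, 6.0, n_samples)          # e.g. glucose, g/L (synthetic)

        # Simulated spectra: one concentration-dependent band plus random noise.
        axis = np.linspace(0.0, 1.0, n_wavenumbers)
        band = np.exp(-((axis - 0.4) ** 2) / 0.002)
        spectra = (concentration[:, None] * band
                   + 0.3 * rng.normal(size=(n_samples, n_wavenumbers)))

        X_train, X_test, y_train, y_test = train_test_split(
            spectra, concentration, test_size=0.3, random_state=0)

        # PLS regression as a soft sensor: spectra in, concentration estimate out.
        pls = PLSRegression(n_components=3)
        pls.fit(X_train, y_train)
        y_pred = pls.predict(X_test).ravel()
        print("R2:", round(r2_score(y_test, y_pred), 3),
              "RMSE:", round(mean_squared_error(y_test, y_pred) ** 0.5, 3))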

  8. Developing a mesophilic co-culture for direct conversion of cellulose to butanol in consolidated bioprocess.

    Science.gov (United States)

    Wang, Zhenyu; Cao, Guangli; Zheng, Ju; Fu, Defeng; Song, Jinzhu; Zhang, Junzheng; Zhao, Lei; Yang, Qian

    2015-01-01

    Consolidated bioprocessing (CBP) of butanol production from cellulosic biomass is a promising strategy for cost saving compared to other processes featuring dedicated cellulase production. CBP requires microbial strains capable of hydrolyzing biomass with enzymes produced on their own, at high rate and high conversion, while simultaneously producing a desired product at high yield. However, currently reported butanol-producing candidates are unable to utilize cellulose as a sole carbon and energy source. Consequently, developing a co-culture system using different microorganisms, by taking advantage of their specific metabolic capacities, to produce butanol directly from cellulose in a consolidated bioprocess is of great interest. This study was mainly undertaken to find organisms complementary to the butanol producer that allow simultaneous saccharification and fermentation of cellulose to butanol in co-culture under mesophilic conditions. Accordingly, a highly efficient and stable cellulose-degrading consortium, N3, was first developed by multiple subcultures. Subsequently, the functional microorganisms with 16S rRNA sequences identical to the denaturing gradient gel electrophoresis (DGGE) profile were isolated from consortium N3. The isolate Clostridium celerecrescens N3-2, which exhibited higher cellulose-degrading capability, was thus chosen as the partner strain for butanol production with Clostridium acetobutylicum ATCC824. Meanwhile, the established stable consortium N3 was also investigated for butanol production by co-culturing with C. acetobutylicum ATCC824. Butanol was produced from cellulose when C. acetobutylicum ATCC824 was co-cultured with either consortium N3 or C. celerecrescens N3-2. Co-culturing C. acetobutylicum ATCC824 with the stable consortium N3 resulted in a relatively higher butanol concentration, 3.73 g/L, and a higher production yield, 0.145 g/g of glucose equivalent. The newly isolated microbial consortium N3 and strain C. celerecrescens N3

  9. Multiplicity distributions of gluon and quark jets and tests of QCD analytic predictions

    CERN Document Server

    Ackerstaff, K; Allison, J; Altekamp, N; Anderson, K J; Anderson, S; Arcelli, S; Asai, S; Axen, D A; Azuelos, Georges; Ball, A H; Barberio, E; Barlow, R J; Bartoldus, R; Batley, J Richard; Baumann, S; Bechtluft, J; Beeston, C; Behnke, T; Bell, A N; Bell, K W; Bella, G; Bentvelsen, Stanislaus Cornelius Maria; Bethke, Siegfried; Biebel, O; Biguzzi, A; Bird, S D; Blobel, Volker; Bloodworth, Ian J; Bloomer, J E; Bobinski, M; Bock, P; Bonacorsi, D; Boutemeur, M; Bouwens, B T; Braibant, S; Brigliadori, L; Brown, R M; Burckhart, Helfried J; Burgard, C; Bürgin, R; Capiluppi, P; Carnegie, R K; Carter, A A; Carter, J R; Chang, C Y; Charlton, D G; Chrisman, D; Clarke, P E L; Cohen, I; Conboy, J E; Cooke, O C; Cuffiani, M; Dado, S; Dallapiccola, C; Dallavalle, G M; Davis, R; De Jong, S; del Pozo, L A; Desch, Klaus; Dienes, B; Dixit, M S; do Couto e Silva, E; Doucet, M; Duchovni, E; Duckeck, G; Duerdoth, I P; Eatough, D; Edwards, J E G; Estabrooks, P G; Evans, H G; Evans, M; Fabbri, Franco Luigi; Fanti, M; Faust, A A; Fiedler, F; Fierro, M; Fischer, H M; Fleck, I; Folman, R; Fong, D G; Foucher, M; Fürtjes, A; Futyan, D I; Gagnon, P; Gary, J W; Gascon, J; Gascon-Shotkin, S M; Geddes, N I; Geich-Gimbel, C; Geralis, T; Giacomelli, G; Giacomelli, P; Giacomelli, R; Gibson, V; Gibson, W R; Gingrich, D M; Glenzinski, D A; Goldberg, J; Goodrick, M J; Gorn, W; Grandi, C; Gross, E; Grunhaus, Jacob; Gruwé, M; Hajdu, C; Hanson, G G; Hansroul, M; Hapke, M; Hargrove, C K; Hart, P A; Hartmann, C; Hauschild, M; Hawkes, C M; Hawkings, R; Hemingway, Richard J; Herndon, M; Herten, G; Heuer, R D; Hildreth, M D; Hill, J C; Hillier, S J; Hobson, P R; Homer, R James; Honma, A K; Horváth, D; Hossain, K R; Howard, R; Hüntemeyer, P; Hutchcroft, D E; Igo-Kemenes, P; Imrie, D C; Ingram, M R; Ishii, K; Jawahery, A; Jeffreys, P W; Jeremie, H; Jimack, Martin Paul; Joly, A; Jones, C R; Jones, G; Jones, M; Jost, U; Jovanovic, P; Junk, T R; Karlen, D A; Kartvelishvili, V G; Kawagoe, K; Kawamoto, T; Kayal, P I; Keeler, Richard K; Kellogg, R G; Kennedy, B W; Kirk, J; Klier, A; Kluth, S; Kobayashi, T; Kobel, M; Koetke, D S; Kokott, T P; Kolrep, M; Komamiya, S; Kress, T; Krieger, P; Von Krogh, J; Kyberd, P; Lafferty, G D; Lahmann, R; Lai, W P; Lanske, D; Lauber, J; Lautenschlager, S R; Layter, J G; Lazic, D; Lee, A M; Lefebvre, E; Lellouch, Daniel; Letts, J; Levinson, L; Lloyd, S L; Loebinger, F K; Long, G D; Losty, Michael J; Ludwig, J; Macchiolo, A; MacPherson, A L; Mannelli, M; Marcellini, S; Markus, C; Martin, A J; Martin, J P; Martínez, G; Mashimo, T; Mättig, P; McDonald, W J; McKenna, J A; McKigney, E A; McMahon, T J; McPherson, R A; Meijers, F; Menke, S; Merritt, F S; Mes, H; Meyer, J; Michelini, Aldo; Mikenberg, G; Miller, D J; Mincer, A; Mir, R; Mohr, W; Montanari, A; Mori, T; Morii, M; Müller, U; Mihara, S; Nagai, K; Nakamura, I; Neal, H A; Nellen, B; Nisius, R; O'Neale, S W; Oakham, F G; Odorici, F; Ögren, H O; Oh, A; Oldershaw, N J; Oreglia, M J; Orito, S; Pálinkás, J; Pásztor, G; Pater, J R; Patrick, G N; Patt, J; Pearce, M J; Pérez-Ochoa, R; Petzold, S; Pfeifenschneider, P; Pilcher, J E; Pinfold, J L; Plane, D E; Poffenberger, P R; Poli, B; Posthaus, A; Rees, D L; Rigby, D; Robertson, S; Robins, S A; Rodning, N L; Roney, J M; Rooke, A M; Ros, E; Rossi, A M; Routenburg, P; Rozen, Y; Runge, K; Runólfsson, O; Ruppel, U; Rust, D R; Rylko, R; Sachs, K; Saeki, T; Sarkisyan-Grinbaum, E; Sbarra, C; Schaile, A D; Schaile, O; Scharf, F; Scharff-Hansen, P; Schenk, P; Schieck, J; Schleper, P; Schmitt, B; Schmitt, S; Schöning, A; 
Schröder, M; Schultz-Coulon, H C; Schumacher, M; Schwick, C; Scott, W G; Shears, T G; Shen, B C; Shepherd-Themistocleous, C H; Sherwood, P; Siroli, G P; Sittler, A; Skillman, A; Skuja, A; Smith, A M; Snow, G A; Sobie, Randall J; Söldner-Rembold, S; Springer, R W; Sproston, M; Stephens, K; Steuerer, J; Stockhausen, B; Stoll, K; Strom, D; Szymanski, P; Tafirout, R; Talbot, S D; Tanaka, S; Taras, P; Tarem, S; Teuscher, R; Thiergen, M; Thomson, M A; Von Törne, E; Towers, S; Trigger, I; Trócsányi, Z L; Tsur, E; Turcot, A S; Turner-Watson, M F; Utzat, P; Van Kooten, R; Verzocchi, M; Vikas, P; Vokurka, E H; Voss, H; Wäckerle, F; Wagner, A; Ward, C P; Ward, D R; Watkins, P M; Watson, A T; Watson, N K; Wells, P S; Wermes, N; White, J S; Wilkens, B; Wilson, G W; Wilson, J A; Wolf, G; Wyatt, T R; Yamashita, S; Yekutieli, G; Zacek, V; Zer-Zion, D

    1999-01-01

    Gluon jets are identified in e+e- hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The charged particle multiplicity distribution of the gluon jets is presented, and is analyzed for its mean, dispersion, skew, and kurtosis values, and for its factorial and cumulant moments. The results are compared to the analogous results found for a sample of light quark (uds) jets, also defined inclusively. We observe differences between the mean, skew and kurtosis values of gluon and quark jets, but not between their dispersions. The cumulant moment results are compared to the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observe...
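
    The moment analysis described above (mean, dispersion, skew, kurtosis, and normalized factorial and cumulant moments of a multiplicity distribution) can be sketched in Python as follows; the negative-binomial sample is an illustrative stand-in, not OPAL data.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        n = rng.negative_binomial(10, 0.35, size=100_000)     # stand-in multiplicity sample

        mean = n.mean()
        dispersion = n.std()                                   # D = sqrt(<n^2> - <n>^2)
        skew = stats.skew(n)
        kurt = stats.kurtosis(n)                               # excess kurtosis

        def factorial_moment(sample, q):
            """Normalized factorial moment F_q = <n(n-1)...(n-q+1)> / <n>^q."""
            prod = np.ones_like(sample, dtype=float)
            for k in range(q):
                prod *= (sample - k)
            return prod.mean() / sample.mean() ** q

        F2, F3, F4 = (factorial_moment(n, q) for q in (2, 3, 4))
        # Normalized cumulant moments from the factorial moments (standard relations for q <= 4).
        K2 = F2 - 1.0
        K3 = F3 - 3.0 * F2 + 2.0
        K4 = F4 - 4.0 * F3 - 3.0 * F2 ** 2 + 12.0 * F2 - 6.0

        print(f"mean={mean:.2f}  D={dispersion:.2f}  skew={skew:.3f}  kurtosis={kurt:.3f}")
        print("F2..F4:", round(F2, 4), round(F3, 4), round(F4, 4),
              " K2..K4:", round(K2, 4), round(K3, 4), round(K4, 4))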

  10. Nano-tubular cellulose for bioprocess technology development.

    Directory of Open Access Journals (Sweden)

    Athanasios A Koutinas

    Full Text Available Delignified cellulosic material has shown a significant promotional effect on the alcoholic fermentation as yeast immobilization support. However, its potential for further biotechnological development is unexploited. This study reports the characterization of this tubular/porous cellulosic material, which was done by SEM, porosimetry and X-ray powder diffractometry. The results showed that the structure of nano-tubular cellulose (NC) justifies its suitability for use in "cold pasteurization" processes and its promoting activity in bioprocessing (fermentation). The latter was explained by a glucose pump theory. Also, it was demonstrated that crystallization of viscous invert sugar solutions during freeze drying could not be otherwise achieved unless NC was present. This effect as well as the feasibility of extremely low temperature fermentation are due to reduction of the activation energy, and have facilitated the development of technologies such as wine fermentations at home scale (in a domestic refrigerator). Moreover, NC may lead to new perspectives in research such as the development of new composites, templates for cylindrical nano-particles, etc.

  11. A factor analytic investigation of the Mercy Evaluation of Multiple Sclerosis.

    Science.gov (United States)

    Merz, Zachary C; Wright, John D; Vander Wal, Jillon S; Gfeller, Jeffrey D

    2018-01-23

    Neurocognitive deficits commonly are an accompanying feature of Multiple Sclerosis (MS). A brief, yet comprehensive neuropsychological battery is desirable for assessing the extent of these deficits. Therefore, the present study examined the validity of the Mercy Evaluation of Multiple Sclerosis (MEMS) for use with the MS population. Archival data from individuals diagnosed with MS (N = 378) by independent neurologists was examined. Cognitive domains assessed included processing speed and attention, learning, and memory, visuospatial, language, and executive functioning. A mean battery index was calculated to provide a general indicator of cognitive impairment within the current sample. Overall performance across participants was found to be in the lower limits of the average range. Results of factor analytic statistical procedures yielded a four-factor solution, accounting for 67% of total variance within the MEMS. Four neurocognitive measures exhibited the highest sensitivity in detecting cognitive impairment, constituting a psychometrically established brief cognitive screening battery, which accounted for 83% of total variance within the mean battery index score. Overall, the results of the current study suggest appropriate construct validity of the MEMS for use with individuals with MS, as well as provide support for previously established cognitive batteries.
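
    As a rough illustration of the exploratory factor-analytic procedure described above, the Python sketch below extracts four factors from a simulated score matrix; the data are random stand-ins (only the sample size mirrors the study), so the resulting loadings carry no clinical meaning.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(2)
        n_patients, n_measures = 378, 12                       # sample size mirrors the archival study
        latent = rng.normal(size=(n_patients, 4))              # four hypothetical cognitive factors
        true_loadings = rng.normal(scale=0.8, size=(4, n_measures))
        scores = latent @ true_loadings + rng.normal(scale=0.5, size=(n_patients, n_measures))

        # Fit a four-factor model with varimax rotation and inspect the loading matrix.
        fa = FactorAnalysis(n_components=4, rotation="varimax")
        fa.fit(scores)
        strength = np.sum(fa.components_ ** 2, axis=1)         # crude per-factor loading strength
        print("loading matrix shape:", fa.components_.shape)
        print("relative strength of each factor:", np.round(strength / strength.sum(), 2))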

  12. Selection of controlled variables in bioprocesses. Application to a SHARON-Anammox process for autotrophic nitrogen removal

    DEFF Research Database (Denmark)

    Mauricio Iglesias, Miguel; Valverde Perez, Borja; Sin, Gürkan

    Selecting the right controlled variables in a bioprocess is challenging since the objectives of the process (yields, product or substrate concentration) are difficult to relate to a given actuator. We apply here process control tools that can be used to assist in the selection of controlled variables to the case of the SHARON-Anammox process for autotrophic nitrogen removal.

  13. Bio-processing of solid wastes and secondary resources for metal extraction – A review

    International Nuclear Information System (INIS)

    Lee, Jae-chun; Pandey, Banshi Dhar

    2012-01-01

    Highlights: ► Review focuses on bio-extraction of metals from solid wastes of industries and consumer goods. ► Bio-processing of certain effluents/wastewaters with metals is also included in brief. ► Quantity/composition of wastes are assessed, and microbes used and leaching conditions included. ► Bio-recovery using bacteria, fungi and archaea is highlighted for resource recycling. ► Process methodology/mechanism, R and D direction and scope of large scale use are briefly included. - Abstract: Metal containing wastes/byproducts of various industries, used consumer goods, and municipal waste are potential pollutants, if not treated properly. They may also be important secondary resources if processed in an eco-friendly manner for a secured supply of the contained metals/materials. Bio-extraction of metals from such resources with microbes such as bacteria, fungi and archaea is being increasingly explored to meet the twin objectives of resource recycling and pollution mitigation. This review focuses on the bio-processing of solid wastes/byproducts of metallurgical and manufacturing industries, chemical/petrochemical plants, electroplating and tanning units, besides sewage sludge and fly ash of municipal incinerators, electronic wastes (e-wastes/PCBs), used batteries, etc. An assessment has been made to quantify the wastes generated and their compositions, the microbes used, metal leaching efficiencies, etc. Processing of certain effluents and wastewaters containing metals is also included in brief. Future directions of research are highlighted.

  14. Bioprocessing applications in the management of nuclear and chemical wastes

    International Nuclear Information System (INIS)

    Genung, R.K.

    1988-01-01

    The projected requirements for waste management and environmental restoration activities within the United States will probably cost tens of billions of dollars annually during the next two decades. Expenditures of this magnitude clearly have the potential to affect the international competitiveness of many US industries and the continued operation of many federal facilities. It is argued that the costs of implementing current technology will be too high unless the standards and schedules for compliance are relaxed. Since this is socially unacceptable, efforts to improve the efficiency of existing technologies and to develop new technologies should be pursued. A sizable research, development, and demonstration effort can be easily justified if the potential for reducing costs can be shown. Bioprocessing systems for the treatment of nuclear and chemically hazardous wastes offer such promise. 11 refs

  15. On Thermally Interacting Multiple Boreholes with Variable Heating Strength: Comparison between Analytical and Numerical Approaches

    Directory of Open Access Journals (Sweden)

    Marc A. Rosen

    2012-08-01

    Full Text Available The temperature response in the soil surrounding multiple boreholes is evaluated analytically and numerically. The assumption of constant heat flux along the borehole wall is examined by coupling the problem to the heat transfer problem inside the borehole and presenting a model with variable heat flux along the borehole length. In the analytical approach, a line source of heat with a finite length is used to model the conduction of heat in the soil surrounding the boreholes. In the numerical method, a finite volume method in a three dimensional meshed domain is used. In order to determine the heat flux boundary condition, the analytical quasi-three-dimensional solution to the heat transfer problem of the U-tube configuration inside the borehole is used. This solution takes into account the variation in heating strength along the borehole length due to the temperature variation of the fluid running in the U-tube. Thus, critical depths at which thermal interaction occurs can be determined. Finally, in order to examine the validity of the numerical method, a comparison is made with the results of line source method.
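
    The analytical part of such studies commonly rests on the finite line source solution, with a mirror image term holding the ground surface at the undisturbed temperature; the Python sketch below evaluates that solution for assumed, illustrative ground and borehole parameters and is not the record's coupled quasi-three-dimensional model with variable heating strength.

        import numpy as np
        from scipy.integrate import quad
        from scipy.special import erfc

        k = 2.5          # ground thermal conductivity, W/(m K)   (assumed)
        alpha = 1.0e-6   # ground thermal diffusivity, m^2/s      (assumed)
        H = 100.0        # borehole length, m                     (assumed)
        q = 50.0         # constant heat rate per unit length, W/m (assumed)

        def fls_temperature_rise(r, z, t):
            """Finite line source temperature rise at radius r (m), depth z (m), time t (s)."""
            def integrand(h):
                d_minus = np.sqrt(r**2 + (z - h)**2)
                d_plus = np.sqrt(r**2 + (z + h)**2)
                return (erfc(d_minus / (2.0 * np.sqrt(alpha * t))) / d_minus
                        - erfc(d_plus / (2.0 * np.sqrt(alpha * t))) / d_plus)
            integral, _ = quad(integrand, 0.0, H, limit=200)
            return q / (4.0 * np.pi * k) * integral

        # Thermal interaction between two boreholes 6 m apart after 10 years, at mid-depth:
        # superpose the response of one line source at the location of the other.
        t = 10 * 365.25 * 24 * 3600.0
        print("temperature rise at r = 6 m, z = H/2:",
              round(fls_temperature_rise(6.0, H / 2.0, t), 3), "K")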

  16. Ensuring comparability of data generated by multiple analytical laboratories for environmental decision making at the Fernald Environmental Management Project

    International Nuclear Information System (INIS)

    Sutton, C.; Campbell, B.A.; Danahy, R.J.; Dugan, T.A.; Tomlinson, F.K.

    1994-01-01

    The Fernald Environmental Management Project is a US Department of Energy (DOE)-owned facility located 17 miles northwest of Cincinnati, Ohio. From 1952 until 1989, the Fernald site provided high-purity uranium metal products to support US defense programs. In 1989 the mission of Fernald changed from one of uranium production to one of environmental restoration. At Fernald, multiple functional programs require analytical data. Inorganic and organic data for these programs are currently generated by seven laboratories, while radiochemical data are being obtained from six laboratories. Quality Assurance (QA) and Quality Control (QC) programs have been established to help ensure comparability of data generated by multiple laboratories at different times. The quality assurance program for organic and inorganic measurements specifies which analytical methodologies and sample preparation procedures are to be used based on analyte class, sample matrix, and data quality requirements. In contrast, performance specifications have been established for radiochemical analyses. A blind performance evaluation program for all laboratories, both on-site and subcontracted commercial laboratories, provides continuous feedback on data quality. The necessity for subcontractor laboratories to participate in the performance evaluation program is a contractual requirement. Similarly, subcontract laboratories are contractually required to generate data which meet radiochemical performance specifications. The Fernald on-site laboratory must also fulfill these requirements

  17. Scale breaking parton fragmentation functions, analytical parametrizations and comparison with charged multiplicities in e+e- annihilation

    International Nuclear Information System (INIS)

    Perlt, H.

    1980-01-01

    Scale breaking quark and gluon fragmentation functions obtained by solving numerically Altarelli-Parisi type equations are presented. Analytical parametrizations are given for the fragmentation of u and d quarks into pions. The calculated Q2-dependent fragmentation functions are compared with experimental data. With these scale breaking fragmentation functions the average charged multiplicity is calculated in e+e- annihilation, which rises with energy more than logarithmically and is in good agreement with experiment. (author)

  18. Bioprocessing applications in the management of nuclear and chemical wastes

    International Nuclear Information System (INIS)

    Genung, R.K.

    1989-01-01

    The US Department of Energy (DOE), the US Department of Defense (DOD), and other federal agencies already face profound challenges in finding strategies that manage budgets and priorities while bringing their sites and facilities into compliance with current statutes and regulations and with agency policies and orders. While it is often agreed that current technology can be used to address most waste management and environmental restoration needs, it is also argued by many that the costs of implementing current technology will be too high unless the standards and schedules for compliance are relaxed. Since this is socially unacceptable, efforts to improve the efficiency of existing technologies and to develop new technologies should be pursued. A sizable research, development, and demonstration effort can be easily justified if the potential for reducing costs can be shown. Bioprocessing systems for the treatment of nuclear and chemically hazardous wastes offer such promise

  19. Advances in consolidated bioprocessing systems for bioethanol and butanol production from biomass: a comprehensive review

    Directory of Open Access Journals (Sweden)

    Gholamreza Salehi Jouzani

    2015-03-01

    Full Text Available Recently, lignocellulosic biomass, as the most abundant renewable resource, has been widely considered for bioalcohol production. However, the complex structure of lignocelluloses requires a multi-step process which is costly and time consuming. Although several bioprocessing approaches have been developed for pretreatment, saccharification and fermentation, bioalcohol production from lignocelluloses is still limited because of the economic infeasibility of these technologies. This cost constraint could be overcome by designing and constructing robust cellulolytic and bioalcohol-producing microbes and by using them in a consolidated bioprocessing (CBP) system. This paper comprehensively reviews the potential, recent advances and challenges faced in CBP systems for efficient bioalcohol (ethanol and butanol) production from lignocellulosic and starchy biomass. The CBP strategies include using native single strains with cellulolytic and alcohol production activities, microbial co-cultures containing both cellulolytic and ethanologenic microorganisms, and genetic engineering of cellulolytic microorganisms to be alcohol-producing or of alcohol-producing microorganisms to be cellulolytic. Moreover, high-throughput techniques, such as metagenomics, metatranscriptomics, next generation sequencing and synthetic biology, developed to explore novel microorganisms and powerful enzymes with high activity, thermostability and pH stability, are also discussed. Currently, the CBP technology is in its infant stage, and ideal microorganisms and/or conditions at industrial scale are yet to be introduced. So it is essential to bring to attention all barriers faced and take advantage of all the experiences gained to achieve a high-yield and low-cost CBP process.

  20. Implementation and use of cloud-based electronic lab notebook in a bioprocess engineering teaching laboratory.

    Science.gov (United States)

    Riley, Erin M; Hattaway, Holly Z; Felse, P Arthur

    2017-01-01

    Electronic lab notebooks (ELNs) are better equipped than paper lab notebooks (PLNs) to handle present-day life science and engineering experiments that generate large data sets and require high levels of data integrity. However, limited training and a lack of workforce with ELN knowledge have restricted the use of ELNs in academic and industry research laboratories, which still rely on cumbersome PLNs for recordkeeping. We used LabArchives, a cloud-based ELN, in our bioprocess engineering lab course to train students in electronic record keeping, good documentation practices (GDPs), and data integrity. Implementation of the ELN in the bioprocess engineering lab course, an analysis of user experiences, and our development actions to improve ELN training are presented here. The ELN improved pedagogy and learning outcomes of the lab course through a streamlined workflow, quick data recording and archiving, and enhanced data sharing and collaboration. It also enabled superior data integrity, simplified information exchange, and allowed real-time and remote monitoring of experiments. Several attributes related to positive user experiences of the ELN improved between the two consecutive years in which the ELN was offered. Student responses also indicate that the ELN is better than the PLN for compliance. We demonstrated that an ELN can be successfully implemented in a lab course with significant benefits to pedagogy, GDP training, and data integrity. The methods and processes presented here for ELN implementation can be adapted to many types of laboratory experiments.

  1. Development of bioprocess for high density cultivation yield of the probiotic Bacillus coagulans and its spores

    Directory of Open Access Journals (Sweden)

    Kavita R. Pandey

    2016-09-01

    Full Text Available Bacillus coagulans is a spore-forming lactic acid bacterium. Spore-forming bacteria have been extensively studied and commercialized as probiotics. Probiotics are produced by fermentation technology. There is a limitation to the biomass produced by conventional modes of fermentation. With the great demand generated by a range of probiotic products, biomass is becoming very valuable for several pharmaceutical, dairy and probiotic companies. Thus, there is a need to develop high cell density cultivation processes for enhanced biomass accumulation. The bioprocess development was carried out in a 6.6 L bench-top lab-scale fermentor. Four different cultivation strategies were employed to develop a bioprocess for higher growth and sporulation efficiencies of probiotic B. coagulans. Batch fermentation of B. coagulans yielded 18 g L⁻¹ biomass (as against 8.0 g L⁻¹ productivity in shake flask) with 60% spore efficiency. Fed-batch cultivation was carried out for glucose, which yielded 25 g L⁻¹ of biomass. The C/N ratio was very crucial in achieving higher spore titres. The maximum biomass yield recorded was 30 g L⁻¹, corresponding to 3.8 × 10¹¹ cells mL⁻¹ with 81% of cells in the sporulated stage. This yield represents an 85-fold increase in productivity and a 158-fold increase in spore titres relative to the highest reported values for high density cultivation of B. coagulans.

  2. Selection of bioprocess simulation software for industrial applications.

    Science.gov (United States)

    Shanklin, T; Roper, K; Yegneswaran, P K; Marten, M R

    2001-02-20

    Two commercially available, process-simulation software packages (Aspen Batch Plus v1.2, Aspen Technology, Inc., Cambridge, Massachusetts, and Intelligen SuperPro v3.0, Intelligen, Inc., Scotch Plains, New Jersey) are evaluated for use in modeling industrial biotechnology processes. Software is quantitatively evaluated by Kepner-Tregoe Decision Analysis (Kepner and Tregoe, 1981). This evaluation shows that Aspen Batch Plus v1.2 (ABP) and Intelligen SuperPro v3.0 (ISP) can successfully perform specific simulation tasks but do not provide a complete model of all phenomena occurring within a biotechnology process. Software is best suited to provide a format for process management, using material and energy balances to answer scheduling questions, explore equipment change-outs, and calculate cost data. The ability of simulation software to accurately predict unit operation scale-up and optimize bioprocesses is limited. To realistically evaluate the software, a vaccine manufacturing process under development at Merck & Company is simulated. Case studies from the vaccine process are presented as examples of how ABP and ISP can be used to shed light on real-world processing issues. Copyright 2001 John Wiley & Sons, Inc.
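    As context for the Kepner-Tregoe evaluation mentioned above, the sketch below shows a minimal weighted-scoring comparison of the two packages. The criteria, weights, and scores are illustrative placeholders, not values from the study.

```python
# Minimal sketch of a Kepner-Tregoe-style weighted scoring comparison for two
# simulation packages. Criteria, weights, and scores are illustrative
# placeholders, not values from the study.
criteria_weights = {"material balances": 10, "scheduling": 8,
                    "cost analysis": 7, "ease of use": 5}

scores = {
    "Aspen Batch Plus": {"material balances": 8, "scheduling": 9,
                         "cost analysis": 7, "ease of use": 6},
    "Intelligen SuperPro": {"material balances": 9, "scheduling": 7,
                            "cost analysis": 8, "ease of use": 8},
}

def weighted_total(package):
    # sum of (criterion weight x score) across all criteria
    return sum(criteria_weights[c] * s for c, s in scores[package].items())

for package in scores:
    print(package, weighted_total(package))
```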

  3. Multiplicity distributions of gluon and quark jets and tests of QCD analytic predictions

    Science.gov (United States)

    OPAL Collaboration; Ackerstaff, K.; et al.

    Gluon jets are identified in e⁺e⁻ hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The charged particle multiplicity distribution of the gluon jets is presented, and is analyzed for its mean, dispersion, skew, and kurtosis values, and for its factorial and cumulant moments. The results are compared to the analogous results found for a sample of light quark (uds) jets, also defined inclusively. We observe differences between the mean, skew and kurtosis values of gluon and quark jets, but not between their dispersions. The cumulant moment results are compared to the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the data compared to a next-to-leading order calculation without energy conservation. There is agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets.
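    The factorial and cumulant moments referred to here are straightforward to compute from a measured multiplicity distribution; the sketch below does so for a placeholder distribution using the usual relations K₂ = F₂ − 1 and K₃ = F₃ − 3F₂ + 2. The example distribution is arbitrary, not OPAL data.

```python
# Minimal sketch: normalized factorial moments F_q and the first cumulant
# moments K_q of a charged-particle multiplicity distribution P(n).
# The example distribution is an arbitrary placeholder, not OPAL data.
import numpy as np

n = np.arange(0, 41)                    # multiplicity values
P = np.exp(-((n - 10) ** 2) / 20.0)     # placeholder distribution
P /= P.sum()

def factorial_moment(q):
    """Normalized factorial moment <n(n-1)...(n-q+1)> / <n>^q."""
    falling = np.ones_like(n, dtype=float)
    for k in range(q):
        falling *= (n - k)
    return (P * falling).sum() / (P * n).sum() ** q

F2, F3 = factorial_moment(2), factorial_moment(3)
K2 = F2 - 1                  # usual relations between F_q and K_q
K3 = F3 - 3 * F2 + 2
print(F2, F3, K2, K3)
```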

  4. Dynamics of complex interconnected systems: Networks and bioprocesses [A NATO study seminar

    Energy Technology Data Exchange (ETDEWEB)

    Wilhelmsen, Line K

    2005-07-01

    Rapid detection of chemical and biological agents and weapons, and rapid diagnosis of their effects on people, will require molecular recognition as well as signal discrimination, i.e. avoiding false positives and negatives, and signal transduction. It will be important to have reagentless, cheap, easily manufactured sensors that can be field deployed in large numbers. While this problem is urgent it is not yet solved. This ASI brought together researchers with various interests and backgrounds, including theoretical physicists, soft condensed matter experimentalists, biological physicists, and molecular biologists, to identify and discuss areas where synergism between modern physics and biology may be most fruitfully applied to the study of bioprocesses for molecular recognition and of networks for converting molecular reactions into usable signals and appropriate responses. (Author)

  5. Applicability of bioanalysis of multiple analytes in drug discovery and development: review of select case studies including assay development considerations.

    Science.gov (United States)

    Srinivas, Nuggehally R

    2006-05-01

    The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development culminating in a marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches including various techniques, detection systems, automation tools that are available for an effective separation, enhanced selectivity and sensitivity for quantitation of many analytes. The intention of this review is to cover various key areas where analytical method development becomes necessary during different stages of drug discovery research and development process. The key areas covered in this article with relevant case studies include: (a) simultaneous assay for parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for determination of multiple drugs in combating a disease; (c) analytical measurement of chirality aspects in the pharmacokinetics, metabolism and biotransformation investigations; (d) drug monitoring for therapeutic benefits and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism and permeability related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations-simultaneous determination of prodrug, parent and metabolites; (j) quantitative determination of parent compound and/or phase II metabolite(s) via direct or indirect approaches; (k) applicability in analysis of multiple compounds in select

  6. Integrated continuous bioprocessing: Economic, operational, and environmental feasibility for clinical and commercial antibody manufacture

    Science.gov (United States)

    Pollock, James; Coffman, Jon; Ho, Sa V.

    2017-01-01

    This paper presents a systems approach to evaluating the potential of integrated continuous bioprocessing for monoclonal antibody (mAb) manufacture across a product's lifecycle from preclinical to commercial manufacture. The economic, operational, and environmental feasibility of alternative continuous manufacturing strategies were evaluated holistically using a prototype UCL decisional tool that integrated process economics, discrete‐event simulation, environmental impact analysis, operational risk analysis, and multiattribute decision‐making. The case study focused on comparing whole bioprocesses that used either batch, continuous or a hybrid combination of batch and continuous technologies for cell culture, capture chromatography, and polishing chromatography steps. The cost of goods per gram (COG/g), E‐factor, and operational risk scores of each strategy were established across a matrix of scenarios with differing combinations of clinical development phase and company portfolio size. The tool outputs predict that the optimal strategy for early phase production and small/medium‐sized companies is the integrated continuous strategy (alternating tangential flow filtration (ATF) perfusion, continuous capture, continuous polishing). However, the top ranking strategy changes for commercial production and companies with large portfolios to the hybrid strategy with fed‐batch culture, continuous capture and batch polishing from a COG/g perspective. The multiattribute decision‐making analysis highlighted that if the operational feasibility was considered more important than the economic benefits, the hybrid strategy would be preferred for all company scales. Further considerations outside the scope of this work include the process development costs required to adopt continuous processing. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:854–866, 2017
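    The multiattribute decision-making step described above can be pictured as a weighted-sum ranking across the reported criteria (COG/g, E-factor, operational risk). The sketch below is a minimal illustration only; the attribute values and weights are placeholders, not outputs of the UCL decisional tool.

```python
# Minimal sketch of a weighted-sum multiattribute comparison between
# manufacturing strategies. Strategy names echo the abstract; attribute
# values and weights are illustrative placeholders.
strategies = {
    "batch":                 {"cog_per_g": 100, "e_factor": 1.0, "op_risk": 0.3},
    "hybrid":                {"cog_per_g": 85,  "e_factor": 0.8, "op_risk": 0.4},
    "integrated continuous": {"cog_per_g": 75,  "e_factor": 0.7, "op_risk": 0.6},
}
weights = {"cog_per_g": 0.5, "e_factor": 0.2, "op_risk": 0.3}  # sum to 1

def score(attrs):
    # lower is better for every attribute, so normalise against the worst
    # observed value and subtract from 1 before weighting
    total = 0.0
    for key, w in weights.items():
        worst = max(s[key] for s in strategies.values())
        total += w * (1.0 - attrs[key] / worst)
    return total

ranked = sorted(strategies, key=lambda s: score(strategies[s]), reverse=True)
print(ranked)
```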

  7. Scale-up bioprocess development for production of the antibiotic valinomycin in Escherichia coli based on consistent fed-batch cultivations.

    Science.gov (United States)

    Li, Jian; Jaitzig, Jennifer; Lu, Ping; Süssmuth, Roderich D; Neubauer, Peter

    2015-06-12

    Heterologous production of natural products in Escherichia coli has emerged as an attractive strategy to obtain molecules of interest. Although technically feasible, most of these processes are still constrained to laboratory-scale production. Therefore, it is necessary to develop reasonable scale-up strategies for bioprocesses aiming at the overproduction of targeted natural products under industrial scale conditions. To this end, we used the production of the antibiotic valinomycin in E. coli as a model system for scalable bioprocess development based on consistent fed-batch cultivations. In this work, the glucose-limited fed-batch strategy based on pure mineral salt medium was used throughout all scales for valinomycin production. The optimal glucose feed rate was initially detected by the use of a biocatalytically controlled glucose release (EnBase® technology) in parallel cultivations in 24-well plates with continuous monitoring of pH and dissolved oxygen. These results were confirmed in shake flasks, where the accumulation of valinomycin was highest when the specific growth rate decreased below 0.1 h⁻¹. This correlation was also observed for high cell density fed-batch cultivations in a lab-scale bioreactor. The bioreactor fermentation produced valinomycin with titers of more than 2 mg L⁻¹ based on the feeding of a concentrated glucose solution. Valinomycin production was not affected by oscillating conditions (i.e. glucose and oxygen) in a scale-down two-compartment reactor, which could mimic similar situations in industrial bioreactors, suggesting that the process is very robust and that scaling of the process to a larger industrial scale appears to be a realistic scenario. Valinomycin production was scaled up from mL volumes to 10 L with consistent use of the fed-batch technology. This work presents a robust and reliable approach for scalable bioprocess development and represents an example for the consistent development of a process for a heterologously expressed natural
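    The glucose-limited fed-batch strategy described above is commonly implemented as an exponential feed that holds the specific growth rate at a set point; the sketch below shows one such profile pinned at the 0.1 h⁻¹ threshold noted in the abstract. The yield, maintenance, and starting values are assumptions, not numbers taken from the paper.

```python
# Minimal sketch of an exponential, glucose-limited feed profile that holds
# the specific growth rate at a set point. All parameter values are assumed.
import math

mu_set = 0.10      # 1/h, target specific growth rate
Yxs    = 0.45      # g biomass / g glucose (assumed)
ms     = 0.02      # g glucose / g biomass / h, maintenance (assumed)
X0, V0 = 10.0, 5.0 # g/L biomass and L culture volume at feed start (assumed)
Sf     = 500.0     # g/L glucose in the feed solution (assumed)

def feed_rate(t_h):
    """Feed rate in L/h at time t_h hours after the feed start."""
    return (mu_set / Yxs + ms) * X0 * V0 * math.exp(mu_set * t_h) / Sf

for t in range(0, 13, 4):
    print(t, round(feed_rate(t), 4))
```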

  8. Development of a high-throughput microscale cell disruption platform for Pichia pastoris in rapid bioprocess design.

    Science.gov (United States)

    Bláha, Benjamin A F; Morris, Stephen A; Ogonah, Olotu W; Maucourant, Sophie; Crescente, Vincenzo; Rosenberg, William; Mukhopadhyay, Tarit K

    2018-01-01

    The time and cost benefits of miniaturized fermentation platforms can only be gained by employing complementary techniques facilitating high throughput at small sample volumes. Microbial cell disruption is a major bottleneck in experimental throughput and is often restricted to large processing volumes. Moreover, for rigid yeast species, such as Pichia pastoris, no effective high-throughput disruption methods exist. The development of an automated, miniaturized, high-throughput, noncontact, scalable platform based on adaptive focused acoustics (AFA) to disrupt P. pastoris and recover intracellular heterologous protein is described. Augmented modes of AFA were established by investigating vessel designs and a novel enzymatic pretreatment step. Three different modes of AFA were studied and compared to the performance of high-pressure homogenization. For each of these modes of cell disruption, response models were developed to account for five different performance criteria. Using multiple responses not only demonstrated that different operating parameters are required for different response optima, with highest product purity requiring suboptimal values for other criteria, but also allowed for AFA-based methods to mimic large-scale homogenization processes. These results demonstrate that AFA-mediated cell disruption can be used for a wide range of applications including buffer development, strain selection, fermentation process development, and whole bioprocess integration. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 34:130-140, 2018. © 2017 American Institute of Chemical Engineers.

  9. Development of precise analytical methods for strontium and lanthanide isotopic ratios using multiple collector inductively coupled plasma mass spectrometry

    International Nuclear Information System (INIS)

    Ohno, Takeshi; Takaku, Yuichi; Hisamatsu, Shun'ichi

    2007-01-01

    We have developed precise analytical methods for strontium and lanthanide isotopic ratios using multiple collector-ICP-mass spectrometry (MC-ICP-MS) for experimental and environmental studies of their behavior. In order to obtain precise isotopic data using MC-ICP-MS, the mass discrimination effect was corrected by an exponential law correction method. The resulting isotopic data demonstrated that highly precise isotopic analyses (better than 0.1 per mille as 2SD) could be achieved. We also adopted a de-solvating nebulizer system to improve the sensitivity. This system could minimize the water load into the plasma and provided about five times larger intensity of analyte than a conventional nebulizer system did. (author)
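    The exponential-law correction mentioned in the abstract is typically applied by internal normalisation; the sketch below corrects a measured ⁸⁷Sr/⁸⁶Sr ratio against a fixed ⁸⁸Sr/⁸⁶Sr value. The measured ratios are placeholders, and the normalisation constant is the conventional literature value rather than a number from the paper.

```python
# Minimal sketch of an exponential-law mass bias correction for 87Sr/86Sr,
# internally normalised to a fixed 88Sr/86Sr ratio. Masses are nominal and
# the measured ratios below are placeholders, not data from the paper.
import math

M86, M87, M88 = 85.9093, 86.9089, 87.9056   # approximate atomic masses
R88_86_REF    = 8.375209                    # conventional normalisation value

def correct_87_86(r87_86_meas, r88_86_meas):
    # mass bias exponent from the invariant 88Sr/86Sr ratio
    beta = math.log(R88_86_REF / r88_86_meas) / math.log(M88 / M86)
    # apply the same exponent to the analyte ratio
    return r87_86_meas * (M87 / M86) ** beta

print(correct_87_86(0.71023, 8.3292))
```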

  10. New perspectives for the petroleum industry. Bioprocesses for the selective removal of sulphur, nitrogen and metals

    International Nuclear Information System (INIS)

    Zerlia, T.

    2000-01-01

    Fuel biocatalytic conversion is a process that removes, through selective enzyme-catalyzed reactions, sulphur, nitrogen and metals. The mild operating conditions, the specificity of reactions and the quality of coproducts (particularly the organosulphur compounds, a source for the petrochemical industry) are just a few of the attractive aspects of this new technology, which could open a new world of possibilities in the technology and in the environmental impact of fuels. The paper presents the state of the art of research and applications of bioprocesses in the petroleum field.

  11. An Intelligent Automation Platform for Rapid Bioprocess Design.

    Science.gov (United States)

    Wu, Tianyi; Zhou, Yuhong

    2014-08-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.

  12. An Intelligent Automation Platform for Rapid Bioprocess Design

    Science.gov (United States)

    Wu, Tianyi

    2014-01-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user’s inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. PMID:24088579

  13. Integrated continuous bioprocessing: Economic, operational, and environmental feasibility for clinical and commercial antibody manufacture.

    Science.gov (United States)

    Pollock, James; Coffman, Jon; Ho, Sa V; Farid, Suzanne S

    2017-07-01

    This paper presents a systems approach to evaluating the potential of integrated continuous bioprocessing for monoclonal antibody (mAb) manufacture across a product's lifecycle from preclinical to commercial manufacture. The economic, operational, and environmental feasibility of alternative continuous manufacturing strategies were evaluated holistically using a prototype UCL decisional tool that integrated process economics, discrete-event simulation, environmental impact analysis, operational risk analysis, and multiattribute decision-making. The case study focused on comparing whole bioprocesses that used either batch, continuous or a hybrid combination of batch and continuous technologies for cell culture, capture chromatography, and polishing chromatography steps. The cost of goods per gram (COG/g), E-factor, and operational risk scores of each strategy were established across a matrix of scenarios with differing combinations of clinical development phase and company portfolio size. The tool outputs predict that the optimal strategy for early phase production and small/medium-sized companies is the integrated continuous strategy (alternating tangential flow filtration (ATF) perfusion, continuous capture, continuous polishing). However, the top ranking strategy changes for commercial production and companies with large portfolios to the hybrid strategy with fed-batch culture, continuous capture and batch polishing from a COG/g perspective. The multiattribute decision-making analysis highlighted that if the operational feasibility was considered more important than the economic benefits, the hybrid strategy would be preferred for all company scales. Further considerations outside the scope of this work include the process development costs required to adopt continuous processing. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:854-866, 2017. © 2017 The

  14. Bessel Fourier orientation reconstruction: an analytical EAP reconstruction using multiple shell acquisitions in diffusion MRI.

    Science.gov (United States)

    Hosseinbor, Ameer Pasha; Chung, Moo K; Wu, Yu-Chien; Alexander, Andrew L

    2011-01-01

    The estimation of the ensemble average propagator (EAP) directly from q-space DWI signals is an open problem in diffusion MRI. Diffusion spectrum imaging (DSI) is one common technique to compute the EAP directly from the diffusion signal, but it is burdened by the large sampling required. Recently, several analytical EAP reconstruction schemes for multiple q-shell acquisitions have been proposed. One, in particular, is Diffusion Propagator Imaging (DPI), which is based on Laplace's equation estimation of the diffusion signal for each shell acquisition. Viewed intuitively in terms of the heat equation, the DPI solution is obtained when the heat distribution between temperature measurements at each shell is at steady state. We propose a generalized extension of DPI, Bessel Fourier Orientation Reconstruction (BFOR), whose solution is based on heat equation estimation of the diffusion signal for each shell acquisition. That is, the heat distribution between shell measurements is no longer at steady state. In addition to being analytical, the BFOR solution also includes an intrinsic exponential smoothing term. We illustrate the effectiveness of the proposed method by showing results on both synthetic and real MR datasets.

  15. Mesoderm Lineage 3D Tissue Constructs Are Produced at Large-Scale in a 3D Stem Cell Bioprocess.

    Science.gov (United States)

    Cha, Jae Min; Mantalaris, Athanasios; Jung, Sunyoung; Ji, Yurim; Bang, Oh Young; Bae, Hojae

    2017-09-01

    Various studies have presented different approaches to direct pluripotent stem cell differentiation, such as applying defined sets of exogenous biochemical signals and genetic/epigenetic modifications. Although differentiation to target lineages can be successfully regulated, such conventional methods are often complicated, laborious, and not cost-effective enough to be employed for the large-scale production of 3D stem cell-based tissue constructs. A 3D-culture platform that could realize the large-scale production of mesoderm lineage tissue constructs from embryonic stem cells (ESCs) is developed. ESCs are cultured using our previously established 3D-bioprocess platform which is amenable to mass-production of 3D ESC-based tissue constructs. Hepatocarcinoma cell line conditioned medium is introduced to the large-scale 3D culture to provide a specific biomolecular microenvironment to mimic the in vivo mesoderm formation process. After a 5-day spontaneous differentiation period, the resulting 3D tissue constructs are composed of multipotent mesodermal progenitor cells, verified by gene and molecular expression profiles. Subsequently, the optimal time points to trigger terminal differentiation towards cardiomyogenesis or osteogenesis from the mesodermal tissue constructs are found. A simple and affordable 3D ESC-bioprocess that enables the scalable production of mesoderm-origin tissues with significantly improved corresponding tissue properties is demonstrated. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Consolidated bioprocessing for production of polyhydroxyalkanotes from red algae Gelidium amansii.

    Science.gov (United States)

    Sawant, Shailesh S; Salunke, Bipinchandra K; Kim, Beom Soo

    2018-04-01

    Noncompetitive carbon sources such as algae are an unconventional and promising raw material for sustainable biofuel production. The capability of one marine bacterium, Saccharophagus degradans 2-40, to degrade the red seaweed Gelidium amansii for production of polyhydroxyalkanoates (PHA) was evaluated in this study. S. degradans can readily attach to algae, degrade algal carbohydrates, and utilize that material as its main carbon source. Minimal media containing 8 g/L G. amansii were used for the growth of S. degradans. The PHA content obtained was 17-27% of dry cell weight by pure culture of S. degradans and co-culture of S. degradans and Bacillus cereus, a contaminant found with S. degradans cultures. The PHA type was found to be poly(3-hydroxybutyrate) by gas chromatography and Fourier transform-infrared spectroscopy. This work demonstrates PHA production through consolidated bioprocessing of insoluble, untreated red algae by bacterial pure culture and co-culture. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Bioprocess systems engineering: transferring traditional process engineering principles to industrial biotechnology.

    Science.gov (United States)

    Koutinas, Michalis; Kiparissides, Alexandros; Pistikopoulos, Efstratios N; Mantalaris, Athanasios

    2012-01-01

    The complexity of the regulatory network and the interactions that occur in the intracellular environment of microorganisms highlight the importance in developing tractable mechanistic models of cellular functions and systematic approaches for modelling biological systems. To this end, the existing process systems engineering approaches can serve as a vehicle for understanding, integrating and designing biological systems and processes. Here, we review the application of a holistic approach for the development of mathematical models of biological systems, from the initial conception of the model to its final application in model-based control and optimisation. We also discuss the use of mechanistic models that account for gene regulation, in an attempt to advance the empirical expressions traditionally used to describe micro-organism growth kinetics, and we highlight current and future challenges in mathematical biology. The modelling research framework discussed herein could prove beneficial for the design of optimal bioprocesses, employing rational and feasible approaches towards the efficient production of chemicals and pharmaceuticals.
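    As a point of reference for the empirical growth expressions the review aims to move beyond, the sketch below integrates a simple unstructured Monod model (growth plus substrate consumption) with an explicit Euler step; all parameter values are illustrative only.

```python
# Minimal sketch of an unstructured Monod growth model with substrate
# consumption, integrated with a simple explicit Euler step.
# Parameter values are illustrative placeholders.
mu_max, Ks, Yxs = 0.4, 0.5, 0.5   # 1/h, g/L, g biomass per g substrate
X, S, dt = 0.1, 20.0, 0.01        # initial biomass, substrate (g/L), step (h)

t = 0.0
while S > 1e-3 and t < 48.0:
    mu = mu_max * S / (Ks + S)    # Monod specific growth rate
    dX = mu * X * dt              # biomass formed in this time step
    X += dX
    S = max(S - dX / Yxs, 0.0)    # substrate consumed for that biomass
    t += dt

print(round(t, 2), round(X, 3), round(S, 4))
```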

  18. Bioprocess systems engineering: transferring traditional process engineering principles to industrial biotechnology

    Directory of Open Access Journals (Sweden)

    Michalis Koutinas

    2012-10-01

    Full Text Available The complexity of the regulatory network and the interactions that occur in the intracellular environment of microorganisms highlight the importance in developing tractable mechanistic models of cellular functions and systematic approaches for modelling biological systems. To this end, the existing process systems engineering approaches can serve as a vehicle for understanding, integrating and designing biological systems and processes. Here, we review the application of a holistic approach for the development of mathematical models of biological systems, from the initial conception of the model to its final application in model-based control & optimisation. We also discuss the use of mechanistic models that account for gene regulation, in an attempt to advance the empirical expressions traditionally used to describe micro-organism growth kinetics, and we highlight current and future challenges in mathematical biology. The modelling research framework discussed herein could prove beneficial for the design of optimal bioprocesses, employing rational and feasible approaches towards the efficient production of chemicals and pharmaceuticals.

  19. BIOPROCESS SYSTEMS ENGINEERING: TRANSFERRING TRADITIONAL PROCESS ENGINEERING PRINCIPLES TO INDUSTRIAL BIOTECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Michalis Koutinas

    2012-10-01

    Full Text Available The complexity of the regulatory network and the interactions that occur in the intracellular environment of microorganisms highlight the importance in developing tractable mechanistic models of cellular functions and systematic approaches for modelling biological systems. To this end, the existing process systems engineering approaches can serve as a vehicle for understanding, integrating and designing biological systems and processes. Here, we review the application of a holistic approach for the development of mathematical models of biological systems, from the initial conception of the model to its final application in model-based control and optimisation. We also discuss the use of mechanistic models that account for gene regulation, in an attempt to advance the empirical expressions traditionally used to describe micro-organism growth kinetics, and we highlight current and future challenges in mathematical biology. The modelling research framework discussed herein could prove beneficial for the design of optimal bioprocesses, employing rational and feasible approaches towards the efficient production of chemicals and pharmaceuticals.

  20. Micro-electromembrane extraction using multiple free liquid membranes and acceptor solutions - Towards selective extractions of analytes based on their acid-base strength

    Czech Academy of Sciences Publication Activity Database

    Kubáň, Pavel; Seip, K. F.; Gjelstad, A.; Pedersen-Bjergaard, S.

    2016-01-01

    Roč. 943, NOV (2016), s. 64-73 ISSN 0003-2670 R&D Projects: GA ČR(CZ) GA16-09135S Institutional support: RVO:68081715 Keywords : multiple phase extraction * electromembrane extraction * plasma Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 4.950, year: 2016

  1. Micro-electromembrane extraction using multiple free liquid membranes and acceptor solutions - Towards selective extractions of analytes based on their acid-base strength

    Czech Academy of Sciences Publication Activity Database

    Kubáň, Pavel; Seip, K. F.; Gjelstad, A.; Pedersen-Bjergaard, S.

    2016-01-01

    Roč. 943, NOV (2016), s. 64-73 ISSN 0003-2670 R&D Projects: GA ČR(CZ) GA16-09135S Institutional support: RVO:68081715 Keywords: multiple phase extraction * electromembrane extraction * plasma Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 4.950, year: 2016

  2. Bimolecular Master Equations for a Single and Multiple Potential Wells with Analytic Solutions.

    Science.gov (United States)

    Ghaderi, Nima

    2018-04-12

    The analytic solutions, that is, populations, are derived for the K-adiabatic and K-active bimolecular master equations, separately, for a single and multiple potential wells and reaction channels, where K is the component of the total angular momentum J along the axis of least moment of inertia of the recombination products at a given energy E. The analytic approach provides the functional dependence of the population of molecules on its K-active or K-adiabatic dissociation, association rate constants and the intermolecular energy transfer, where the approach may complement the usual numerical approaches for reactions of interest. Our previous work, Part I, considered the solutions for a single potential well, whereby an assumption utilized there is presently obviated in the derivation of the exact solutions and further discussed. At the high-pressure limit, the K-adiabatic and K-active bimolecular master equations may each reduce, respectively, to the K-adiabatic and K-active bimolecular Rice-Ramsperger-Kassel-Marcus theory (high-pressure limit expressions) for the bimolecular recombination rate constant, for a single potential well, augmented by isomerization terms when multiple potential wells are present. In the low-pressure limit, the expression for population above the dissociation limit, associated with a single potential well, becomes equivalent to the usual presumed detailed balance between the association and dissociation rate constants, where the multiple well case is also considered. When the collision frequency of energy transfer, Z_LJ, between the chemical intermediate and the bath gas is sufficiently less than the dissociation rate constant k_d(E'J'K') for the postcollision state (E'J'K'), then the solution for the population g(EJK)+ above the critical energy further simplifies such that, depending on Z_LJ, the dissociation rate constant and the association rate constant k_r(EJK), it becomes g(EJK)+ = k_r(EJK)·A·BC/[Z_LJ + k_d(EJK)], where A and BC are the reactants, for
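    For readability, the low-pressure-limit expression quoted at the end of the abstract can be typeset as below; interpreting A·BC as the product of the reactant concentrations is an assumption made here for clarity.

```latex
% Low-pressure-limit population above the critical energy, as quoted in the
% abstract; A and BC denote the recombining reactants.
g(EJK)^{+} \;=\; \frac{k_r(EJK)\,\mathrm{A}\cdot\mathrm{BC}}{Z_{\mathrm{LJ}} + k_d(EJK)}
```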

  3. Postprandial glucose metabolism and SCFA after consuming wholegrain rye bread and wheat bread enriched with bioprocessed rye bran in individuals with mild gastrointestinal symptoms

    DEFF Research Database (Denmark)

    Lappi, J; Mykkänen, H; Knudsen, Knud Erik Bach

    2014-01-01

    Background: Rye bread benefits glucose metabolism. It is unknown whether the same effect is achieved by rye bran-enriched wheat bread. We tested whether white wheat bread enriched with bioprocessed rye bran (BRB + WW) and sourdough wholegrain rye bread (WGR) have similar effects on glucose metabolism...... and plasma level of short chain fatty acids (SCFAs). Methods: Twenty-one (12 women) of 23 recruited subjects completed an intervention with a four-week run-in and two four-week test periods in cross-over design. White wheat bread (WW; 3% fibre) was consumed during the run-in, and WGR and BRB + WW (10% fibre.......05) and propionate (p = 0.009) at 30 min increased during both rye bread periods. Conclusions: Beneficial effects of WGR over white wheat bread on glucose and SCFA production were confirmed. The enrichment of the white wheat bread with bioprocessed rye bran (BRB + WW) yielded similar but not as pronounced effects than

  4. Sound propagation in dilute suspensions of spheres: Analytical comparison between coupled phase model and multiple scattering theory.

    Science.gov (United States)

    Valier-Brasier, Tony; Conoir, Jean-Marc; Coulouvrat, François; Thomas, Jean-Louis

    2015-10-01

    Sound propagation in dilute suspensions of small spheres is studied using two models: a hydrodynamic model based on the coupled phase equations and an acoustic model based on the ECAH (Epstein-Carhart-Allegra-Hawley) multiple scattering theory. The aim is to compare both models through the study of three fundamental kinds of particles: rigid particles, elastic spheres, and viscous droplets. The hydrodynamic model is based on a Rayleigh-Plesset-like equation generalized to elastic spheres and viscous droplets. The hydrodynamic forces for elastic spheres are introduced by analogy with those of droplets. The ECAH theory is also modified in order to take into account the velocity of rigid particles. Analytical calculations performed for long wavelength, low dilution, and weak absorption in the ambient fluid show that both models are strictly equivalent for the three kinds of particles studied. The analytical calculations show that dilatational and translational mechanisms are modeled in the same way by both models. The effective parameters of dilute suspensions are also calculated.

  5. Multiple scattering of MeV ions: Comparison between the analytical theory and Monte-Carlo and molecular dynamics simulations

    International Nuclear Information System (INIS)

    Mayer, M.; Arstila, K.; Nordlund, K.; Edelmann, E.; Keinonen, J.

    2006-01-01

    Angular and energy distributions due to multiple small angle scattering were calculated with different models, namely from the analytical Szilagyi theory, the Monte-Carlo code MCERD in binary collision approximation and the molecular dynamics code MDRANGE, for 2 MeV ⁴He in Au at backscattering geometry and for 20 MeV ¹²⁷I recoil analysis of carbon. The widths and detailed shapes of the distributions are compared, and reasons for deviations between the different models are discussed.

  6. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    International Nuclear Information System (INIS)

    Gary, J. William

    1999-01-01

    Gluon jets are identified in e⁺e⁻ hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skew and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP.

  7. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    Energy Technology Data Exchange (ETDEWEB)

    Gary, J. William

    1999-03-01

    Gluon jets are identified in e⁺e⁻ hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skew and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP.

  8. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    Energy Technology Data Exchange (ETDEWEB)

    Gary, J.W. [California Univ., Riverside, CA (United States). Dept. of Physics

    1999-03-01

    Gluon jets are identified in e⁺e⁻ hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skew and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP. (orig.) 6 refs.

  9. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    International Nuclear Information System (INIS)

    Gary, J.W.

    1999-01-01

    Gluon jets are identified in e⁺e⁻ hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skew and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP. (orig.)

  10. Multiplicity distributions of gluon and quark jets and a test of QCD analytic calculations

    Science.gov (United States)

    Gary, J. William

    1999-03-01

    Gluon jets are identified in e⁺e⁻ hadronic annihilation events by tagging two quark jets in the same hemisphere of an event. The gluon jet is defined inclusively as all the particles in the opposite hemisphere. Gluon jets defined in this manner have a close correspondence to gluon jets as they are defined for analytic calculations, and are almost independent of a jet finding algorithm. The mean and first few higher moments of the gluon jet charged particle multiplicity distribution are compared to the analogous results found for light quark (uds) jets, also defined inclusively. Large differences are observed between the mean, skew and kurtosis values of the gluon and quark jets, but not between their dispersions. The cumulant factorial moments of the distributions are also measured, and are used to test the predictions of QCD analytic calculations. A calculation which includes next-to-next-to-leading order corrections and energy conservation is observed to provide a much improved description of the separated gluon and quark jet cumulant moments compared to a next-to-leading order calculation without energy conservation. There is good quantitative agreement between the data and calculations for the ratios of the cumulant moments between gluon and quark jets. The data sample used is the LEP-1 sample of the OPAL experiment at LEP.

  11. Are Higher Education Institutions Prepared for Learning Analytics?

    Science.gov (United States)

    Ifenthaler, Dirk

    2017-01-01

    Higher education institutions and involved stakeholders can derive multiple benefits from learning analytics by using different data analytics strategies to produce summative, real-time, and predictive insights and recommendations. However, are institutions and academic as well as administrative staff prepared for learning analytics? A learning…

  12. Analytical calculations of multiple scattering for high energy photons and neutrons

    International Nuclear Information System (INIS)

    Thoe, R.S.

    1994-04-01

    Radiography of large dense objects often requires the use of highly penetrating radiation. For example, a couple of centimeters of steel attenuates 50 keV x-rays by a factor of approximately 10⁻¹⁴, whereas this same amount of steel would attenuate a 500 keV photon beam by only a factor of about 0.25. However, this increase in penetrating power comes with a price. In the case of x-radiation there are two bills to pay: (1) For projection radiography, this increase in penetration directly causes a corresponding decrease in resolution. (2) This increase in penetration occurs in a region where the interaction of radiation and matter is changing from absorption to scattering. In the above example the fraction of scattering goes from about 0.1 at 50 keV to over 0.99 at 500 keV. These scattered photons can significantly degrade contrast. In order to overcome some of these difficulties, radiography using scattered photons has been studied by myself and numerous other authors. In all the above cases, calculation of the intensity of scattered radiation is of primary importance. In cases where scattering is probable, multiple scattering can also be probable. Calculations of multiple scattering are generally very difficult and usually require the use of extremely sophisticated Monte Carlo simulations. It is not unusual for these calculations to require several hours of CPU time on some of the world's largest and fastest supercomputers. In this paper I will present an alternative approach. I will present an analytical solution to the equations of double scattering, and show how this solution can be extended to the case of higher order scattering. Finally, I will give numerical examples of these solutions and compare them to solutions obtained by Monte Carlo simulations.
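    The attenuation figures in the abstract follow directly from the Beer-Lambert law; the sketch below reproduces the order-of-magnitude argument for roughly 2 cm of steel. The mass attenuation coefficients are approximate literature values for iron and are assumptions, not numbers from the report.

```python
# Minimal sketch reproducing the attenuation argument with the Beer-Lambert
# law, I/I0 = exp(-mu * x), for ~2 cm of steel. The mass attenuation
# coefficients are approximate values for iron (assumed, not from the report).
import math

rho_fe = 7.87                          # g/cm^3, density of iron
mu_over_rho = {50: 1.96, 500: 0.084}   # cm^2/g at 50 keV and 500 keV (approx.)
thickness = 2.0                        # cm

for energy_kev, mr in mu_over_rho.items():
    transmission = math.exp(-mr * rho_fe * thickness)
    print(f"{energy_kev} keV: transmitted fraction ~ {transmission:.2e}")
```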

  13. Production of a generic microbial feedstock for lignocellulose biorefineries through sequential bioprocessing.

    Science.gov (United States)

    Chang, Chen-Wei; Webb, Colin

    2017-03-01

    Lignocellulosic materials, mostly from agricultural and forestry residues, provide a potential renewable resource for sustainable biorefineries. Reducing sugars can be produced only after a pre-treatment stage, which normally involves chemicals but can be biological. In this case, two steps are usually necessary: solid-state cultivation of fungi for deconstruction, followed by enzymatic hydrolysis using cellulolytic enzymes. In this research, the utilisation of solid-state bioprocessing using the fungus Trichoderma longibrachiatum was implemented as a simultaneous microbial pretreatment and in-situ enzyme production method for fungal autolysis and further enzyme hydrolysis of fermented solids. Suspending the fermented solids in water at 50°C led to the highest hydrolysis yields of 226 mg/g reducing sugar and 7.7 mg/g free amino nitrogen (FAN). The resultant feedstock was shown to be suitable for the production of various products including ethanol. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. HEK293 cell culture media study towards bioprocess optimization: Animal derived component free and animal derived component containing platforms.

    Science.gov (United States)

    Liste-Calleja, Leticia; Lecina, Martí; Cairó, Jordi Joan

    2014-04-01

    The increasing demand for biopharmaceuticals produced in mammalian cells has led industries to enhance bioprocess volumetric productivity through different strategies. Among those strategies, cell culture media development is of major interest. In the present work, several commercially available culture media for Human Embryonic Kidney cells (HEK293) were evaluated in terms of maximal specific growth rate and maximal viable cell concentration supported. The main objective was to provide different cell culture platforms which are suitable for a wide range of applications depending on the type and the final use of the product obtained. Performing simple media supplementations with and without animal-derived components, an enhancement of cell concentration from 2 × 10⁶ cells/mL to 17 × 10⁶ cells/mL was achieved in batch mode operation. Additionally, the media were evaluated for adenovirus production as a specific application case of HEK293 cells. None of the supplements interfered significantly with the adenovirus infection, although some differences were encountered in viral productivity. To the best of our knowledge, the high cell density achieved in the work presented has never been reported before in HEK293 batch cell cultures, and thus our results are highly promising for further study of cell culture strategies in bioreactors towards bioprocess optimization. Copyright © 2013 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  15. Bioprocessing of lignite coals using reductive microorganisms. Final technical report, September 30, 1988--March 29, 1992

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, D.L.

    1992-03-29

    In order to convert lignite coals into liquid fuels, gases or chemical feedstock, the macromolecular structure of the coal must be broken down into low molecular weight fractions prior to further modification. Our research focused on this aspect of coal bioprocessing. We isolated, characterized and studied the lignite coal-depolymerizing organisms Streptomyces viridosporus T7A, Pseudomonas sp. DLC-62, unidentified bacterial strain DLC-BB2 and Gram-positive Bacillus megaterium strain DLC-21. In this research we showed that these bacteria are able to solubilize and depolymerize lignite coals using a combination of biological mechanisms, including the excretion of coal-solubilizing basic chemical metabolites and extracellular coal-depolymerizing enzymes.

  16. Advances in Consolidated Bioprocessing Using Clostridium thermocellum and Thermoanaerobacter saccharolyticum

    Energy Technology Data Exchange (ETDEWEB)

    Lynd, Lee R. [Dartmouth College, Thayer School of Engineering; Guss, Adam M. [ORNL; Himmel, Mike [National Renewable Energy Laboratory (NREL); Beri, Dhananjay [Dartmouth College, Thayer School of Engineering; Herring, Christopher [Mascoma Corporation; Holwerda, Evert [Dartmouth College, Thayer School of Engineering; Murphy, Sean J. [Dartmouth College, Thayer School of Engineering; Olson, Daniel G. [Dartmouth College, Thayer School of Engineering; Paye, Julie [Dartmouth College, Thayer School of Engineering; Rydzak, Thomas [ORNL; Shao, Xiongjun [Dartmouth College, Thayer School of Engineering; Tian, Liang [Dartmouth College, Thayer School of Engineering; Worthen, Robert [Dartmouth College, Thayer School of Engineering

    2016-11-01

    Recent advances are addressed pertaining to consolidated bioprocessing (CBP) of plant cell walls to ethanol using two thermophilic, saccharolytic bacteria: the cellulose-fermenting Clostridium thermocellum and the hemicellulose-fermenting Thermoanaerobacterium saccharolyticum. On the basis of the largest comparative dataset assembled to date, it appears that C. thermocellum is substantially more effective at solubilizing unpretreated plant cell walls than industry-standard fungal cellulase, and that this is particularly the case for more recalcitrant feedstocks. The distinctive central metabolism of C. thermocellum appears to involve more extensive energy coupling (e.g., on the order of 5 ATP per glucosyl moiety) than most fermentative anaerobes. Ethanol yields and titers realized by engineered strains of T. saccharolyticum meet standards for industrial feasibility and provide an important proof of concept as well as a model that may be emulated in other organisms. Progress has also been made with C. thermocellum, although not yet to this extent. The current state of strain development is summarized and outstanding challenges for commercial application are discussed. We speculate that CBP organism development is more promising starting with naturally occurring cellulolytic microbes as compared to starting with noncellulolytic hosts.

  17. Development and application of an excitation ratiometric optical pH sensor for bioprocess monitoring.

    Science.gov (United States)

    Badugu, Ramachandram; Kostov, Yordan; Rao, Govind; Tolosa, Leah

    2008-01-01

    The development of a fluorescent excitation ratiometric pH sensor (AHQ-PEG) using a novel allylhydroxyquinolinium (AHQ) derivative copolymerized with polyethylene glycol dimethacrylate (PEG) is described. The AHQ-PEG sensor film is shown to be suitable for real-time, noninvasive, continuous, online pH monitoring of bioprocesses. Optical ratiometric measurements are generally more reliable, robust, inexpensive, and insensitive to experimental errors such as fluctuations in the source intensity and fluorophore photobleaching. The sensor AHQ-PEG in deionized water was shown to exhibit two excitation maxima at 375 and 425 nm with a single emission peak at 520 nm. Excitation spectra of AHQ-PEG show a decrease in emission at the 360 nm excitation and an increase at the 420 nm excitation with increasing pH. Accordingly, the ratio of emission at 420:360 nm excitation showed a maximum change between pH 5 and 8 with an apparent pKa of 6.40. The low pKa value is suitable for monitoring the fermentation of most industrially important microorganisms. Additionally, the AHQ-PEG sensor was shown to have minimal sensitivity to ionic strength and temperature. Because AHQ is covalently attached to PEG, the film shows no probe leaching and is sterilizable by steam and alcohol. It shows rapid (approximately 2 min) and reversible response to pH over many cycles without any photobleaching. Subsequently, the AHQ-PEG sensor film was tested for its suitability in monitoring the pH of S. cerevisiae (yeast) fermentation. The observed pH using the AHQ-PEG film is in agreement with a conventional glass pH electrode. However, unlike the glass electrode, the present sensor is easily adaptable to noninvasive monitoring of sterilized, closed bioprocess environments without the awkward wire connections that electrodes require. In addition, the AHQ-PEG sensor is easily miniaturized to fit in microwell plates and microbioreactors for high-throughput cell culture applications.
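    Converting the measured excitation ratio to pH is typically done with a Henderson-Hasselbalch-type calibration around the apparent pKa; the sketch below assumes such a form with the reported pKa of 6.40. The calibration endpoints are hypothetical, and a real calibration may also need a brightness-ratio correction factor.

```python
# Minimal sketch of converting an excitation ratio (F_ex420 / F_ex360) to pH
# with a Henderson-Hasselbalch-type calibration around the reported apparent
# pKa of 6.40. R_min and R_max are hypothetical calibration endpoints.
import math

pKa, R_min, R_max = 6.40, 0.2, 3.5   # endpoints assumed from a calibration

def ratio_to_pH(R):
    return pKa + math.log10((R - R_min) / (R_max - R))

def pH_to_ratio(pH):
    x = 10 ** (pH - pKa)
    return (R_min + R_max * x) / (1 + x)

print(round(ratio_to_pH(pH_to_ratio(6.8)), 2))   # round-trip check -> 6.8
```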

  18. Multiple analyte adduct formation in liquid chromatography-tandem mass spectrometry - Advantages and limitations in the analysis of biologically-related samples.

    Science.gov (United States)

    Dziadosz, Marek

    2018-05-01

    Multiple analyte adduct formation was examined and discussed in the context of reproducible signal detection in liquid chromatography-tandem mass spectrometry applied in the analysis of biologically-related samples. Appropriate infusion solutions were prepared in H2O/methanol (3/97, v/v) with 1 mM sodium acetate and 10 mM acetic acid. An API 4000 QTrap tandem mass spectrometer was used for experiments performed in the negative scan mode (-Q1 MS) and the negative enhanced product ion mode (-EPI). γ‑Hydroxybutyrate and its deuterated form were used as model compounds to highlight both the complexity of adduct formation in popular mobile phases used and the effective signal compensation by the application of isotope-labelled analytes as internal standards. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Process integration in bioprocess industry: waste heat recovery in yeast and ethyl alcohol plant

    International Nuclear Information System (INIS)

    Raskovic, P.; Anastasovski, A.; Markovska, Lj.; Mesko, V.

    2010-01-01

    The process integration of a bioprocess plant for the production of yeast and alcohol was studied. A preliminary energy audit of the plant identified large thermal losses, caused by waste heat in exhausted process streams, and revealed great potential for improving energy efficiency with a heat recovery system. The research roadmap, based on a process integration approach, is divided into six phases, and the primary tool used for the design of the heat recovery network was Pinch Analysis. The performance of the preliminary design was obtained by the targeting procedure for three process stream sets and evaluated against economic criteria. The results of the process integration study are presented in the form of heat exchanger networks that utilize the waste heat and enable considerable energy savings within a short payback period.
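    To make the targeting step concrete, the following is a minimal sketch of the problem-table (temperature-interval) algorithm that underlies Pinch Analysis targeting. The stream data are textbook-style placeholders, not the yeast/alcohol plant streams from the study.

```python
# Minimal problem-table (temperature-interval) targeting sketch.
# Each stream: (supply T [degC], target T [degC], CP [MW/K]); values are illustrative.
hot_streams  = [(250.0, 40.0, 0.15), (200.0, 80.0, 0.25)]
cold_streams = [(20.0, 180.0, 0.20), (140.0, 230.0, 0.30)]
dt_min = 10.0  # minimum approach temperature [K]

# Shift hot streams down and cold streams up by dt_min/2
shifted = [(ts - dt_min / 2, tt - dt_min / 2, cp, "hot") for ts, tt, cp in hot_streams] \
        + [(ts + dt_min / 2, tt + dt_min / 2, cp, "cold") for ts, tt, cp in cold_streams]

# Shifted-temperature interval boundaries, highest first
bounds = sorted({t for s in shifted for t in s[:2]}, reverse=True)

# Net heat surplus/deficit in each interval, cascaded from the top
cascade, heat = [0.0], 0.0
for hi, lo in zip(bounds, bounds[1:]):
    net_cp = sum(cp if kind == "hot" else -cp
                 for ts, tt, cp, kind in shifted
                 if min(ts, tt) <= lo and max(ts, tt) >= hi)
    heat += net_cp * (hi - lo)
    cascade.append(heat)

q_hot_min = max(0.0, -min(cascade))       # minimum hot utility [MW]
q_cold_min = cascade[-1] + q_hot_min      # minimum cold utility [MW]
pinch_T = bounds[cascade.index(min(cascade))]

print(f"Q_hot,min = {q_hot_min:.1f} MW, Q_cold,min = {q_cold_min:.1f} MW, "
      f"pinch at shifted T = {pinch_T:.0f} degC")
```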

  20. Comparative analytics of infusion pump data across multiple hospital systems.

    Science.gov (United States)

    Catlin, Ann Christine; Malloy, William X; Arthur, Karen J; Gaston, Cindy; Young, James; Fernando, Sudheera; Fernando, Ruchith

    2015-02-15

    A Web-based analytics system for conducting inhouse evaluations and cross-facility comparisons of alert data generated by smart infusion pumps is described. The Infusion Pump Informatics (IPI) project, a collaborative effort led by research scientists at Purdue University, was launched in 2009 to provide advanced analytics and tools for workflow analyses to assist hospitals in determining the significance of smart-pump alerts and reducing nuisance alerts. The IPI system allows facility-specific analyses of alert patterns and trends, as well as cross-facility comparisons of alert data uploaded by more than 55 participating institutions using different types of smart pumps. Tools accessible through the IPI portal include (1) charts displaying aggregated or breakout data on the top drugs associated with alerts, numbers of alerts per device or care area, and override-to-alert ratios, (2) investigative reports that can be used to characterize and analyze pump-programming errors in a variety of ways (e.g., by drug, by infusion type, by time of day), and (3) "drill-down" workflow analytics enabling users to evaluate alert patterns—both internally and in relation to patterns at other hospitals—in a quick and efficient stepwise fashion. The formation of the IPI analytics system to support a community of hospitals has been successful in providing sophisticated tools for member facilities to review, investigate, and efficiently analyze smart-pump alert data, not only within a member facility but also across other member facilities, to further enhance smart pump drug library design. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
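    The kind of aggregation described (top drugs by alert volume, override-to-alert ratios) can be sketched in a few lines; the record fields and values below are hypothetical and do not reflect the actual IPI schema.

```python
from collections import defaultdict

# Hypothetical smart-pump alert log entries; field names are illustrative only.
alerts = [
    {"drug": "heparin",  "care_area": "ICU",     "overridden": True},
    {"drug": "heparin",  "care_area": "ICU",     "overridden": True},
    {"drug": "insulin",  "care_area": "MedSurg", "overridden": False},
    {"drug": "morphine", "care_area": "ICU",     "overridden": True},
    {"drug": "heparin",  "care_area": "MedSurg", "overridden": False},
]

counts = defaultdict(lambda: {"alerts": 0, "overrides": 0})
for a in alerts:
    counts[a["drug"]]["alerts"] += 1
    counts[a["drug"]]["overrides"] += int(a["overridden"])

# Top drugs by alert volume with their override-to-alert ratios
for drug, c in sorted(counts.items(), key=lambda kv: kv[1]["alerts"], reverse=True):
    print(f"{drug:10s} alerts={c['alerts']:3d} override/alert={c['overrides'] / c['alerts']:.2f}")
```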

  1. High throughput LC-MS/MS method for the simultaneous analysis of multiple vitamin D analytes in serum.

    Science.gov (United States)

    Jenkinson, Carl; Taylor, Angela E; Hassan-Smith, Zaki K; Adams, John S; Stewart, Paul M; Hewison, Martin; Keevil, Brian G

    2016-03-01

    Recent studies suggest that vitamin D-deficiency is linked to increased risk of common human health problems. To define vitamin D 'status', most routine analytical methods quantify one particular vitamin D metabolite, 25-hydroxyvitamin D3 (25OHD3). However, vitamin D is characterized by complex metabolic pathways, and simultaneous measurement of multiple vitamin D metabolites may provide a more accurate interpretation of vitamin D status. To address this we developed a high-throughput liquid chromatography-tandem mass spectrometry (LC-MS/MS) method to analyse multiple vitamin D analytes, with particular emphasis on the separation of epimer metabolites. A supported liquid extraction (SLE) and LC-MS/MS method was developed to quantify 10 vitamin D metabolites as well as separation of an interfering 7α-hydroxy-4-cholesten-3-one (7αC4) isobar (precursor of bile acid), and validated by analysis of human serum samples. In a cohort of 116 healthy subjects, circulating concentrations of 25-hydroxyvitamin D3 (25OHD3), 3-epi-25-hydroxyvitamin D3 (3-epi-25OHD3), 24,25-dihydroxyvitamin D3 (24R,25(OH)2D3), 1,25-dihydroxyvitamin D3 (1α,25(OH)2D3), and 25-hydroxyvitamin D2 (25OHD2) were quantifiable using 220μL of serum, with 25OHD3 and 24R,25(OH)2D3 showing significant seasonal variations. This high-throughput LC-MS/MS method provides a novel strategy for assessing the impact of vitamin D on human health and disease. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Induction of fungal laccase production under solid state bioprocessing of new agroindustrial waste and its application on dye decolorization.

    Science.gov (United States)

    Akpinar, Merve; Ozturk Urek, Raziye

    2017-06-01

    Lignocellulosic wastes are produced in huge amounts worldwide. Peach waste obtained from the fruit juice industry was utilized as the substrate for laccase production by Pleurotus eryngii under solid state bioprocessing (SSB). Its chemical composition was determined and the bioprocess was carried out under stationary conditions at 28 °C. The effects of different compounds (copper, iron, Tween 80, ammonium nitrate and manganese) and their concentrations on laccase production were investigated in detail. The optimum production of laccase (43,761.33 ± 3845 U L-1) was achieved on day 20 by employing 5.0 g of peach waste with 70 µM Cu2+, 18 µM Fe2+, 0.025% (v/v) Tween 80, 4.0 g L-1 ammonium nitrate and 750 µM Mn2+ as the inducers. Dye decolorization was also investigated to determine the degrading capability of the laccase produced from the peach culture under the above-mentioned conditions. Within the scope of this study, methyl orange, tartrazine, reactive red 2 and reactive black dyes were treated with this enzyme. The highest decolorization, 43 ± 2.8% after 5 min of treatment, was achieved with methyl orange. To date, this is the first report on the induction of laccase production by P. eryngii under SSB using peach waste as the substrate.

  3. On-Line Ion Exchange Liquid Chromatography as a Process Analytical Technology for Monoclonal Antibody Characterization in Continuous Bioprocessing.

    Science.gov (United States)

    Patel, Bhumit A; Pinto, Nuno D S; Gospodarek, Adrian; Kilgore, Bruce; Goswami, Kudrat; Napoli, William N; Desai, Jayesh; Heo, Jun H; Panzera, Dominick; Pollard, David; Richardson, Daisy; Brower, Mark; Richardson, Douglas D

    2017-11-07

    Combining process analytical technology (PAT) with continuous production provides a powerful tool to observe and control monoclonal antibody (mAb) fermentation and purification processes. This work demonstrates on-line liquid chromatography (on-line LC) as a PAT tool for monitoring a continuous biologics process and forced degradation studies. Specifically, this work focused on ion exchange chromatography (IEX), which is a critical separation technique to detect charge variants. Product-related impurities, including charge variants, that impact function are classified as critical quality attributes (CQAs). First, we confirmed no significant differences were observed in the charge heterogeneity profile of a mAb through both at-line and on-line sampling and that the on-line method has the ability to rapidly detect changes in protein quality over time. The robustness and versatility of the PAT methods were tested by sampling from two purification locations in a continuous mAb process. The PAT IEX methods used with on-line LC were a weak cation exchange (WCX) separation and a newly developed shorter strong cation exchange (SCX) assay. Both methods provided similar results with the distribution of percent acidic, main, and basic species remaining unchanged over a 2 week period. Second, a forced degradation study showed an increase in acidic species and a decrease in basic species when sampled on-line over 7 days. These applications further strengthen the use of on-line LC to monitor CQAs of a mAb continuously with various PAT IEX analytical methods. Implementation of on-line IEX will enable faster decision making during process development and could potentially be applied to control in biomanufacturing.
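    Converting integrated IEX peak areas into the percent acidic/main/basic distribution mentioned above is a simple normalization; the sketch below uses made-up peak areas, not data from the study.

```python
def charge_variant_distribution(peak_areas):
    """Convert integrated CEX peak areas (grouped by elution relative to the
    main peak) into percent acidic / main / basic species.

    peak_areas: dict with keys 'acidic', 'main', 'basic' holding lists of
    integrated areas (arbitrary units). All values below are illustrative.
    """
    totals = {k: sum(v) for k, v in peak_areas.items()}
    grand_total = sum(totals.values())
    return {k: 100.0 * v / grand_total for k, v in totals.items()}

# Example: one on-line injection from a continuous process (made-up areas)
print(charge_variant_distribution({
    "acidic": [120.0, 85.0],   # peaks eluting before the main species
    "main":   [1340.0],
    "basic":  [60.0, 40.0],    # peaks eluting after the main species
}))
```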

  4. Analytical Pyrolysis-Chromatography: Something Old, Something New

    Science.gov (United States)

    Bower, Nathan W.; Blanchet, Conor J. K.

    2010-01-01

    Despite a long history of use across multiple disciplines, analytical pyrolysis is rarely taught in undergraduate curricula. We briefly review some interesting applications and discuss the three types of analytical pyrolyzers available commercially. We also describe a low-cost alternative that can be used to teach the basic principles of…

  5. Contributions to Analytic Number Theory | Lucht | Quaestiones ...

    African Journals Online (AJOL)

    This paper reports on some recent contributions to the theory of multiplicative arithmetic semigroups, which have been initiated by John Knopfmacher's work on analytic number theory. They concern weighted inversion theorems of the Wiener type, mean-value theorems for multiplicative functions, and Ramanujan

  6. Bio-processing of solid wastes and secondary resources for metal extraction - A review.

    Science.gov (United States)

    Lee, Jae-Chun; Pandey, Banshi Dhar

    2012-01-01

    Metal-containing wastes/byproducts of various industries, used consumer goods, and municipal waste are potential pollutants, if not treated properly. They may also be important secondary resources if processed in an eco-friendly manner for a secure supply of the contained metals/materials. Bio-extraction of metals from such resources with microbes such as bacteria, fungi and archaea is being increasingly explored to meet the twin objectives of resource recycling and pollution mitigation. This review focuses on the bio-processing of solid wastes/byproducts of metallurgical and manufacturing industries, chemical/petrochemical plants, electroplating and tanning units, besides sewage sludge and fly ash of municipal incinerators, electronic wastes (e-wastes/PCBs), used batteries, etc. An assessment has been made to quantify the wastes generated and their compositions, the microbes used, metal leaching efficiencies, etc. Processing of certain effluents and wastewaters containing metals is also included in brief. Future directions of research are highlighted. Copyright © 2011 Elsevier Ltd. All rights reserved.

  7. In Vitro Study on the Fluid From Banana Stem Bioprocess as Direct Fed Microbial

    Science.gov (United States)

    Mutaqin, B. K.; Tanuwiria, U. H.; Hernawan, E.

    2018-02-01

    The purpose of this research was to study, in vitro, the liquid produced by the bioprocess of banana stem as a direct-fed microbial (DFM) intended to enhance local sheep productivity. The use of the DFM was evaluated in two feeds in vitro, with fermentability and digestibility as the observed responses. The study used a factorial experimental design with two factors: the DFM level (0, 0.2, 0.4 and 0.6%) and the feed type (complete feed or Pennisetum purpureum only), with three replicates per treatment. The results, analyzed by MANOVA with Duncan's test for post hoc comparisons, showed that fermentability and digestibility were influenced by the DFM in the in vitro complete feed. It is concluded that the interaction of the DFM with the complete feed improves fermentability and digestibility values, with 0.6% DFM giving the highest values.

  8. Simultaneous production of lipases and biosurfactants by submerged and solid-state bioprocesses.

    Science.gov (United States)

    Colla, Luciane Maria; Rizzardi, Juliana; Pinto, Marta Heidtmann; Reinehr, Christian Oliveira; Bertolin, Telma Elita; Costa, Jorge Alberto Vieira

    2010-11-01

    Lipases and biosurfactants are compounds produced by microorganisms generally involved in the metabolization of oil substrates. However, the relationship between the production of lipases and biosurfactants has not been established yet. Therefore, this study aimed to evaluate the correlation between production of lipases and biosurfactants by submerged (SmgB) and solid-state bioprocess (SSB) using Aspergillus spp., which were isolated from a soil contaminated by diesel oil. SSB had the highest production of lipases, with lipolytic activities of 25.22U, while SmgB had 4.52U. The production of biosurfactants was not observed in the SSB. In the SmgB, correlation coefficients of 91% and 87% were obtained between lipolytic activity and oil in water and water in oil emulsifying activities, respectively. A correlation of 84% was obtained between lipolytic activity and reduction of surface tension in the culture medium. The surface tension decreased from 50 to 28mNm(-1) indicating that biosurfactants were produced in the culture medium. Copyright 2010 Elsevier Ltd. All rights reserved.

  9. Bioprocess iterative batch-to-batch optimization based on hybrid parametric/nonparametric models.

    Science.gov (United States)

    Teixeira, Ana P; Clemente, João J; Cunha, António E; Carrondo, Manuel J T; Oliveira, Rui

    2006-01-01

    This paper presents a novel method for iterative batch-to-batch dynamic optimization of bioprocesses. The relationship between process performance and control inputs is established by means of hybrid grey-box models combining parametric and nonparametric structures. The bioreactor dynamics are defined by material balance equations, whereas the cell population subsystem is represented by an adjustable mixture of nonparametric and parametric models. Thus optimizations are possible without detailed mechanistic knowledge concerning the biological system. A clustering technique is used to supervise the reliability of the nonparametric subsystem during the optimization. Whenever the nonparametric outputs are unreliable, the objective function is penalized. The technique was evaluated with three simulation case studies. The overall results suggest that the convergence to the optimal process performance may be achieved after a small number of batches. The model unreliability risk constraint along with sampling scheduling are crucial to minimize the experimental effort required to attain a given process performance. In general terms, it may be concluded that the proposed method broadens the application of the hybrid parametric/nonparametric modeling technique to "newer" processes with higher potential for optimization.
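    A minimal sketch of the hybrid (grey-box) structure described, assuming a fed-batch reactor: the material balances form the parametric part, while the specific growth rate is supplied by a nonparametric black-box term (here a small radial-basis expansion with made-up centers and weights). The batch-to-batch optimization loop and the clustering-based reliability penalty are omitted.

```python
import numpy as np

def mu_nonparametric(S, centers=(1.0, 5.0, 15.0), weights=(0.10, 0.35, 0.25), width=4.0):
    """Black-box specific growth rate [1/h] as a radial-basis expansion in the
    substrate concentration S. Centers/weights/width are illustrative placeholders
    that would normally be fitted to batch data."""
    return sum(w * np.exp(-((S - c) / width) ** 2) for c, w in zip(centers, weights))

def simulate_fed_batch(F_profile, t_end=20.0, dt=0.05, Yxs=0.5, Sf=100.0,
                       X0=0.5, S0=10.0, V0=1.0):
    """Parametric part: material balances for biomass X, substrate S and volume V
    in a fed-batch reactor, integrated with explicit Euler."""
    X, S, V = X0, S0, V0
    for t in np.arange(0.0, t_end, dt):
        F = F_profile(t)                 # feed rate [L/h], the control input
        mu = mu_nonparametric(S)         # nonparametric rate term
        dX = mu * X - (F / V) * X
        dS = -(mu / Yxs) * X + (F / V) * (Sf - S)
        dV = F
        X, S, V = X + dt * dX, max(S + dt * dS, 0.0), V + dt * dV
    return X * V   # final biomass amount, a simple performance measure

# Compare two candidate feed profiles (a batch-to-batch optimizer would propose
# these iteratively from the updated hybrid model)
constant_feed = lambda t: 0.05
ramp_feed = lambda t: 0.02 + 0.004 * t
print(simulate_fed_batch(constant_feed), simulate_fed_batch(ramp_feed))
```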

  10. Simplex-based optimization of numerical and categorical inputs in early bioprocess development: Case studies in HT chromatography.

    Science.gov (United States)

    Konstantinidis, Spyridon; Titchener-Hooker, Nigel; Velayudhan, Ajoy

    2017-08-01

    Bioprocess development studies often involve the investigation of numerical and categorical inputs via the adoption of Design of Experiments (DoE) techniques. An attractive alternative is the deployment of a grid compatible Simplex variant which has been shown to yield optima rapidly and consistently. In this work, the method is combined with dummy variables and it is deployed in three case studies wherein spaces are comprised of both categorical and numerical inputs, a situation intractable by traditional Simplex methods. The first study employs in silico data and lays out the dummy variable methodology. The latter two employ experimental data from chromatography based studies performed with the filter-plate and miniature column High Throughput (HT) techniques. The solute of interest in the former case study was a monoclonal antibody whereas the latter dealt with the separation of a binary system of model proteins. The implemented approach prevented the stranding of the Simplex method at local optima, due to the arbitrary handling of the categorical inputs, and allowed for the concurrent optimization of numerical and categorical, multilevel and/or dichotomous, inputs. The deployment of the Simplex method, combined with dummy variables, was therefore entirely successful in identifying and characterizing global optima in all three case studies. The Simplex-based method was further shown to be of equivalent efficiency to a DoE-based approach, represented here by D-Optimal designs. Such an approach failed, however, to both capture trends and identify optima, and led to poor operating conditions. It is suggested that the Simplex-variant is suited to development activities involving numerical and categorical inputs in early bioprocess development. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Semi-analytic techniques for calculating bubble wall profiles

    International Nuclear Information System (INIS)

    Akula, Sujeet; Balazs, Csaba; White, Graham A.

    2016-01-01

    We present semi-analytic techniques for finding bubble wall profiles during first order phase transitions with multiple scalar fields. Our method involves reducing the problem to an equation with a single field, finding an approximate analytic solution and perturbing around it. The perturbations can be written in a semi-analytic form. We assert that our technique lacks convergence problems and demonstrate the speed of convergence on an example potential. (orig.)
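    For a single scalar field with a quartic double-well potential, the planar wall equation φ'' = dV/dφ has the familiar tanh kink, which is the type of approximate analytic solution one can perturb around. The sketch below only verifies that profile numerically; it is not the authors' multi-field method.

```python
import numpy as np

# Quartic double-well potential V = (lam/4) * (phi**2 - v**2)**2 (degenerate minima)
lam, v = 0.5, 1.0
delta = np.sqrt(2.0 / lam) / v          # analytic wall thickness

z = np.linspace(-10 * delta, 10 * delta, 2001)
phi = v * np.tanh(z / delta)            # analytic kink profile phi(z)

# Check that phi'' - dV/dphi vanishes (the planar wall equation of motion)
dz = z[1] - z[0]
phi_zz = np.gradient(np.gradient(phi, dz), dz)
dV_dphi = lam * phi * (phi**2 - v**2)
print("max residual:", np.max(np.abs(phi_zz - dV_dphi)))  # small, limited by finite differences
```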

  12. The Draft Genome Sequence of Clostridium sp. Strain NJ4, a Bacterium Capable of Producing Butanol from Inulin Through Consolidated Bioprocessing.

    Science.gov (United States)

    Jiang, Yujia; Lu, Jiasheng; Chen, Tianpeng; Yan, Wei; Dong, Weiliang; Zhou, Jie; Zhang, Wenming; Ma, Jiangfeng; Jiang, Min; Xin, Fengxue

    2018-05-23

    A novel butanogenic Clostridium sp. NJ4 was successfully isolated and characterized, which could directly produce relatively high titer of butanol from inulin through consolidated bioprocessing (CBP). The assembled draft genome of strain NJ4 is 4.09 Mp, containing 3891 encoded protein sequences with G+C content of 30.73%. Among these annotated genes, a levanase, a hypothetical inulinase, and two bifunctional alcohol/aldehyde dehydrogenases (AdhE) were found to play key roles in the achievement of ABE production from inulin through CBP.

  13. A fast and systematic procedure to develop dynamic models of bioprocesses: application to microalgae cultures

    Directory of Open Access Journals (Sweden)

    J. Mailier

    2010-09-01

    Full Text Available The purpose of this paper is to report on the development of a procedure for inferring black-box, yet biologically interpretable, dynamic models of bioprocesses based on sets of measurements of a few external components (biomass, substrates, and products of interest). The procedure has three main steps: (a) the determination of the number of macroscopic biological reactions linking the measured components; (b) the estimation of a first reaction scheme, which has interesting mathematical properties, but might lack a biological interpretation; and (c) the "projection" (or transformation) of this reaction scheme onto a biologically-consistent scheme. The advantage of the method is that it allows the fast prototyping of models for the culture of microorganisms that are not well documented. The good performance of the third step of the method is demonstrated by application to an example of microalgal culture.
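    Step (a) is commonly carried out by inspecting the numerical rank (singular values) of the matrix of measured concentration increments; the hedged sketch below uses data simulated from two hypothetical macroscopic reactions, not the microalgae data of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate measurements of 4 external components over time, generated by 2
# underlying macroscopic reactions with a hypothetical stoichiometric matrix K
K = np.array([[ 1.0,  0.5],     # biomass
              [-2.0, -1.0],     # substrate
              [ 0.8,  0.0],     # product 1
              [ 0.0,  1.2]])    # product 2
extents = np.cumsum(rng.random((50, 2)), axis=0)          # reaction advancement over time
C = extents @ K.T + 0.01 * rng.standard_normal((50, 4))   # noisy concentration data

# Singular values of the increment matrix: the number of "large" ones estimates
# the number of independent macroscopic reactions
dC = np.diff(C, axis=0)
s = np.linalg.svd(dC, compute_uv=False)
print("singular values:", np.round(s, 3))
print("estimated number of reactions:", int(np.sum(s > 0.05 * s[0])))
```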

  14. Contributions of depth filter components to protein adsorption in bioprocessing.

    Science.gov (United States)

    Khanal, Ohnmar; Singh, Nripen; Traylor, Steven J; Xu, Xuankuo; Ghose, Sanchayita; Li, Zheng J; Lenhoff, Abraham M

    2018-04-16

    Depth filtration is widely used in downstream bioprocessing to remove particulate contaminants via depth straining and is therefore applied to harvest clarification and other processing steps. However, depth filtration also removes proteins via adsorption, which can contribute variously to impurity clearance and to reduction in product yield. The adsorption may occur on the different components of the depth filter, that is, filter aid, binder, and cellulose filter. We measured adsorption of several model proteins and therapeutic proteins onto filter aids, cellulose, and commercial depth filters at pH 5-8 and a range of ionic strengths, and examined the role of each filter component in the adsorption of proteins with different net charges using confocal microscopy. Our findings show that a complete depth filter's maximum adsorptive capacity for proteins can be estimated by its protein monolayer coverage values, which are of order mg/m2, depending on the protein size. Furthermore, the extent of adsorption of different proteins appears to depend on the nature of the resin binder and its extent of coating over the depth filter surface, particularly in masking the cation-exchanger-like capacity of the siliceous filter aids. In addition to guiding improved depth filter selection, the findings can be leveraged in inspiring a more intentional selection of components and design of depth filter construction for particular impurity removal targets. © 2018 Wiley Periodicals, Inc.
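    The "order mg/m2" statement can be checked with a back-of-envelope estimate that treats a roughly globular protein as a close-packed disc of its effective radius; the molecular weights and radii below are illustrative assumptions, not measurements from the study.

```python
import math

N_A = 6.022e23  # Avogadro's number [1/mol]

def monolayer_coverage(mw_g_per_mol, radius_nm, packing=0.9069):
    """Rough hexagonally packed monolayer capacity in mg/m^2 for a globular
    protein modelled as a disc of the given radius (illustrative estimate only)."""
    area_per_molecule = math.pi * (radius_nm * 1e-9) ** 2      # m^2
    mass_per_molecule = mw_g_per_mol / N_A * 1e3               # mg
    return packing * mass_per_molecule / area_per_molecule

# e.g. an IgG-sized protein (~150 kDa, ~5 nm effective radius) vs lysozyme (~14 kDa, ~1.9 nm)
print(f"IgG-like:  {monolayer_coverage(150e3, 5.0):.1f} mg/m^2")
print(f"lysozyme:  {monolayer_coverage(14.3e3, 1.9):.1f} mg/m^2")
```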

  15. Meta-analytic framework for sparse K-means to identify disease subtypes in multiple transcriptomic studies.

    Science.gov (United States)

    Huo, Zhiguang; Ding, Ying; Liu, Silvia; Oesterreich, Steffi; Tseng, George

    Disease phenotyping by omics data has become a popular approach that potentially can lead to better personalized treatment. Identifying disease subtypes via unsupervised machine learning is the first step towards this goal. In this paper, we extend a sparse K-means method towards a meta-analytic framework to identify novel disease subtypes when expression profiles of multiple cohorts are available. The lasso regularization and meta-analysis identify a unique set of gene features for subtype characterization. An additional pattern matching reward function guarantees consistent subtype signatures across studies. The method was evaluated using simulations as well as leukemia and breast cancer data sets. The disease subtypes identified by meta-analysis were characterized with improved accuracy and stability compared to single study analysis. The breast cancer model was applied to an independent METABRIC dataset and generated an improved survival difference between subtypes. These results provide a basis for diagnosis and development of targeted treatments for disease subgroups.
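    The distinguishing step of sparse K-means is the re-weighting of features by their between-cluster sum of squares under an L1 constraint (soft-thresholding). The sketch below shows only that single-study step, given cluster labels; it is not the meta-analytic framework of the paper, and the toy data are made up.

```python
import numpy as np

def feature_weights(X, labels, l1_bound):
    """Sparse K-means style feature weights: soft-threshold the per-feature
    between-cluster sum of squares (BCSS), then scale to unit L2 norm so that
    the L1 norm does not exceed l1_bound (l1_bound >= 1). Simplified sketch."""
    overall_mean = X.mean(axis=0)
    bcss = np.zeros(X.shape[1])
    for k in np.unique(labels):
        Xk = X[labels == k]
        bcss += len(Xk) * (Xk.mean(axis=0) - overall_mean) ** 2

    def weights_for(delta):
        w = np.maximum(bcss - delta, 0.0)
        norm = np.linalg.norm(w)
        return w / norm if norm > 0 else w

    # Binary search for the soft-threshold delta that meets the L1 constraint
    lo, hi = 0.0, bcss.max()
    for _ in range(60):
        mid = (lo + hi) / 2
        if np.abs(weights_for(mid)).sum() > l1_bound:
            lo = mid
        else:
            hi = mid
    return weights_for(hi)

# Toy data: only the first 2 of 10 features separate the 2 clusters
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 10))
labels = np.repeat([0, 1], 50)
X[labels == 1, :2] += 3.0
print(np.round(feature_weights(X, labels, l1_bound=1.5), 3))
```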

  16. NC CATCH: Advancing Public Health Analytics.

    Science.gov (United States)

    Studnicki, James; Fisher, John W; Eichelberger, Christopher; Bridger, Colleen; Angelon-Gaetz, Kim; Nelson, Debi

    2010-01-01

    The North Carolina Comprehensive Assessment for Tracking Community Health (NC CATCH) is a Web-based analytical system deployed to local public health units and their community partners. The system has the following characteristics: flexible, powerful online analytic processing (OLAP) interface; multiple sources of multidimensional, event-level data fully conformed to common definitions in a data warehouse structure; enabled utilization of available decision support software tools; analytic capabilities distributed and optimized locally with centralized technical infrastructure; two levels of access differentiated by the user (anonymous versus registered) and by the analytical flexibility (Community Profile versus Design Phase); and, an emphasis on user training and feedback. The ability of local public health units to engage in outcomes-based performance measurement will be influenced by continuing access to event-level data, developments in evidence-based practice for improving population health, and the application of information technology-based analytic tools and methods.

  17. Downstream bioprocess characterisation within microfluidic devices

    DEFF Research Database (Denmark)

    Marques, Marco; Krühne, Ulrich; Szita, Nicolas

    2016-01-01

    developed which has, to some extent, hindered their implementation as early process development tools. Microfluidic devices are particularly attractive for using fewer resources, for having the possibility of parallelisation and for requiring fewer mechanical manipulations. The expectation...... is that these devices will facilitate the rapid definition of critical process parameters, and thus ultimately reduce production costs. We have developed several microfluidic mDUOs and combined them with advanced and novel analytical approaches, resulting in devices that can potentially be employed for both analytical...... for the liquid–liquid extraction of pharmaceuticals, for the purification and concentration of drug delivery vehicles, and for the flocculation of yeast cells in microfluidic devices. For the latter, we will present for the first time the capability to study flocculation-growth independent from the floc breakage...

  18. Computing the zeros of analytic functions

    CERN Document Server

    Kravanja, Peter

    2000-01-01

    Computing all the zeros of an analytic function and their respective multiplicities, locating clusters of zeros of analytic functions, computing zeros and poles of meromorphic functions, and solving systems of analytic equations are problems in computational complex analysis that lead to a rich blend of mathematics and numerical analysis. This book treats these four problems in a unified way. It contains not only theoretical results (based on formal orthogonal polynomials or rational interpolation) but also numerical analysis and algorithmic aspects, implementation heuristics, and polished software (the package ZEAL) that is available via the CPC Program Library. Graduate students and researchers in numerical mathematics will find this book very readable.

  19. Treatment of supermarket vegetable wastes to be used as alternative substrates in bioprocesses.

    Science.gov (United States)

    Díaz, Ana Isabel; Laca, Amanda; Laca, Adriana; Díaz, Mario

    2017-09-01

    Fruits and vegetables have the highest wastage rates at retail and consumer levels. These wastes have promising potential for being used as substrates in bioprocesses. However, an effective hydrolysis of the carbohydrates that form these residues has to be developed before the biotransformation. In this work, vegetable wastes from a supermarket (tomatoes, green peppers and potatoes) have been separately treated by acid, thermal and enzymatic hydrolysis processes in order to maximise the concentration of fermentable sugars in the final broth. For all substrates, thermal and enzymatic processes have been shown to be the most effective. A new combined hydrolysis procedure including both of these treatments was also assayed and the enzymatic step was successfully modelled. With this combined hydrolysis, the percentage of reducing sugars extracted was increased, in comparison with the amount extracted from non-hydrolysed samples, by approximately 30% in the case of tomato and green pepper wastes. For potato wastes this percentage increased from values lower than 1% to 77%. In addition, very low values of fermentation inhibitors were found in the final broth. Copyright © 2017. Published by Elsevier Ltd.

  20. Biogas production from organic wastes in suspended cell cultures and in biofilms

    International Nuclear Information System (INIS)

    Simenonov, I.; Chorukova, E.; Mamatarkova, V.; Nikolov, L.

    2010-01-01

    The results of a comparative study of two biogas production bioprocess systems are presented. The systems submitted to comparison are based on suspended cell cultures and on a biofilm formed on a solid inert support. A comprehensive research concept is formulated and discussed. It includes the main considerations regarding the choice of substrate, the bioagent as a mixed microbial community, the type of bioreactors, regimes of functioning, analytical determinations and the method of comparison. The main requirements for efficient experimental activity in comparative investigations are formulated. Meeting them ensures the correctness of the experimental design and data acquisition. On this basis the key parameter of comparison of the two systems is defined as the specific productivity of the bioprocess systems. Under these conditions, a series of preliminary experiments was carried out to test the readiness of the experimental set-ups for long-term stable functioning and the ability of the monitoring devices to maintain the bioprocess parameters within the specified intervals. These tests ensure continuous, uninterrupted experimentation with the investigated bioprocess systems. The results obtained show that biofilm bioprocess systems possess up to two and a half times higher specific productivity in comparison with the bioprocess systems with suspended cells. Some visions about the future developments of comparative research on the influence of additional parameters, such as the mixer rotation speed, organic loads, and higher values of dilution rates, are outlined.

  1. Elm Tree (Ulmus parvifolia) Bark Bioprocessed with Mycelia of Shiitake (Lentinus edodes) Mushrooms in Liquid Culture: Composition and Mechanism of Protection against Allergic Asthma in Mice.

    Science.gov (United States)

    Kim, Sung Phil; Lee, Sang Jong; Nam, Seok Hyun; Friedman, Mendel

    2016-02-03

    Mushrooms can break down complex plant materials into smaller, more digestible and bioactive compounds. The present study investigated the antiasthma effect of an Ulmus parvifolia bark extract bioprocessed in Lentinus edodes liquid mycelium culture (BPUBE) against allergic asthma in chicken egg ovalbumin (OVA)-sensitized/challenged mice. BPUBE suppressed total IgE release from U266B1 cells in a dose-dependent manner without cytotoxicity. Inhibitory activity of BPUBE against OVA-specific IgE secretion in bronchoalveolar lavage fluid (BALF) was observed in OVA-sensitized/challenged asthmatic mice. BPUBE also inhibited OVA-specific IgG and IgG1 secretion into serum from the allergic mice, suggesting the restoration of a Th2-biased immune reaction to a Th1/Th2-balanced status, as indicated by the Th1/Th2 as well as regulatory T cell (Treg) cytokine profile changes caused by BPUBE in serum or BALF. Inflammatory cell counts in BALF and lung histology showed that leukocytosis and eosinophilia induced by OVA-sensitization/challenge were inhibited by the oral administration of BPUBE. Amelioration of eosinophil infiltration near the trachea was associated with reduced eotaxin and vascular cell adhesion molecule-1 (VCAM-1) levels. Changes in proinflammatory mediator levels in BALF suggest that BPUBE decreased OVA-sensitization-induced elevation of leukotriene C4 (LTC4) and prostaglandin D2 (PGD2). The finding that asthma-associated biomarker levels of OVA-sensitized/challenged mice were much more inhibited with BPUBE treatment than NPUBE (not-bioprocessed Ulmus parvifolia extract) treatment suggested the production of new bioactive compounds by the mushroom mycelia that may be involved in enhancing the observed antiasthmatic properties. The possible relation of the composition determined by proximate analysis and GC/MS to observed bioactivity is discussed. The results suggest that the elm tree (Ulmus parvifolia) bark bioprocessed with mycelia of shiitake (Lentinus edodes

  2. Effect of exercise interventions on perceived fatigue in people with multiple sclerosis: synthesis of meta-analytic reviews.

    Science.gov (United States)

    Safari, Reza; Van der Linden, Marietta L; Mercer, Tom H

    2017-06-01

    Although exercise training has been advocated as a nonpharmacological treatment for multiple sclerosis (MS) related fatigue, no consensus exists regarding its effectiveness. To address this, we collated meta-analytic reviews that explored the effectiveness of exercise training for the treatment of MS-related fatigue. We searched five online databases for relevant reviews, published since 2005, and identified 172 records. Five reviews were retained for systematic extraction of information and evidence quality analysis. Although our review synthesis indicated that exercise training interventions have a moderate effect on fatigue reduction in people with MS, no clear insight was obtained regarding the relative effectiveness of specific types or modes of exercise intervention. Moreover, Grading of Recommendation Assessment, Development and Evaluation revealed that the overall quality of evidence emanating from these five reviews was 'very low'.

  3. Writing analytic element programs in Python.

    Science.gov (United States)

    Bakker, Mark; Kelson, Victor A

    2009-01-01

    The analytic element method is a mesh-free approach for modeling ground water flow at both the local and the regional scale. With the advent of the Python object-oriented programming language, it has become relatively easy to write analytic element programs. In this article, an introduction is given of the basic principles of the analytic element method and of the Python programming language. A simple, yet flexible, object-oriented design is presented for analytic element codes using multiple inheritance. New types of analytic elements may be added without the need for any changes in the existing part of the code. The presented code may be used to model flow to wells (with either a specified discharge or drawdown) and streams (with a specified head). The code may be extended by any hydrogeologist with a healthy appetite for writing computer code to solve more complicated ground water flow problems. Copyright © 2009 The Author(s). Journal Compilation © 2009 National Ground Water Association.
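    A toy example of the object-oriented superposition idea the article describes: each element class contributes a discharge potential, and the model sums the contributions. The sketch below (uniform flow plus one extraction well in a confined aquifer, heads relative to an arbitrary reference) is illustrative and is not the article's code, which also handles specified-drawdown wells and stream elements.

```python
import numpy as np

class Element:
    """Base class: every analytic element contributes a discharge potential."""
    def potential(self, x, y):
        raise NotImplementedError

class UniformFlow(Element):
    def __init__(self, qx):
        self.qx = qx                      # uniform discharge in x-direction [m^2/d]
    def potential(self, x, y):
        return -self.qx * x

class Well(Element):
    def __init__(self, xw, yw, Q):
        self.xw, self.yw, self.Q = xw, yw, Q   # location [m] and discharge [m^3/d]
    def potential(self, x, y):
        r = np.hypot(x - self.xw, y - self.yw)
        return self.Q / (2 * np.pi) * np.log(r)

class Model:
    """Superposition of elements; head from potential for a confined aquifer,
    relative to an arbitrary reference level."""
    def __init__(self, T, elements):
        self.T = T                        # transmissivity [m^2/d]
        self.elements = elements
    def head(self, x, y):
        phi = sum(e.potential(x, y) for e in self.elements)
        return phi / self.T

model = Model(T=100.0, elements=[UniformFlow(qx=0.5), Well(xw=0.0, yw=0.0, Q=300.0)])
print(model.head(np.array([-200.0, -50.0, 50.0, 200.0]), 0.0))
```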

  4. Learning analytics approach of EMMA project

    NARCIS (Netherlands)

    Tammets, Kairit; Brouns, Francis

    2014-01-01

    The EMMA project provides a MOOC platform to aggregate and delivers massive open online courses (MOOC) in multiple languages from a variety of European universities. Learning analytics play an important role in MOOCs to support the individual needs of the learner.

  5. Bioprocessing of low-level radioactive and mixed hazard wastes

    International Nuclear Information System (INIS)

    Stoner, D.L.

    1990-01-01

    Biologically-based treatment technologies are currently being developed at the Idaho National Engineering Laboratory (INEL) to aid in volume reduction and/or reclassification of low-level radioactive and mixed hazardous wastes prior to processing for disposal. The approaches taken to treat low-level radioactive and mixed wastes will reflect the physical (e.g., liquid, solid, slurry) and chemical (inorganic and/or organic) nature of the waste material being processed. Bioprocessing utilizes the diverse metabolic and biochemical characteristics of microorganisms. The application of bioadsorption and bioflocculation to reduce the volume of low-level radioactive waste are strategies comparable to the use of ion-exchange resins and coagulants that are currently used in waste reduction processes. Mixed hazardous waste would require organic as well as radionuclide treatment processes. Biodegradation of organic wastes or bioemulsification could be used in conjunction with radioisotope bioadsorption methods to treat mixed hazardous radioactive wastes. The degradation of the organic constituents of mixed wastes can be considered an alternative to incineration, while the use of bioemulsification may simply be used as a means to separate inorganic and organics to enable reclassification of wastes. The proposed technology base for the biological treatment of low-level radioactive and mixed hazardous waste has been established. Biodegradation of a variety of organic compounds that are typically found in mixed hazardous wastes has been demonstrated, degradative pathways determined and the nutritional requirements of the microorganisms are understood. Accumulation, adsorption and concentration of heavy and transition metal species and transuranics by microorganisms is widely recognized. Work at the INEL focuses on the application of demonstrated microbial transformations to process development

  6. A two-stage bioprocess for hydrogen and methane production from rice straw bioethanol residues.

    Science.gov (United States)

    Cheng, Hai-Hsuan; Whang, Liang-Ming; Wu, Chao-Wei; Chung, Man-Chien

    2012-06-01

    This study evaluates a two-stage bioprocess for recovering hydrogen and methane while treating organic residues of fermentative bioethanol from rice straw. The obtained results indicate that controlling a proper volumetric loading rate, substrate-to-biomass ratio, or F/M ratio is important to maximizing biohydrogen production from rice straw bioethanol residues. Clostridium tyrobutyricum, the identified major hydrogen-producing bacteria enriched in the hydrogen bioreactor, is likely utilizing lactate and acetate for biohydrogen production. The occurrence of acetogenesis during biohydrogen fermentation may reduce the B/A ratio and lead to a lower hydrogen production. Organic residues remained in the effluent of hydrogen bioreactor can be effectively converted to methane with a rate of 2.8 mmol CH(4)/gVSS/h at VLR of 4.6 kg COD/m(3)/d. Finally, approximately 75% of COD in rice straw bioethanol residues can be removed and among that 1.3% and 66.1% of COD can be recovered in the forms of hydrogen and methane, respectively. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Analytical Solutions of Ionic Diffusion and Heat Conduction in Multilayered Porous Media

    Directory of Open Access Journals (Sweden)

    Yu Bai

    2015-01-01

    Full Text Available Ionic diffusion and heat conduction in a multiple layered porous medium have many important engineering applications. One example is chloride ions from deicers penetrating into concrete structures such as bridge decks. Different overlays can be placed on top of the concrete surface to slow down the chloride penetration. In this paper, the chloride ion diffusion equations were established for concrete structures with multiple layers of protective system. By using Laplace transformation, an analytical solution was developed first for chloride concentration profiles in a two-layered system and then extended to multiple layered systems with nonconstant boundary conditions, including the constant boundary and linear boundary conditions. Because ionic diffusion in saturated media and heat conduction are governed by the same form of partial differential equations with different material parameters, the analytical solution was further extended to handle heat conduction in a multiple layered system under nonconstant boundary conditions. The numerical results were compared with available test data. The basic trends of the analytical solution and the test data agreed quite well.
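    Written in standard notation, the layered problem has the generic form below (Fick's second law in each layer, with concentration and flux continuity at the interfaces); this is an illustrative statement of the governing equations rather than a reproduction of the paper's derivation.

```latex
\begin{align*}
  &\frac{\partial C_i}{\partial t} = D_i\,\frac{\partial^2 C_i}{\partial x^2},
    && x_{i-1} < x < x_i,\quad i = 1,\dots,n, \\[4pt]
  &C_i(x_i,t) = C_{i+1}(x_i,t),
    && \text{(concentration continuity at each interface)} \\[2pt]
  &D_i\,\frac{\partial C_i}{\partial x}\bigg|_{x=x_i}
     = D_{i+1}\,\frac{\partial C_{i+1}}{\partial x}\bigg|_{x=x_i},
    && \text{(flux continuity at each interface)} \\[2pt]
  &C_1(0,t) = C_s(t), \qquad C_i(x,0) = 0,
    && \text{(surface boundary condition, constant or linear in } t\text{, and initial condition)}
\end{align*}
```

    The heat-conduction analogue replaces the concentrations C_i by temperature and the diffusivities D_i by thermal diffusivities, which is why the same Laplace-transform solution applies to both problems.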

  8. The Relational Impact of Multiple Sclerosis: An Integrative Review of the Literature Using a Cognitive Analytic Framework.

    Science.gov (United States)

    Blundell Jones, Joanna; Walsh, Sue; Isaac, Claire

    2017-12-01

    This integrative literature review uses cognitive analytic therapy (CAT) theory to examine the impact of a chronic illness, multiple sclerosis (MS), on relationships and mental health. Electronic searches were conducted in six medical and social science databases. Thirty-eight articles met inclusion criteria, and also satisfied quality criteria. Articles revealed that MS-related demands change care needs and alter relationships. Using a CAT framework, the MS literature was analysed, and five key patterns of relating to oneself and to others were identified. A diagrammatic formulation is proposed that interconnects these patterns with wellbeing and suggests potential "exits" to improve mental health, for example, assisting families to minimise overprotection. Application of CAT analysis to the literature clarifies relational processes that may affect mental health among individuals with MS, which hopefully will inform how services assist in reducing unhelpful patterns and improve coping. Further investigation of the identified patterns is needed.

  9. Advanced Technology Section semiannual progress report, April 1-September 30, 1977. Volume 1. Biotechnology and environmental programs. [Lead Abstract

    Energy Technology Data Exchange (ETDEWEB)

    Pitt, W.W. Jr.; Mrochek, J.E. (comps.)

    1980-06-01

    Research efforts in six areas are reported. They include: centrifugal analyzer development; advanced analytical systems; environmental research; bioengineering research; bioprocess development and demonstration; and environmental control technology. Individual abstracts were prepared for each section for ERA/EDB. (JCB)

  10. Analytical Evaluation of the Performance of Proportional Fair Scheduling in OFDMA-Based Wireless Systems

    Directory of Open Access Journals (Sweden)

    Mohamed H. Ahmed

    2012-01-01

    Full Text Available This paper provides an analytical evaluation of the performance of proportional fair (PF scheduling in Orthogonal Frequency-Division Multiple Access (OFDMA wireless systems. OFDMA represents a promising multiple access scheme for transmission over wireless channels, as it combines the orthogonal frequency division multiplexing (OFDM modulation and subcarrier allocation. On the other hand, the PF scheduling is an efficient resource allocation scheme with good fairness characteristics. Consequently, OFDMA with PF scheduling represents an attractive solution to deliver high data rate services to multiple users simultaneously with a high degree of fairness. We investigate a two-dimensional (time slot and frequency subcarrier PF scheduling algorithm for OFDMA systems and evaluate its performance analytically and by simulations. We derive approximate closed-form expressions for the average throughput, throughput fairness index, and packet delay. Computer simulations are used for verification. The analytical results agree well with the results from simulations, which show the good accuracy of the analytical expressions.
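    A toy simulation of per-subcarrier PF scheduling (serve, on each subcarrier, the user maximizing the ratio of instantaneous rate to exponentially averaged throughput) illustrates the throughput and fairness quantities analyzed; the fading model and parameters below are illustrative assumptions, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n_users, n_subcarriers, n_slots, t_c = 8, 16, 2000, 100.0

avg_rate = np.full(n_users, 1e-6)            # R_k: exponentially averaged throughput
served = np.zeros(n_users)

for _ in range(n_slots):
    # Rayleigh-fading instantaneous rates per user and subcarrier (illustrative model)
    snr = rng.exponential(scale=1.0, size=(n_users, n_subcarriers))
    inst_rate = np.log2(1.0 + 10.0 * snr)

    # PF rule per subcarrier: serve the user maximizing r_k / R_k
    slot_alloc = np.zeros(n_users)
    winners = np.argmax(inst_rate / avg_rate[:, None], axis=0)
    for sc, k in enumerate(winners):
        slot_alloc[k] += inst_rate[k, sc]

    served += slot_alloc
    # Exponential moving-average update of the throughput with time constant t_c
    avg_rate = (1.0 - 1.0 / t_c) * avg_rate + (1.0 / t_c) * slot_alloc

throughput = served / n_slots
jain = throughput.sum() ** 2 / (n_users * (throughput ** 2).sum())
print("per-user throughput:", np.round(throughput, 2))
print("Jain fairness index:", round(jain, 3))
```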

  11. Bitwise dimensional co-clustering for analytical workloads

    NARCIS (Netherlands)

    S. Baumann (Stephan); P.A. Boncz (Peter); K.-U. Sattler

    2016-01-01

    Analytical workloads in data warehouses often include heavy joins where queries involve multiple fact tables in addition to the typical star-patterns, dimensional grouping and selections. In this paper we propose a new processing and storage framework called Bitwise Dimensional

  12. 1991 Second international symposium on the biological processing of coal: Proceedings

    International Nuclear Information System (INIS)

    1991-09-01

    This symposium was held to aid in the advancement of science and technology in the area of coal bioprocessing by facilitating the exchange of technical information and offering a forum for open discussion and review. The symposium was complemented by four workshops which introduced the attendees to the fundamentals of genetics, mass & energy balances, process & economic analysis, and advanced analytical techniques as they pertain to bioprocessing of coal. Eleven countries were represented, as were numerous universities, national laboratories, federal agencies and corporations. Topics discussed include desulfurization, coal dissolution, gene cloning, and enzyme activity. Individual projects are processed separately on the databases

  13. Development of microorganisms for cellulose-biofuel consolidated bioprocessings: metabolic engineers’ tricks

    Directory of Open Access Journals (Sweden)

    Roberto Mazzoli

    2012-10-01

    Full Text Available Cellulose waste biomass is the most abundant and attractive substrate for "biorefinery strategies" that are aimed to produce high-value products (e.g. solvents, fuels, building blocks) by economically and environmentally sustainable fermentation processes. However, cellulose is highly recalcitrant to biodegradation and its conversion by biotechnological strategies currently requires economically inefficient multistep industrial processes. The need for dedicated cellulase production continues to be a major constraint to cost-effective processing of cellulosic biomass. Research efforts have been aimed at developing recombinant microorganisms with suitable characteristics for single step biomass fermentation (consolidated bioprocessing, CBP). Two paradigms have been applied for such, so far unsuccessful, attempts: (a) "native cellulolytic strategies", aimed at conferring high-value product properties to natural cellulolytic microorganisms; (b) "recombinant cellulolytic strategies", aimed at conferring cellulolytic ability to microorganisms exhibiting high product yields and titers. By starting from the description of natural enzyme systems for plant biomass degradation and the natural metabolic pathways for the biosynthesis of some of the most valuable products (i.e. butanol, ethanol, and hydrogen), this review describes state-of-the-art bottlenecks and solutions for the development of recombinant microbial strains for cellulosic biofuel CBP by metabolic engineering. Complexed cellulases (i.e. cellulosomes) benefit from stronger proximity effects and show enhanced synergy on insoluble substrates (i.e. crystalline cellulose) with respect to free enzymes. For this reason, special attention is given to strategies involving cellulosome/designer cellulosome-bearing recombinant microorganisms.

  14. Analytical Computation of Information Rate for MIMO Channels

    Directory of Open Access Journals (Sweden)

    Jinbao Zhang

    2017-01-01

    Full Text Available The information rate of discrete signaling constellations is an important performance measure. However, the computational complexity makes the information rate rather difficult to analyze for arbitrary fading multiple-input multiple-output (MIMO) channels. An analytical method is proposed to compute the information rate, which is characterized by considerable accuracy, reasonable complexity, and concise representation. These features improve the accuracy of performance analysis based on the information-rate criterion.
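    For reference, the quantity being approximated can be written down exactly for an equiprobable constellation of M transmit vectors over y = Hx + n with circularly symmetric complex Gaussian noise of variance N0 per dimension; the standard expression below is stated here as background (not taken from the paper), and the expectation over the noise and the fading is what makes direct computation costly:

```latex
I(\mathbf{x};\mathbf{y}\mid \mathbf{H})
  = \log_2 M \;-\; \frac{1}{M}\sum_{m=1}^{M}
    \mathbb{E}_{\mathbf{n}}\!\left[
      \log_2 \sum_{k=1}^{M}
      \exp\!\left(
        \frac{\lVert \mathbf{n}\rVert^2
              - \lVert \mathbf{H}(\mathbf{x}_m-\mathbf{x}_k)+\mathbf{n}\rVert^2}{N_0}
      \right)
    \right].
```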

  15. Bitwise dimensional co-clustering for analytical workloads

    NARCIS (Netherlands)

    Baumann, Stephan; Boncz, Peter; Sattler, Kai Uwe

    2016-01-01

    Analytical workloads in data warehouses often include heavy joins where queries involve multiple fact tables in addition to the typical star-patterns, dimensional grouping and selections. In this paper we propose a new processing and storage framework called bitwise dimensional co-clustering (BDCC)

  16. Micro-Electromechanical Affinity Sensor for the Monitoring of Glucose in Bioprocess Media

    Directory of Open Access Journals (Sweden)

    Lorenz Theuer

    2017-06-01

    Full Text Available An affinity-viscometry-based micro-sensor probe for continuous glucose monitoring was investigated with respect to its suitability for bioprocesses. The sensor operates with glucose and dextran competing as binding partners for concanavalin A, while the viscosity of the assay scales with glucose concentration. Changes in viscosity are determined with a micro-electromechanical system (MEMS) in the measurement cavity of the sensor probe. The study aimed to elucidate the interactions between the assay and a typical phosphate buffered bacterial cultivation medium. It turned out that contact with the medium resulted in a significant long-lasting drift of the assay’s viscosity at zero glucose concentration. Adding glucose to the medium lowers the drift by a factor of eight. The glucose concentration (c_glc) values measured off-line with the glucose sensor for monitoring of a bacterial cultivation were similar to the measurements with an enzymatic assay, with a difference of less than ±0.15 g·L−1. We propose that lectin agglomeration, the electro-viscous effect, and constitutional changes of concanavalin A due to exchanges of the incorporated metal ions may account for the observed viscosity increase. The study has demonstrated the potential of the MEMS sensor to determine sensitive viscosity changes within very small sample volumes, which could be of interest for various biotechnological applications.

  17. Additive Biotech-Chances, challenges, and recent applications of additive manufacturing technologies in biotechnology.

    Science.gov (United States)

    Krujatz, Felix; Lode, Anja; Seidel, Julia; Bley, Thomas; Gelinsky, Michael; Steingroewer, Juliane

    2017-10-25

    The diversity and complexity of biotechnological applications are constantly increasing, with ever expanding ranges of production hosts, cultivation conditions and measurement tasks. Consequently, many analytical and cultivation systems for biotechnology and bioprocess engineering, such as microfluidic devices or bioreactors, are tailor-made to precisely satisfy the requirements of specific measurements or cultivation tasks. Additive manufacturing (AM) technologies offer the possibility of fabricating tailor-made 3D laboratory equipment directly from CAD designs with previously inaccessible levels of freedom in terms of structural complexity. This review discusses the historical background of these technologies, their most promising current implementations and the associated workflows, fabrication processes and material specifications, together with some of the major challenges associated with using AM in biotechnology/bioprocess engineering. To illustrate the great potential of AM, selected examples in microfluidic devices, 3D-bioprinting/biofabrication and bioprocess engineering are highlighted. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. The Ophidia Stack: Toward Large Scale, Big Data Analytics Experiments for Climate Change

    Science.gov (United States)

    Fiore, S.; Williams, D. N.; D'Anca, A.; Nassisi, P.; Aloisio, G.

    2015-12-01

    The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in multiple domains (e.g. climate change). It provides a "datacube-oriented" framework responsible for atomically processing and manipulating scientific datasets, by providing a common way to run distributive tasks on large set of data fragments (chunks). Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes. The project relies on a strong background on high performance database management and On-Line Analytical Processing (OLAP) systems to manage large scientific datasets. The Ophidia analytics platform provides several data operators to manipulate datacubes (about 50), and array-based primitives (more than 100) to perform data analysis on large scientific data arrays. To address interoperability, Ophidia provides multiple server interfaces (e.g. OGC-WPS). From a client standpoint, a Python interface enables the exploitation of the framework into Python-based eco-systems/applications (e.g. IPython) and the straightforward adoption of a strong set of related libraries (e.g. SciPy, NumPy). The talk will highlight a key feature of the Ophidia framework stack: the "Analytics Workflow Management System" (AWfMS). The Ophidia AWfMS coordinates, orchestrates, optimises and monitors the execution of multiple scientific data analytics and visualization tasks, thus supporting "complex analytics experiments". Some real use cases related to the CMIP5 experiment will be discussed. In particular, with regard to the "Climate models intercomparison data analysis" case study proposed in the EU H2020 INDIGO-DataCloud project, workflows related to (i) anomalies, (ii) trend, and (iii) climate change signal analysis will be presented. Such workflows will be distributed across multiple sites - according to the

  19. Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.

    Science.gov (United States)

    Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs

    2018-01-01

    While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance to European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
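    The core idea (computing a joint statistic without any party revealing its input) can be illustrated with additive secret sharing over a prime field; the toy example below is illustrative only and is not the protocol used in the pilots described.

```python
import random

PRIME = 2_147_483_647  # field modulus for additive secret sharing

def share(value, n_parties):
    """Split a value into n additive shares that sum to it modulo PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

# Three hospitals each hold a sensitive count (e.g. number of readmissions)
counts = [124, 87, 203]
n = len(counts)

# Each hospital distributes one share to every party; no single party sees a raw count
all_shares = [share(c, n) for c in counts]
partial_sums = [sum(all_shares[i][p] for i in range(n)) % PRIME for p in range(n)]

# Publishing only the partial sums reveals the total, not the individual inputs
print("joint total:", sum(partial_sums) % PRIME)   # 414
```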

  20. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    Science.gov (United States)

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  1. service line analytics in the new era.

    Science.gov (United States)

    Spence, Jay; Seargeant, Dan

    2015-08-01

    To succeed under the value-based business model, hospitals and health systems require effective service line analytics that combine inpatient and outpatient data and that incorporate quality metrics for evaluating clinical operations. When developing a framework for collection, analysis, and dissemination of service line data, healthcare organizations should focus on five key aspects of effective service line analytics: Updated service line definitions. Ability to analyze and trend service line net patient revenues by payment source. Access to accurate service line cost information across multiple dimensions with drill-through capabilities. Ability to redesign key reports based on changing requirements. Clear assignment of accountability.

  2. Addressing the Analytic Challenges of Cross-Sectional Pediatric Pneumonia Etiology Data.

    Science.gov (United States)

    Hammitt, Laura L; Feikin, Daniel R; Scott, J Anthony G; Zeger, Scott L; Murdoch, David R; O'Brien, Katherine L; Deloria Knoll, Maria

    2017-06-15

    Despite tremendous advances in diagnostic laboratory technology, identifying the pathogen(s) causing pneumonia remains challenging because the infected lung tissue cannot usually be sampled for testing. Consequently, to obtain information about pneumonia etiology, clinicians and researchers test specimens distant to the site of infection. These tests may lack sensitivity (eg, blood culture, which is only positive in a small proportion of children with pneumonia) and/or specificity (eg, detection of pathogens in upper respiratory tract specimens, which may indicate asymptomatic carriage or a less severe syndrome, such as upper respiratory infection). While highly sensitive nucleic acid detection methods and testing of multiple specimens improve sensitivity, multiple pathogens are often detected and this adds complexity to the interpretation as the etiologic significance of results may be unclear (ie, the pneumonia may be caused by none, one, some, or all of the pathogens detected). Some of these challenges can be addressed by adjusting positivity rates to account for poor sensitivity or incorporating test results from controls without pneumonia to account for poor specificity. However, no classical analytic methods can account for measurement error (ie, sensitivity and specificity) for multiple specimen types and integrate the results of measurements for multiple pathogens to produce an accurate understanding of etiology. We describe the major analytic challenges in determining pneumonia etiology and review how the common analytical approaches (eg, descriptive, case-control, attributable fraction, latent class analysis) address some but not all challenges. We demonstrate how these limitations necessitate a new, integrated analytical approach to pneumonia etiology data. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
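    One of the adjustments described, correcting an observed positivity rate for imperfect sensitivity and specificity, is the classical Rogan-Gladen estimator; the sketch below uses made-up numbers purely for illustration.

```python
def rogan_gladen(observed_prevalence, sensitivity, specificity):
    """True prevalence estimate from an observed test-positive proportion,
    correcting for imperfect sensitivity and specificity (Rogan-Gladen)."""
    corrected = (observed_prevalence + specificity - 1.0) / (sensitivity + specificity - 1.0)
    return min(max(corrected, 0.0), 1.0)   # clip to the [0, 1] range

# Illustrative example: 12% of specimens test positive with an assay that is
# 70% sensitive and 95% specific for true infection
print(rogan_gladen(0.12, sensitivity=0.70, specificity=0.95))  # ~0.108
```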

  3. Competing on analytics.

    Science.gov (United States)

    Davenport, Thomas H

    2006-01-01

    We all know the power of the killer app. It's not just a support tool; it's a strategic weapon. Companies questing for killer apps generally focus all their firepower on the one area that promises to create the greatest competitive advantage. But a new breed of organization has upped the stakes: Amazon, Harrah's, Capital One, and the Boston Red Sox have all dominated their fields by deploying industrial-strength analytics across a wide variety of activities. At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the few remaining points of differentiation--and analytics competitors wring every last drop of value from those processes. Employees hired for their expertise with numbers or trained to recognize their importance are armed with the best evidence and the best quantitative tools. As a result, they make the best decisions. In companies that compete on analytics, senior executives make it clear--from the top down--that analytics is central to strategy. Such organizations launch multiple initiatives involving complex data and statistical analysis, and quantitative activity is managed at the enterprise (not departmental) level. In this article, professor Thomas H. Davenport lays out the characteristics and practices of these statistical masters and describes some of the very substantial changes other companies must undergo in order to compete on quantitative turf. As one would expect, the transformation requires a significant investment in technology, the accumulation of massive stores of data, and the formulation of company-wide strategies for managing the data. But, at least as important, it also requires executives' vocal, unswerving commitment and willingness to change the way employees think, work, and are treated.

  4. Weyl Group Multiple Dirichlet Series Type A Combinatorial Theory (AM-175)

    CERN Document Server

    Brubaker, Ben; Friedberg, Solomon

    2011-01-01

    Weyl group multiple Dirichlet series are generalizations of the Riemann zeta function. Like the Riemann zeta function, they are Dirichlet series with analytic continuation and functional equations, having applications to analytic number theory. By contrast, these Weyl group multiple Dirichlet series may be functions of several complex variables and their groups of functional equations may be arbitrary finite Weyl groups. Furthermore, their coefficients are multiplicative up to roots of unity, generalizing the notion of Euler products. This book proves foundational results about these series an
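
    For orientation, the one-variable prototype being generalized here is the Riemann zeta function, whose Dirichlet series, Euler product, and functional equation (standard facts, not taken from the book itself) read:

```latex
\zeta(s) = \sum_{n \ge 1} n^{-s} = \prod_{p\ \text{prime}} \left(1 - p^{-s}\right)^{-1},
\qquad
\xi(s) = \pi^{-s/2}\,\Gamma\!\left(\tfrac{s}{2}\right)\zeta(s) = \xi(1-s).
```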

  5. A big data geospatial analytics platform - Physical Analytics Integrated Repository and Services (PAIRS)

    Science.gov (United States)

    Hamann, H.; Jimenez Marianno, F.; Klein, L.; Albrecht, C.; Freitag, M.; Hinds, N.; Lu, S.

    2015-12-01

    A big data geospatial analytics platform: Physical Analytics Information Repository and Services (PAIRS). Fernando Marianno, Levente Klein, Siyuan Lu, Conrad Albrecht, Marcus Freitag, Nigel Hinds, Hendrik Hamann; IBM TJ Watson Research Center, Yorktown Heights, NY 10598. A major challenge in leveraging big geospatial data sets is the ability to quickly integrate multiple data sources into physical and statistical models and to run these models in real time. A geospatial data platform called Physical Analytics Information Repository and Services (PAIRS) has been developed on top of an open-source hardware and software stack to manage terabytes of data. A new data interpolation and re-gridding scheme is implemented in which any geospatial data layer can be associated with a set of global grids whose resolution doubles for consecutive layers. Each pixel on the PAIRS grid has an index that is a combination of location and time stamp. This indexing allows quick access to data sets that are part of the global data layers while retrieving only the data of interest. PAIRS takes advantage of a parallel processing framework (Hadoop) in a cloud environment to ingest, curate, and analyze the data sets while remaining robust and stable. The data are stored in a distributed NoSQL database (HBase) across multiple servers; data upload and retrieval are parallelized, with the original analytics task broken up into smaller areas/volumes, analyzed independently, and then reassembled for the original geographical area. The differentiating aspect of PAIRS is the ability to accelerate model development across large geographical regions and spatial resolutions ranging from 0.1 m up to hundreds of kilometers. System performance is benchmarked on real-time automated data ingestion and retrieval of MODIS and Landsat data layers. The data layers are curated for sensor error, verified for correctness, and analyzed statistically to detect local anomalies. Multi-layer queries enable PAIRS to filter different data
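
    As a rough illustration of the kind of space-time keying described above (the function, grid convention, and constants are hypothetical, not the PAIRS implementation), a pixel key can combine a grid level whose resolution doubles per level, a discretized location, and a timestamp:

```python
def pixel_key(level, lat, lon, timestamp, base_res_deg=1.0):
    """Hypothetical sketch: build a space-time key for one pixel.

    The grid resolution halves (doubles in fineness) with each level,
    and the key combines level, discretized row/column, and time.
    """
    res = base_res_deg / (2 ** level)       # resolution doubles per consecutive level
    row = int((90.0 - lat) / res)           # discretized latitude
    col = int((lon + 180.0) / res)          # discretized longitude
    return (level, row, col, int(timestamp))

print(pixel_key(level=4, lat=41.2, lon=-73.8, timestamp=1_440_000_000))
```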

  6. Improved hybridization of Fuzzy Analytic Hierarchy Process (FAHP) algorithm with Fuzzy Multiple Attribute Decision Making - Simple Additive Weighting (FMADM-SAW)

    Science.gov (United States)

    Zaiwani, B. E.; Zarlis, M.; Efendi, S.

    2018-03-01

    In this research, an earlier hybridization of the Fuzzy Analytic Hierarchy Process (FAHP) with the Fuzzy Technique for Order Preference by Similarity to Ideal Solution (FTOPSIS), used to select the best bank chief inspector based on several qualitative and quantitative criteria with various priorities, is improved upon. To improve the performance of that work, a hybridization of the FAHP algorithm with Fuzzy Multiple Attribute Decision Making - Simple Additive Weighting (FMADM-SAW) was adopted, applying FAHP to the weighting process and SAW to the ranking process to determine employee promotion at a government institution. The improvement raised the average Efficiency Rate (ER) to 85.24%, compared with 77.82% in the previous research. Keywords: Ranking and Selection, Fuzzy AHP, Fuzzy TOPSIS, FMADM-SAW.
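
    As an illustrative sketch of the ranking half of such a scheme (not the authors' code), simple additive weighting reduces to a normalized weighted sum once criterion weights, for example from FAHP, are available; the scores and weights below are hypothetical:

```python
import numpy as np

def saw_rank(scores, weights):
    """Rank alternatives by Simple Additive Weighting (SAW).

    scores  : (n_alternatives, n_criteria) matrix of raw criterion values
              (benefit criteria: larger is better).
    weights : criterion weights, e.g. derived from FAHP, summing to 1.
    """
    scores = np.asarray(scores, dtype=float)
    weights = np.asarray(weights, dtype=float)
    normalized = scores / scores.max(axis=0)   # benefit-type normalization
    totals = normalized @ weights              # weighted sum per alternative
    ranking = np.argsort(-totals)              # indices, best alternative first
    return totals, ranking

# Hypothetical example: 3 employees scored on 4 criteria
totals, ranking = saw_rank(
    scores=[[80, 7, 9, 70], [75, 9, 8, 85], [90, 6, 7, 60]],
    weights=[0.4, 0.3, 0.2, 0.1],
)
print(totals, ranking)
```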

  7. Oscillations and Multiple Equilibria in Microvascular Blood Flow.

    Science.gov (United States)

    Karst, Nathaniel J; Storey, Brian D; Geddes, John B

    2015-07-01

    We investigate the existence of oscillatory dynamics and multiple steady-state flow rates in a network with a simple topology and in vivo microvascular blood flow constitutive laws. Unlike many previous analytic studies, we employ the most biologically relevant models of the physical properties of whole blood. Through a combination of analytic and numeric techniques, we predict in a series of two-parameter bifurcation diagrams a range of dynamical behaviors, including multiple equilibria flow configurations, simple oscillations in volumetric flow rate, and multiple coexistent limit cycles at physically realizable parameters. We show that complexity in network topology is not necessary for complex behaviors to arise and that nonlinear rheology, in particular the plasma skimming effect, is sufficient to support oscillatory dynamics similar to those observed in vivo.

  8. The emergence of Clostridium thermocellum as a high utility candidate for consolidated bioprocessing applications

    Directory of Open Access Journals (Sweden)

    Arthur eRagauskas

    2014-08-01

    Full Text Available First isolated in 1926, Clostridium thermocellum has recently received increased attention as a high utility candidate for use in consolidated bioprocessing applications. These applications, which seek to process lignocellulosic biomass directly into useful products such as ethanol, are gaining traction as economically feasible routes towards the production of fuel and other high value chemical compounds as the shortcomings of fossil fuels become evident. This review evaluates C. thermocellum’s role in this transitory process by highlighting recent discoveries relating to its genomic, transcriptomic, proteomic, and metabolomic responses to varying biomass sources, with a special emphasis placed on providing an overview of its unique, multivariate enzyme cellulosome complex and the role that this structure performs during biomass degradation. Both naturally evolved and genetically engineered strains are examined in light of their unique attributes and responses to various biomass treatment conditions, and the genetic tools that have been employed for their creation are presented. Several future routes for potential industrial usage are presented, and it is concluded that, although there have been many advances to significantly improve C. thermocellum’s amenability to industrial use, several hurdles still remain to be overcome as this unique organism enjoys increased attention within the scientific community.

  9. Nonlinear ordinary differential equations analytical approximation and numerical methods

    CERN Document Server

    Hermann, Martin

    2016-01-01

    The book discusses the solutions to nonlinear ordinary differential equations (ODEs) using analytical and numerical approximation methods. Recently, analytical approximation methods have been largely used in solving linear and nonlinear lower-order ODEs. It also discusses using these methods to solve some strong nonlinear ODEs. There are two chapters devoted to solving nonlinear ODEs using numerical methods, as in practice high-dimensional systems of nonlinear ODEs that cannot be solved by analytical approximate methods are common. Moreover, it studies analytical and numerical techniques for the treatment of parameter-depending ODEs. The book explains various methods for solving nonlinear-oscillator and structural-system problems, including the energy balance method, harmonic balance method, amplitude frequency formulation, variational iteration method, homotopy perturbation method, iteration perturbation method, homotopy analysis method, simple and multiple shooting method, and the nonlinear stabilized march...

  10. An analytic method for S-expansion involving resonance and reduction

    Energy Technology Data Exchange (ETDEWEB)

    Ipinza, M.C.; Penafiel, D.M. [Departamento de Fisica, Universidad de Concepcion (Chile); DISAT, Politecnico di Torino (Italy); Istituto Nazionale di Fisica Nucleare (INFN), Sezione di Torino (Italy); Lingua, F. [DISAT, Politecnico di Torino (Italy); Ravera, L. [DISAT, Politecnico di Torino (Italy); Istituto Nazionale di Fisica Nucleare (INFN), Sezione di Torino (Italy)

    2016-11-15

    In this paper we describe an analytic method able to give the multiplication table(s) of the set(s) involved in an S-expansion process (with either resonance or 0_S-resonant-reduction) for reaching a target Lie (super)algebra from a starting one, after having properly chosen the partitions over subspaces of the considered (super)algebras. This analytic method gives us a simple set of expressions to find the subset decomposition of the set(s) involved in the process. Then, we use the information coming from both the initial (super)algebra and the target one for reaching the multiplication table(s) of the mentioned set(s). Finally, we check associativity with an auxiliary computational algorithm, in order to understand whether the obtained set(s) can describe semigroup(s) or just abelian set(s) connecting two (super)algebras. We also give some interesting examples of application, which check and corroborate our analytic procedure and also generalize some result already presented in the literature. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  11. Mediation Analysis with Multiple Mediators

    OpenAIRE

    VanderWeele, T.J.; Vansteelandt, S.

    2014-01-01

    Recent advances in the causal inference literature on mediation have extended traditional approaches to direct and indirect effects to settings that allow for interactions and non-linearities. In this paper, these approaches from causal inference are further extended to settings in which multiple mediators may be of interest. Two analytic approaches, one based on regression and one based on weighting are proposed to estimate the effect mediated through multiple mediators and the effects throu...

  12. An analytical solution for improved HIFU SAR estimation

    International Nuclear Information System (INIS)

    Dillon, C R; Vyas, U; Christensen, D A; Roemer, R B; Payne, A

    2012-01-01

    Accurate determination of the specific absorption rates (SARs) present during high intensity focused ultrasound (HIFU) experiments and treatments provides a solid physical basis for scientific comparison of results among HIFU studies and is necessary to validate and improve SAR predictive software, which will improve patient treatment planning, control and evaluation. This study develops and tests an analytical solution that significantly improves the accuracy of SAR values obtained from HIFU temperature data. SAR estimates are obtained by fitting the analytical temperature solution for a one-dimensional radial Gaussian heating pattern to the temperature versus time data following a step in applied power and evaluating the initial slope of the analytical solution. The analytical method is evaluated in multiple parametric simulations for which it consistently (except at high perfusions) yields maximum errors of less than 10% at the center of the focal zone compared with errors up to 90% and 55% for the commonly used linear method and an exponential method, respectively. For high perfusion, an extension of the analytical method estimates SAR with less than 10% error. The analytical method is validated experimentally by showing that the temperature elevations predicted using the analytical method's SAR values determined for the entire 3D focal region agree well with the experimental temperature elevations in a HIFU-heated tissue-mimicking phantom. (paper)
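
    For context, the simple initial-slope ("linear") estimate that the analytical method improves upon can be sketched as follows; the specific heat value and the temperature trace are assumed for illustration only:

```python
import numpy as np

# Illustrative sketch of the basic "linear" SAR estimate the abstract compares
# against: SAR ~ c * dT/dt evaluated just after the power step, with no
# perfusion or conduction correction. All values below are hypothetical.
c_tissue = 3600.0                        # specific heat capacity, J/(kg*K) (assumed)
t = np.linspace(0.0, 2.0, 21)            # s, first 2 s after the power step
T_rise = 0.8 * t - 0.05 * t**2           # measured temperature rise, K (synthetic)

slope = np.polyfit(t[:5], T_rise[:5], 1)[0]   # K/s, fit over the earliest samples
sar_estimate = c_tissue * slope               # W/kg
print(f"initial slope = {slope:.3f} K/s, SAR ~ {sar_estimate:.0f} W/kg")
```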

  13. Seamless Digital Environment – Data Analytics Use Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-08-01

    Multiple research efforts in the U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program study the need for, and design of, an underlying architecture to support the increased amount and use of data in the nuclear power plant. More specifically, the LWRS research efforts on Digital Architecture for an Automated Plant, Automated Work Packages, Computer-Based Procedures for Field Workers, and Online Monitoring have all identified the need for a digital architecture and, more importantly, for a Seamless Digital Environment (SDE). An SDE provides a means to access multiple applications, gather the data points needed, conduct the requested analysis, and present the result to the user with minimal or no effort by the user. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting, the nuclear utilities identified the need for research focused on data analytics. The effort was to develop and evaluate use cases for data mining and analytics employing information from plant sensors and databases for use in developing improved business analytics. The goal of the study is to research potential approaches to building an analytics solution for equipment reliability, on a small scale, focusing on either a single piece of equipment or a single system. The analytics solution will likely consist of a data integration layer, a predictive and machine learning layer, and a user interface layer that displays the output of the analysis in a straightforward, easy-to-consume manner. This report describes the use case study initiated by NITSL and conducted in collaboration between Idaho National Laboratory, Arizona Public Service - Palo Verde Nuclear Generating Station, and NextAxiom Inc.

  14. Bio-Refineries Bioprocess Technologies for Waste-Water Treatment, Energy and Product Valorization

    Science.gov (United States)

    Keith Cowan, A.

    2010-04-01

    Increasing pressure is being exerted on communities and nations to source energy from forms other than fossil fuels. Also, potable water is becoming a scarce resource in many parts of the world, and there remains a large divide in the demand for and utilization of plant products derived from genetically modified organisms (GMOs) and non-GMOs. The most extensive user and manager of terrestrial ecosystems is agriculture, which is also the de facto steward of natural resources. As stated by Miller (2008), no other industry or institution comes close to the comparative advantage held for this vital responsibility while simultaneously providing food, fiber, and other biology-based products, including energy. Since modern commercial agriculture is transitioning from the production of bulk commodities to the provision of standardized products and specific-attribute raw materials for differentiated markets, we can argue that processes such as mass cultivation of microalgae and the concept of bio-refineries should be seen as part of a `new' agronomy. EBRU is currently exploring the integration of bioprocess technologies using microalgae as biocatalysts to achieve waste-water treatment, water polishing and endocrine disruptor (EDC) removal, sustainable energy production, and exploitation of the resultant biomass in agriculture as foliar fertilizer and seed coatings, and for commercial extraction of bulk commodities such as bio-oils and lecithin. This presentation will address efforts to establish a fully operational solar-driven microalgae bio-refinery for use not only in waste remediation but also to transform waste and biomass into energy, fuels, and other useful materials (valorisation), with particular focus on environmental quality and sustainability goals.

  15. Bioprocessing papaya processing waste for potential aquaculture feed supplement--economic and nutrient analysis with shrimp feeding trial.

    Science.gov (United States)

    Kang, H Y; Yang, P Y; Dominy, W G; Lee, C S

    2010-10-01

    Papaya processing waste (PPW), a major fruit processing waste in the Hawaiian Islands, served as a substrate for yeast (Saccharomyces cerevisiae) growth. The fermented PPW products, containing 45% crude protein together with fat, fiber, lignin, cellulose, and minerals, offered nutritional advantages over yeast alone. Three experimental diets formulated at 35% protein with different inclusion levels of the PPW products, along with a commercial control diet, were fed to shrimp for 8 weeks. The diets with 50% inclusion of PPW were comparable to the commercial feed in weight, growth, feed conversion ratio (FCR), and survival rate. Such a bioprocess treatment system would be economically feasible with control of the annual cost and an increase in the amount of PPW treated. The selling price of the PPW products and the annual operation and maintenance cost were the most influential factors for additional profit. This study presents a promising alternative for environmentally friendly treatment of organic wastes as well as for the sustainability of local agriculture and aquaculture industries. Copyright © 2010 Elsevier Ltd. All rights reserved.

  16. Semi-Analytical Benchmarks for MCNP6

    Energy Technology Data Exchange (ETDEWEB)

    Grechanuk, Pavel Aleksandrovi [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-11-07

    Code verification is an extremely important process that involves proving or disproving the validity of code algorithms by comparing them against analytical results of the underlying physics or mathematical theory on which the code is based. Monte Carlo codes such as MCNP6 must undergo verification and testing upon every release to ensure that the codes are properly simulating nature. Specifically, MCNP6 has multiple sets of problems with known analytic solutions that are used for code verification. Monte Carlo codes primarily specify either current boundary sources or a volumetric fixed source, either of which can be very complicated functions of space, energy, direction and time. Thus, most of the challenges with modeling analytic benchmark problems in Monte Carlo codes come from identifying the correct source definition to properly simulate the correct boundary conditions. The problems included in this suite all deal with mono-energetic neutron transport without energy loss, in a homogeneous material. The variables that differ between the problems are source type (isotropic/beam), medium dimensionality (infinite/semi-infinite), etc.

  17. At-line process analytical technology (PAT) for more efficient scale up of biopharmaceutical microfiltration unit operations.

    Science.gov (United States)

    Watson, Douglas S; Kerchner, Kristi R; Gant, Sean S; Pedersen, Joseph W; Hamburger, James B; Ortigosa, Allison D; Potgieter, Thomas I

    2016-01-01

    Tangential flow microfiltration (MF) is a cost-effective and robust bioprocess separation technique, but successful full scale implementation is hindered by the empirical, trial-and-error nature of scale-up. We present an integrated approach leveraging at-line process analytical technology (PAT) and mass balance based modeling to de-risk MF scale-up. Chromatography-based PAT was employed to improve the consistency of an MF step that had been a bottleneck in the process used to manufacture a therapeutic protein. A 10-min reverse phase ultra high performance liquid chromatography (RP-UPLC) assay was developed to provide at-line monitoring of protein concentration. The method was successfully validated and method performance was comparable to previously validated methods. The PAT tool revealed areas of divergence from a mass balance-based model, highlighting specific opportunities for process improvement. Adjustment of appropriate process controls led to improved operability and significantly increased yield, providing a successful example of PAT deployment in the downstream purification of a therapeutic protein. The general approach presented here should be broadly applicable to reduce risk during scale-up of filtration processes and should be suitable for feed-forward and feed-back process control. © 2015 American Institute of Chemical Engineers.
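
    A textbook constant-volume diafiltration mass balance gives a feel for the kind of model such an approach compares measurements against (this is a generic washout relation, not necessarily the authors' model; the sieving coefficient below is hypothetical):

```python
import numpy as np

# Constant-volume diafiltration: with apparent sieving coefficient S, the
# fraction of product remaining in the retentate after N diavolumes is
# exp(-N*S), so the cumulative yield collected in the permeate is 1 - exp(-N*S).
S = 0.8                                  # assumed product sieving coefficient
N = np.arange(0, 7)                      # number of diavolumes
yield_permeate = 1.0 - np.exp(-N * S)
for n, y in zip(N, yield_permeate):
    print(f"{n} diavolumes -> cumulative permeate yield {y:.1%}")
```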

  18. Next generation industrial biotechnology based on extremophilic bacteria.

    Science.gov (United States)

    Chen, Guo-Qiang; Jiang, Xiao-Ran

    2018-04-01

    Industrial biotechnology aims to produce bulk chemicals, including polymeric materials and biofuels, by bioprocessing sustainable agricultural products such as starch, fatty acids and/or cellulose. However, traditional bioprocesses require bioreactors made of stainless steel, complicated sterilization, difficult and expensive separation procedures, and well-trained engineers able to conduct bioprocessing under sterile conditions, reducing the competitiveness of the bio-products. Amid continued low petroleum prices, next generation industrial biotechnology (NGIB) allows bioprocessing to be conducted under unsterile (open) conditions using ceramic, cement or plastic bioreactors in a continuous way; it should therefore be an energy-, water- and substrate-saving technology with convenient operating procedures. NGIB also requires less capital investment and reduces the demand for highly trained engineers. The foundation of the simplified NGIB is microorganisms that resist contamination by other microbes; one example is rapidly growing halophilic bacteria inoculated at high salt concentration and alkaline pH. They have been engineered to produce multiple products at various scales. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Mathematical Modeling of Loop Heat Pipes with Multiple Capillary Pumps and Multiple Condensers. Part 1: Steady State Simulations

    Science.gov (United States)

    Hoang, Triem T.; OConnell, Tamara; Ku, Jentung

    2004-01-01

    Loop Heat Pipes (LHPs) have proven themselves as reliable and robust heat transport devices for spacecraft thermal control systems. So far, the LHPs in earth-orbit satellites have performed very well, as expected. Conventional LHPs usually consist of a single capillary pump for heat acquisition and a single condenser for heat rejection. Multiple pump/multiple condenser LHPs have been shown to function very well in ground testing. Nevertheless, the test results of a dual pump/condenser LHP also revealed that it behaved in a complicated manner due to the interaction between the pumps and condensers. Needless to say, more research is needed before they are ready for 0-g deployment. One research area that perhaps compels immediate attention is the analytical modeling of LHPs, particularly the transient phenomena. Modeling a single pump/single condenser LHP is difficult enough. Only a handful of computer codes are available for both steady state and transient simulations of conventional LHPs. No previous effort was made to develop an analytical model (or even a complete theory) to predict the operational behavior of multiple pump/multiple condenser LHP systems. The current research project offers a basic theory of multiple pump/multiple condenser LHP operation. From it, a computer code was developed to predict the LHP saturation temperature in accordance with the system operating and environmental conditions.

  20. Waste minimization in analytical methods

    International Nuclear Information System (INIS)

    Green, D.W.; Smith, L.L.; Crain, J.S.; Boparai, A.S.; Kiely, J.T.; Yaeger, J.S.; Schilling, J.B.

    1995-01-01

    The US Department of Energy (DOE) will require a large number of waste characterizations over a multi-year period to accomplish the Department's goals in environmental restoration and waste management. Estimates vary, but two million analyses annually are expected. The waste generated by the analytical procedures used for characterizations is a significant source of new DOE waste. Success in reducing the volume of secondary waste and the costs of handling this waste would significantly decrease the overall cost of this DOE program. Selection of appropriate analytical methods depends on the intended use of the resultant data. It is not always necessary to use a high-powered analytical method, typically at higher cost, to obtain data needed to make decisions about waste management. Indeed, for samples taken from some heterogeneous systems, the meaning of high accuracy becomes clouded if the data generated are intended to measure a property of this system. Among the factors to be considered in selecting the analytical method are the lower limit of detection, accuracy, turnaround time, cost, reproducibility (precision), interferences, and simplicity. Occasionally, there must be tradeoffs among these factors to achieve the multiple goals of a characterization program. The purpose of the work described here is to add waste minimization to the list of characteristics to be considered. In this paper the authors present results of modifying analytical methods for waste characterization to reduce both the cost of analysis and volume of secondary wastes. Although tradeoffs may be required to minimize waste while still generating data of acceptable quality for the decision-making process, they have data demonstrating that wastes can be reduced in some cases without sacrificing accuracy or precision

  1. Data analytics in the ATLAS Distributed Computing

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration; Bryant, Lincoln

    2015-01-01

    The ATLAS data analytics effort is focused on creating systems which provide the ATLAS ADC with new capabilities for understanding distributed systems and overall operational performance. These capabilities include: warehousing information from multiple systems (the production and distributed analysis system - PanDA, the distributed data management system - Rucio, the file transfer system, various monitoring services, etc.); providing a platform to execute arbitrary data mining and machine learning algorithms over aggregated data; satisfying a variety of use cases for different user roles; and hosting new third-party analytics services on a scalable compute platform. We describe the implemented system, where: data sources are existing RDBMS (Oracle) and Flume collectors; a Hadoop cluster is used to store the data; native Hadoop and Apache Pig scripts are used for data aggregation; and R is used for in-depth analytics. Part of the data is indexed in ElasticSearch so both simpler investigations and complex dashboards can be made ...

  2. Multiple Site Damage in Flat Panel Testing

    National Research Council Canada - National Science Library

    Shrage, Daniel

    2000-01-01

    This report aimed to experimentally verify analytical models that predict the residual strength of representative aircraft structures, such as wide panels, that are subjected to Multiple Site Damage (MSD...

  3. Detection of sensor failures in nuclear plants using analytic redundancy

    International Nuclear Information System (INIS)

    Kitamura, M.

    1980-01-01

    A method for on-line, nonperturbative detection and identification of sensor failures in nuclear power plants was studied to determine its feasibility. This method is called analytic redundancy, or functional redundancy. Sensor failure has traditionally been detected by comparing multiple signals from redundant sensors, such as in two-out-of-three logic. In analytic redundancy, with the help of an assumed model of the physical system, the signals from a set of sensors are processed to reproduce the signals from all system sensors
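
    A minimal sketch of the idea (hypothetical sensor values, thresholds, and model estimates): each measurement is compared against a value reconstructed from a process model rather than against a redundant hardware channel, and large residuals flag suspect sensors:

```python
import numpy as np

def check_sensors(measured, model_predicted, thresholds):
    """Flag sensors whose readings deviate from model-based estimates.

    Minimal sketch of the analytic-redundancy idea: instead of comparing
    redundant hardware channels, each measurement is compared against a
    value reconstructed from the other signals via an assumed plant model.
    """
    residuals = np.abs(np.asarray(measured) - np.asarray(model_predicted))
    return residuals > np.asarray(thresholds)   # True where a failure is suspected

# Hypothetical steady-state example: three plant sensors
measured  = [550.2, 15.1, 72.4]        # e.g. temperature, pressure, flow readings
predicted = [549.8, 15.0, 64.0]        # estimates from an assumed plant model
flags = check_sensors(measured, predicted, thresholds=[2.0, 0.5, 3.0])
print(flags)                           # -> [False False  True]: third sensor inconsistent
```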

  4. Fast analytical scatter estimation using graphics processing units.

    Science.gov (United States)

    Ingleby, Harry; Lippuner, Jonas; Rickey, Daniel W; Li, Yue; Elbakri, Idris

    2015-01-01

    To develop a fast patient-specific analytical estimator of first-order Compton and Rayleigh scatter in cone-beam computed tomography, implemented using graphics processing units. The authors developed an analytical estimator for first-order Compton and Rayleigh scatter in a cone-beam computed tomography geometry. The estimator was coded using NVIDIA's CUDA environment for execution on an NVIDIA graphics processing unit. Performance of the analytical estimator was validated by comparison with high-count Monte Carlo simulations for two different numerical phantoms. Monoenergetic analytical simulations were compared with monoenergetic and polyenergetic Monte Carlo simulations. Analytical and Monte Carlo scatter estimates were compared both qualitatively, from visual inspection of images and profiles, and quantitatively, using a scaled root-mean-square difference metric. Reconstruction of simulated cone-beam projection data of an anthropomorphic breast phantom illustrated the potential of this method as a component of a scatter correction algorithm. The monoenergetic analytical and Monte Carlo scatter estimates showed very good agreement. The monoenergetic analytical estimates showed good agreement for Compton single scatter and reasonable agreement for Rayleigh single scatter when compared with polyenergetic Monte Carlo estimates. For a voxelized phantom with dimensions 128 × 128 × 128 voxels and a detector with 256 × 256 pixels, the analytical estimator required 669 seconds for a single projection, using a single NVIDIA 9800 GX2 video card. Accounting for first order scatter in cone-beam image reconstruction improves the contrast to noise ratio of the reconstructed images. The analytical scatter estimator, implemented using graphics processing units, provides rapid and accurate estimates of single scatter and with further acceleration and a method to account for multiple scatter may be useful for practical scatter correction schemes.

  5. Process relevant screening of cellulolytic organisms for consolidated bioprocessing.

    Science.gov (United States)

    Antonov, Elena; Schlembach, Ivan; Regestein, Lars; Rosenbaum, Miriam A; Büchs, Jochen

    2017-01-01

    Although the biocatalytic conversion of cellulosic biomass could replace fossil oil for the production of various compounds, it is often not economically viable due to the high costs of cellulolytic enzymes. One possibility to reduce costs is consolidated bioprocessing (CBP), integrating cellulase production, hydrolysis of cellulose, and the fermentation of the released sugars to the desired product into one process step. To establish such a process, the most suitable cellulase-producing organism has to be identified. Thereby, it is crucial to evaluate the candidates under target process conditions. In this work, the chosen model process was the conversion of cellulose to the platform chemical itaconic acid by a mixed culture of a cellulolytic fungus with Aspergillus terreus as itaconic acid producer. Various cellulase producers were analyzed by the introduced freeze assay that measures the initial carbon release rate, quantifying initial cellulase activity under target process conditions. Promising candidates were then characterized online by monitoring their respiration activity metabolizing cellulose to assess the growth and enzyme production dynamics. The screening of five different cellulase producers with the freeze assay identified Trichoderma reesei and Penicillium verruculosum as most promising. The measurement of the respiration activity revealed a retarded induction of cellulase production for P. verruculosum but a similar cellulase production rate afterwards, compared to T. reesei. The freeze assay measurement depicted that P. verruculosum reaches the highest initial carbon release rate among all investigated cellulase producers. After a modification of the cultivation procedure, these results were confirmed by the respiration activity measurement. To compare both methods, a correlation between the measured respiration activity and the initial carbon release rate of the freeze assay was introduced. The analysis revealed that the

  6. Nationwide Multicenter Reference Interval Study for 28 Common Biochemical Analytes in China.

    Science.gov (United States)

    Xia, Liangyu; Chen, Ming; Liu, Min; Tao, Zhihua; Li, Shijun; Wang, Liang; Cheng, Xinqi; Qin, Xuzhen; Han, Jianhua; Li, Pengchang; Hou, Li'an; Yu, Songlin; Ichihara, Kiyoshi; Qiu, Ling

    2016-03-01

    A nationwide multicenter study was conducted in China to explore sources of variation of reference values and establish reference intervals for 28 common biochemical analytes, as part of the International Federation of Clinical Chemistry and Laboratory Medicine, Committee on Reference Intervals and Decision Limits (IFCC/C-RIDL) global study on reference values. A total of 3148 apparently healthy volunteers were recruited in 6 cities covering a wide area of China. Blood samples were tested in 2 central laboratories using Beckman Coulter AU5800 chemistry analyzers. Certified reference materials and a value-assigned serum panel were used for standardization of test results. Multiple regression analysis was performed to explore sources of variation. The need for partition of reference intervals was evaluated based on 3-level nested ANOVA. After secondary exclusion using the latent abnormal values exclusion method, reference intervals were derived by a parametric method using the modified Box-Cox formula. Test results of 20 analytes were made traceable to reference measurement procedures. By the ANOVA, significant sex-related and age-related differences were observed in 12 and 12 analytes, respectively. A small regional difference was observed in the results for albumin, glucose, and sodium. Multiple regression analysis revealed BMI-related changes in the results of 9 analytes for men and 6 for women. Reference intervals of 28 analytes were computed, with 17 analytes partitioned by sex and/or age. In conclusion, reference intervals of 28 common chemistry analytes applicable to the Chinese Han population were established by use of the latest methodology. Reference intervals of 20 analytes traceable to reference measurement procedures can be used as common reference intervals, whereas the others can be used as assay system-specific reference intervals in China.
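
    As a rough sketch of the parametric step (using SciPy's standard Box-Cox fit on synthetic data, rather than the modified Box-Cox formula and latent abnormal values exclusion used in the study), a central 95% reference interval can be derived as follows:

```python
import numpy as np
from scipy import stats, special

# Synthetic, right-skewed "analyte" results; real reference data would be used here.
rng = np.random.default_rng(0)
values = rng.lognormal(mean=1.0, sigma=0.25, size=500)

transformed, lmbda = stats.boxcox(values)        # Box-Cox transform, lambda by max. likelihood
mean, sd = transformed.mean(), transformed.std(ddof=1)
lower, upper = special.inv_boxcox(
    np.array([mean - 1.96 * sd, mean + 1.96 * sd]), lmbda
)                                                # back-transform the central 95% limits
print(f"central 95% reference interval: {lower:.2f} - {upper:.2f}")
```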

  7. Analytic expressions for the construction of a fire event PSA model

    International Nuclear Information System (INIS)

    Kang, Dae Il; Kim, Kil Yoo; Kim, Dong San; Hwang, Mee Jeong; Yang, Joon Eon

    2016-01-01

    In this study, the changing process of an internal event PSA model to a fire event PSA model is analytically presented and discussed. Many fire PSA models have fire-induced initiating event fault trees not shown in an internal event PSA model. Fire-induced initiating fault tree models are developed for addressing multiple initiating event issues. A single fire event within a fire compartment or fire scenario can cause multiple initiating events. As an example, a fire in a turbine building area can cause a loss of the main feed-water and loss of off-site power initiating events. Up to now, there has been no analytic study on the construction of a fire event PSA model using an internal event PSA model with fault trees of initiating events. In this paper, the changing process of an internal event PSA model to a fire event PSA model was analytically presented and discussed. The results of this study show that additional cutsets can be obtained if the fault trees of initiating events for a fire event PSA model are not exactly developed.

  8. Analytic expressions for the construction of a fire event PSA model

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Kim, Kil Yoo; Kim, Dong San; Hwang, Mee Jeong; Yang, Joon Eon [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    In this study, the changing process of an internal event PSA model to a fire event PSA model is analytically presented and discussed. Many fire PSA models have fire-induced initiating event fault trees not shown in an internal event PSA model. Fire-induced initiating fault tree models are developed for addressing multiple initiating event issues. A single fire event within a fire compartment or fire scenario can cause multiple initiating events. As an example, a fire in a turbine building area can cause a loss of the main feed-water and loss of off-site power initiating events. Up to now, there has been no analytic study on the construction of a fire event PSA model using an internal event PSA model with fault trees of initiating events. In this paper, the changing process of an internal event PSA model to a fire event PSA model was analytically presented and discussed. The results of this study show that additional cutsets can be obtained if the fault trees of initiating events for a fire event PSA model are not exactly developed.

  9. Microbial ecology of fermentative hydrogen producing bioprocesses: useful insights for driving the ecosystem function.

    Science.gov (United States)

    Cabrol, Lea; Marone, Antonella; Tapia-Venegas, Estela; Steyer, Jean-Philippe; Ruiz-Filippi, Gonzalo; Trably, Eric

    2017-03-01

    One of the most important biotechnological challenges is to develop environmentally friendly technologies to produce new sources of energy. Microbial production of biohydrogen through dark fermentation, by conversion of residual biomass, is an attractive solution for short-term development of bioH2 producing processes. Efficient biohydrogen production relies on complex mixed communities working in tight interaction. Species composition and functional traits are of crucial importance to maintain the ecosystem service. Analysis of the microbial communities has revealed a wide phylogenetic diversity that contributes in different, and still mostly unclear, ways to hydrogen production. Bridging this gap of knowledge between microbial ecology features and ecosystem functionality is essential to optimize the bioprocess and develop strategies toward a maximization of the efficiency and stability of substrate conversion. The aim of this review is to provide a comprehensive overview of the most up-to-date biodata available and discuss the main microbial community features of biohydrogen engineered ecosystems, with a special emphasis on the crucial role of interactions and the relationships between species composition and ecosystem service. The elucidation of intricate relationships between community structure and ecosystem function would make it possible to drive ecosystems toward an improved functionality on the basis of microbial ecology principles. © FEMS 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  10. Enhanced Bioethanol Production from Potato Peel Waste Via Consolidated Bioprocessing with Statistically Optimized Medium.

    Science.gov (United States)

    Hossain, Tahmina; Miah, Abdul Bathen; Mahmud, Siraje Arif; Mahin, Abdullah-Al-

    2018-04-12

    In this study, an extensive screening was undertaken to isolate amylolytic microorganisms capable of producing bioethanol from starchy biomass through Consolidated Bioprocessing (CBP). A total of 28 amylolytic microorganisms were isolated, from which 5 isolates were selected based on high α-amylase and glucoamylase activities and identified as Candida wangnamkhiaoensis, Hyphopichia pseudoburtonii (2 isolates), Wickerhamia sp., and Streptomyces drozdowiczii based on 26S rDNA and 16S rDNA sequencing. Wickerhamia sp. showed the highest ethanol production (30.4 g/L) with a fermentation yield of 0.3 g ethanol/g starch. Then, a low-cost starchy waste, potato peel waste (PPW), was used as a carbon source to produce ethanol by Wickerhamia sp. Finally, in order to obtain maximum ethanol production from PPW, a fermentation medium was statistically designed. The effect of the various medium ingredients was evaluated initially by Plackett-Burman design (PBD), where malt extract, tryptone, and KH2PO4 showed a significantly positive effect (p value < 0.05). Using Response Surface Modeling (RSM), 40 g/L (dry basis) PPW and 25 g/L malt extract were found optimum and yielded 21.7 g/L ethanol. This study strongly suggests Wickerhamia sp. as a promising candidate for bioethanol production from starchy biomass, in particular PPW, through CBP.

  11. Multiplication: From Thales to Lie

    Indian Academy of Sciences (India)

    Addition. To describe the geometric constructions of addition, as ..... general, we could apply the implicit function theorem of calculus to solve locally the defining ... and whose multiplication and inverse are analytic maps, is called a Lie group.

  12. Description of JNC's analytical method and its performance for FBR cores

    International Nuclear Information System (INIS)

    Ishikawa, M.

    2000-01-01

    The description of JNC's analytical method and its performance for FBR cores includes: an outline of JNC's Analytical System Compared with ERANOS; a standard data base for FBR Nuclear Design in JNC; JUPITER Critical Experiment; details of Analytical Method and Its Effects on JUPITER; performance of JNC Analytical System (effective multiplication factor k-eff, control rod worth, and sodium void reactivity); design accuracy of a 600 MWe-class FBR Core. JNC developed a consistent analytical system for FBR core evaluation, based on JENDL library, f-table method, and three dimensional diffusion/transport theory, which includes comprehensive sensitivity tools to improve the prediction accuracy of core parameters. JNC system was verified by analysis of JUPITER critical experiment, and other facilities. Its performance can be judged quite satisfactory for FBR-core design work, though there is room for further improvement, such as more detailed treatment of cross-section resonance regions

  13. Theoretical analysis of multiple quantum-well, slow-light devices under applied external fields using a fully analytical model in fractional dimension

    Energy Technology Data Exchange (ETDEWEB)

    Kohandani, R; Kaatuzian, H [Photonics Research Laboratory, Electrical Engineering Department, AmirKabir University of Technology, Hafez Ave., Tehran (Iran, Islamic Republic of)

    2015-01-31

    We report a theoretical study of optical properties of AlGaAs/GaAs multiple quantum-well (MQW), slow-light devices based on excitonic population oscillations under applied external magnetic and electric fields using an analytical model for complex dielectric constant of Wannier excitons in fractional dimension. The results are shown for quantum wells (QWs) of different width. The significant characteristics of the exciton in QWs such as exciton energy and exciton oscillator strength (EOS) can be varied by application of external magnetic and electric fields. It is found that a higher bandwidth and an appropriate slow-down factor (SDF) can be achieved by changing the QW width during the fabrication process and by applying magnetic and electric fields during device functioning, respectively. It is shown that a SDF of 10^5 is obtained at best. (slowing of light)

  14. An analytical turn-on power loss model for 650-V GaN eHEMTs

    DEFF Research Database (Denmark)

    Shen, Yanfeng; Wang, Huai; Shen, Zhan

    2018-01-01

    This paper proposes an improved analytical turn-on power loss model for 650-V GaN eHEMTs. The static characteristics, i.e., the parasitic capacitances and transconductance, are firstly modeled. Then the turn-on process is divided into multiple stages and analyzed in detail; as a result, time-domain solutions for the drain-source voltage and drain current are obtained. Finally, double-pulse tests are conducted to verify the proposed power loss model. This analytical model enables accurate and fast switching behavior characterization and power loss prediction.
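
    As a generic illustration of how staged, time-domain turn-on waveforms translate into a loss figure (not the paper's closed-form model; the waveforms below are synthetic piecewise-linear stand-ins), the turn-on energy is the integral of the drain-source voltage times the drain current over the transition:

```python
import numpy as np

t = np.linspace(0.0, 40e-9, 401)                             # 40 ns turn-on window (assumed)
i_d = np.clip(t / 10e-9, 0.0, 1.0) * 20.0                    # stage 1: current ramps to 20 A in 10 ns
v_ds = 400.0 * np.clip(1.0 - (t - 10e-9) / 20e-9, 0.0, 1.0)  # stage 2: 400 V falls over the next 20 ns

p = v_ds * i_d                                               # instantaneous power, W
dt = t[1] - t[0]
e_on = np.sum(0.5 * (p[:-1] + p[1:]) * dt)                   # trapezoidal integration, J
print(f"E_on ~ {e_on * 1e6:.1f} uJ")
```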

  15. Consolidated bioprocessing of microalgal biomass to carboxylates by a mixed culture of cow rumen bacteria using anaerobic sequencing batch reactor (ASBR).

    Science.gov (United States)

    Zhao, Baisuo; Liu, Jie; Frear, Craig; Holtzapple, Mark; Chen, Shulin

    2016-12-01

    This study employed mixed-culture consolidated bioprocessing (CBP) to digest microalgal biomass in an anaerobic sequencing batch reactor (ASBR). The primary objectives are to evaluate the impact of hydraulic residence time (HRT) on the productivity of carboxylic acids and to characterize the bacterial community. HRT affects the production rate and patterns of carboxylic acids. For the 5-L laboratory-scale fermentation, a 12-day HRT was selected because it offered the highest productivity of carboxylic acids and it synthesized longer chains. The variability of the bacterial community increased with longer HRT (R 2 =0.85). In the 5-L laboratory-scale fermentor, the most common phyla were Firmicutes (58.3%), Bacteroidetes (27.4%), and Proteobacteria (11.9%). The dominant bacterial classes were Clostridia (29.8%), Bacteroidia (27.4%), Tissierella (26.2%), and Betaproteobacteria (8.9%). Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. High-precision improved-analytic-exponentiation results for multiple-photon effects in low-angle Bhabha scattering at the SLAC Linear Collider and the CERN e+e- collider LEP

    International Nuclear Information System (INIS)

    Jadach, S.; Richter-Was, E.; Ward, B.F.L.; Was, Z.

    1991-01-01

    Starting from an earlier benchmark analytical calculation of the luminosity process e+e- → e+e-(γ) at the SLAC Linear Collider (SLC) and the CERN e+e- collider LEP, we use the methods of Yennie, Frautschi, and Suura to develop an analytical improved naive exponentiated formula for this process. The formula is compared to our multiple-photon Monte Carlo event generator BHLUMI (1.13) for the same process. We find agreement on the overall cross-section normalization between the exponentiated formula and BHLUMI below the 0.2% level. In this way, we obtain an important cross-check on the normalization of our higher-order results in BHLUMI, and we arrive at formulas which represent the LEP/SLC luminosity process in the below-1% Z0 physics tests of the SU(2)_L x U(1) theory, in complete analogy with the famous high-precision Z0 line-shape formulas for the e+e- → mu+mu- process discussed by Berends et al., for example

  17. Semi-analytic calculations for the impact parameter dependence of electromagnetic multi-lepton pair production

    International Nuclear Information System (INIS)

    Gueclue, M.C.

    2000-01-01

    We provide a new general semi-analytic derivation of the impact parameter dependence of lowest order electromagnetic lepton-pair production in relativistic heavy-ion collisions. By using this result we have also calculated the related analytic multiple-pair production in the two-photon external-field model. We have compared our results with the equivalent-photon approximation and other calculations

  18. Linear systems and multiplicity of ideals

    International Nuclear Information System (INIS)

    Le Dung Trang

    2008-06-01

    Using a geometric interpretation of the multiplicity, we give a geometric way to calculate the multiplicity. We consider the particular case of a non-singular complex surface and give an example with a geometric proof of a result. Most of this note is written in the language of complex analytic spaces, but the results can be stated and proved in the case of schemes of finite type over an infinite field with equi-characteristic local rings

  19. Big data and high-performance analytics in structural health monitoring for bridge management

    Science.gov (United States)

    Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed

    2016-04-01

    Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets correlated with functional relatedness to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of data sets is made possible by four technologies: cloud computing, relational database processing, support from NoSQL databases, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. The framework enables computation of reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates costs and consequences of poor bridge performance at the span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics enable insights that assist bridge owners in addressing problems faster.
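
    The reliability indices mentioned above can be computed in many ways; as a minimal, textbook sketch (not the authors' framework), the first-order Cornell index for a single component with independent, normally distributed resistance and load effect is shown below with hypothetical values:

```python
import math

def cornell_reliability_index(mu_r, sigma_r, mu_s, sigma_s):
    """First-order (Cornell) reliability index for independent, normally
    distributed resistance R and load effect S; failure probability is Phi(-beta)."""
    return (mu_r - mu_s) / math.sqrt(sigma_r**2 + sigma_s**2)

# Hypothetical bridge-span example: moment capacity vs. flood-induced demand
beta = cornell_reliability_index(mu_r=1200.0, sigma_r=120.0,   # kN*m
                                 mu_s=800.0,  sigma_s=160.0)
pf = 0.5 * math.erfc(beta / math.sqrt(2.0))                    # Phi(-beta)
print(f"beta = {beta:.2f}, Pf ~ {pf:.1e}")
```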

  20. A novel ionic liquid-tolerant Fusarium oxysporum BN secreting ionic liquid-stable cellulase: consolidated bioprocessing of pretreated lignocellulose containing residual ionic liquid.

    Science.gov (United States)

    Xu, Jiaxing; Wang, Xinfeng; Hu, Lei; Xia, Jun; Wu, Zhen; Xu, Ning; Dai, Benlin; Wu, Bin

    2015-04-01

    In this study, microbial communities from chemically polluted microhabitats were cultured with the addition of an imidazolium-based ionic liquid (IL) to enrich for IL-tolerant microbes. A strain of Fusarium oxysporum BN producing cellulase from these enrichments was capable of growing in 10% (w/v) 1-ethyl-3-methylimidazolium phosphinate, a concentration much higher than the normal IL concentrations in lignocellulose regenerated from ILs. Cellulase secreted by the strain showed high resistance to ILs based on phosphate and sulfate radicals, evidencing high conformational stability in relevant media. Gratifyingly, F. oxysporum BN can directly convert IL-pretreated rice straw to bioethanol via consolidated bioprocessing (I-CBP). Under optimal fermentation conditions, a maximum ethanol yield of 0.125 g ethanol per g of rice straw was finally obtained, corresponding to 64.2% of the theoretical yield. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. A systematic review of factors associated with accidental falls in people with multiple sclerosis: a meta-analytic approach.

    Science.gov (United States)

    Giannì, Costanza; Prosperini, Luca; Jonsdottir, Johanna; Cattaneo, Davide

    2014-07-01

    To determine whether there are demographic, clinical, and instrumental variables useful to detect fall status of patients with multiple sclerosis. PubMed and the Cochrane Library. Eligible studies were identified by two independent investigators. Only studies having a clear distinction between fallers and non-fallers were included and meta-analysed. Odds ratios (ORs) and standard mean differences (SMDs) were calculated and pooled using fixed effect models. Among 115 screened articles, 15 fulfilled criteria for meta-analyses, with a total of 2425 patients included. Proportion of fallers may vary from 30% to 63% in a time frame from 1 to 12 months. No significant publication bias was found, even though 12/15 studies relied on retrospective reports of falls, thus introducing recall biases. Risk factors for falls varied across studies, owing to heterogeneity of populations included and clinical instruments used. The meta-analytic approach found that, compared with non-fallers, fallers had longer disease duration (SMD = 0.14, p = 0.02), progressive course of disease (OR = 2.02, p < 0.0001), assistive device for walking (OR = 3.16, p < 0.0001), greater overall disability level (SMD = 0.74, p < 0.0001), slower walking speed (SMD = 0.45, p = 0.0005), and worse performances in balance tests (Berg Balance Scale: SMD = -0.48, p = 0.002; Timed up-and-go test, SMD = 0.31, p = 0.04), and force-platform measures (postural sway) with eyes opened (SMD = 0.71, p = 0.006) and closed (SMD = 0.83, p = 0.01), respectively. Elucidations regarding risk factors for accidental falls in patients with multiple sclerosis (PwMs) are provided here, with worse disability score, progressive course, use of walking aid, and poorer performances in static and dynamic balance tests strongly associated with fall status. © The Author(s) 2014.

  2. Bio-processing of Agro-industrial Wastes for Production of Food-grade Enzymes: Progress and Prospects

    Directory of Open Access Journals (Sweden)

    Parmjit S Panesar

    2016-10-01

    Full Text Available Background and Objectives: In the era of global industrialization, enzymes are being used extensively in various sectors including food processing. Owing to the high price of enzymes, various initiatives have been undertaken by the R&D sector for the development of new processes or improvement of existing processes for the production of cost-effective enzymes. With advances in the field of biotechnology, different bioprocesses are being used to utilize different agro-industrial residues for the production of various enzymes. This review focuses on different types of agro-industrial wastes and their utilization in the production of enzymes. The present scenario as well as the future scope of utilization of enzymes in the food industry is also discussed. Results and Conclusion: Regulations from governmental and environmental agencies demanding a cleaner environment have led to advances in various technologies for utilizing these wastes for the production of value-added products such as enzymes. Among the different types of fermentation, most work has been carried out under solid-state conditions by batch fermentation. The research has indicated the significant potential of agro-industrial wastes for production of food-grade enzymes in order to improve the economics of the process. Conflict of interests: The authors declare no conflict of interest.

  3. Allogeneic cell therapy bioprocess economics and optimization: single-use cell expansion technologies.

    Science.gov (United States)

    Simaria, Ana S; Hassan, Sally; Varadaraju, Hemanthram; Rowley, Jon; Warren, Kim; Vanek, Philip; Farid, Suzanne S

    2014-01-01

    For allogeneic cell therapies to reach their therapeutic potential, challenges related to achieving scalable and robust manufacturing processes will need to be addressed. A particular challenge is producing lot-sizes capable of meeting commercial demands of up to 10⁹ cells/dose for large patient numbers due to the current limitations of expansion technologies. This article describes the application of a decisional tool to identify the most cost-effective expansion technologies for different scales of production as well as current gaps in the technology capabilities for allogeneic cell therapy manufacture. The tool integrates bioprocess economics with optimization to assess the economic competitiveness of planar and microcarrier-based cell expansion technologies. Visualization methods were used to identify the production scales where planar technologies will cease to be cost-effective and where microcarrier-based bioreactors become the only option. The tool outputs also predict that for the industry to be sustainable for high demand scenarios, significant increases will likely be needed in the performance capabilities of microcarrier-based systems. These data are presented using a technology S-curve as well as windows of operation to identify the combination of cell productivities and scale of single-use bioreactors required to meet future lot sizes. The modeling insights can be used to identify where future R&D investment should be focused to improve the performance of the most promising technologies so that they become a robust and scalable option that enables the cell therapy industry to reach commercially relevant lot sizes. The tool outputs can facilitate decision-making very early on in development and be used to predict, and better manage, the risk of process changes needed as products proceed through the development pathway. © 2013 Wiley Periodicals, Inc.

  4. An approximate analytical approach to resampling averages

    DEFF Research Database (Denmark)

    Malzahn, Dorthe; Opper, M.

    2004-01-01

    Using a novel reformulation, we develop a framework to compute approximate resampling data averages analytically. The method avoids multiple retraining of statistical models on the samples. Our approach uses a combination of the replica "trick" of statistical physics and the TAP approach for approximate Bayesian inference. We demonstrate our approach on regression with Gaussian processes. A comparison with averages obtained by Monte-Carlo sampling shows that our method achieves good accuracy.
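
    For context, the resampling averages in question are the quantities one would otherwise estimate by brute-force bootstrap, retraining the model on every resample. The Monte-Carlo baseline referred to above can be sketched as follows in Python; the data and the deliberately simple model are chosen purely for illustration and are not the Gaussian-process setting of the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative data set; the paper's setting (Gaussian-process regression)
        # is richer, this only shows the brute-force resampling average it avoids.
        x = rng.normal(size=50)
        y = 2.0 * x + rng.normal(scale=0.5, size=50)

        def fit_slope(xs, ys):
            # Least-squares slope of y on x: the "model retrained" on one resample.
            return np.sum(xs * ys) / np.sum(xs * xs)

        slopes = []
        for _ in range(2000):
            idx = rng.integers(0, len(x), size=len(x))   # resample with replacement
            slopes.append(fit_slope(x[idx], y[idx]))

        print("bootstrap average of the slope:", np.mean(slopes))
        print("bootstrap spread of the slope:", np.std(slopes))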

  5. PROVIDING PLANT DATA ANALYTICS THROUGH A SEAMLESS DIGITAL ENVIRONMENT

    Energy Technology Data Exchange (ETDEWEB)

    Bly, Aaron; Oxstrand, Johanna

    2017-06-01

    As technology continues to evolve and become more integrated into a worker’s daily routine in the Nuclear Power industry, the need for easy access to data becomes a priority. Not only does the need for data increase, but the amount of data collected increases as well. In most cases the data is collected and stored in various software applications, many of which are legacy systems, which do not offer any other option to access the data except through the application’s user interface. Furthermore, the data gets grouped in “silos” according to work function and not necessarily by subject. Hence, in order to access all the information needed for a particular task or analysis, one may have to access multiple applications to gather all the data needed. The industry and the research community have identified the need for a digital architecture and, more importantly, the need for a Seamless Digital Environment (SDE). An SDE provides a means to access multiple applications, gather the data points needed, conduct the analysis requested, and present the result to the user with minimal or no effort by the user. In addition, the nuclear utilities have identified the need for research focused on data analytics. The effort should develop and evaluate use cases for data mining and analytics that employ information from plant sensors and databases in order to develop improved business analytics. Idaho National Laboratory is leading such an effort, which is conducted in close collaboration with vendors, nuclear utilities, the Institute of Nuclear Power Operations, and the Electric Power Research Institute. The goal of the study is to research potential approaches to building an analytics solution for equipment reliability, on a small scale, focusing on either a single piece of equipment or a single system. The analytics solution will likely consist of a data integration layer, a predictive and machine learning layer, and a user interface layer that will display the output of the analysis in a straight

  6. Visual Analytics for Heterogeneous Geoscience Data

    Science.gov (United States)

    Pan, Y.; Yu, L.; Zhu, F.; Rilee, M. L.; Kuo, K. S.; Jiang, H.; Yu, H.

    2017-12-01

    Geoscience data obtained from diverse sources have been routinely leveraged by scientists to study various phenomena. The principal data sources include observations and model simulation outputs. These data are characterized by spatiotemporal heterogeneity originating from different instrument design specifications and/or computational model requirements used in data generation processes. Such inherent heterogeneity poses several challenges in exploring and analyzing geoscience data. First, scientists often wish to identify features or patterns co-located among multiple data sources to derive and validate certain hypotheses. Heterogeneous data make it a tedious task to search such features in dissimilar datasets. Second, features of geoscience data are typically multivariate. It is challenging to tackle the high dimensionality of geoscience data and explore the relations among multiple variables in a scalable fashion. Third, there is a lack of transparency in traditional automated approaches, such as feature detection or clustering, in that scientists cannot intuitively interact with their analysis processes and interpret results. To address these issues, we present a new scalable approach that can assist scientists in analyzing voluminous and diverse geoscience data. We expose a high-level query interface that allows users to easily express their customized queries to search features of interest across multiple heterogeneous datasets. For identified features, we develop a visualization interface that enables interactive exploration and analytics in a linked-view manner. Specific visualization techniques, ranging from scatter plots to parallel coordinates, are employed in each view to allow users to explore various aspects of features. Different views are linked and refreshed according to user interactions in any individual view. In such a manner, a user can interactively and iteratively gain understanding of the data through a variety of visual analytics operations. We

  7. SPANDOM - source projection analytic nodal discrete ordinates method

    International Nuclear Information System (INIS)

    Kim, Tae Hyeong; Cho, Nam Zin

    1994-01-01

    We describe a new discrete ordinates nodal method for the two-dimensional transport equation. We solve the discrete ordinates equation analytically after the source term is projected and represented in polynomials. The method is applied to two fast reactor benchmark problems and compared with the TWOHEX code. The results indicate that the present method accurately predicts not only the multiplication factor but also the flux distribution

  8. MULTIPLE PERSONALITY: CASE REPORT STUDY

    Directory of Open Access Journals (Sweden)

    Miloš Židanik

    2004-07-01

    Full Text Available Background. Multiple personality disorder is characterised by split individual ego-states and a split professional community arguing whether this disorder actually exists or not. Methods. In this case report study, a supportive psychodynamic psychotherapy of a patient with multiple personality disorder is presented, which lasted for 4.5 years and resulted in ego-reintegration. Conclusions. The splitting between different ego-states is powered by unneutralised aggression with the possibility of hetero- and autoaggressive behaviour. Therefore, the patient in the analytically oriented psychotherapeutic process is at high risk and a safe therapeutic (e.g. in-patient) setting has to be provided.

  9. Bioprocess design guided by in situ substrate supply and product removal: process intensification for synthesis of (S)-1-(2-chlorophenyl)ethanol.

    Science.gov (United States)

    Schmölzer, Katharina; Mädje, Katharina; Nidetzky, Bernd; Kratzer, Regina

    2012-03-01

    We report herein on bioprocess development guided by the hydrophobicities of substrate and product. Bioreductions of o-chloroacetophenone are severely limited by instability of the catalyst in the presence of aromatic substrate and (S)-1-(2-chlorophenyl)ethanol. In situ substrate supply and product removal was used to protect the utilized Escherichia coli whole cell catalyst based on Candida tenuis xylose reductase during the reaction. Further engineering at the levels of the catalyst and the reaction media was matched to low substrate concentrations in the aqueous phase. Productivities obtained in aqueous batch reductions were 21-fold improved by addition of 20% (v/v) hexane, NAD(+), expression engineering, cell permeabilization and pH optimization. Reduction of 300 mM substrate was accomplished in 97% yield and use of the co-solvent hexane in subsequent extraction steps led to 88% recovery. Product loss due to high catalyst loading was minimized by using the same extractant in bioreduction and product isolation. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. Multiple photon resonances

    International Nuclear Information System (INIS)

    Elliott, C.J.; Feldman, B.J.

    1979-02-01

    A detailed theoretical analysis is presented of the interaction of intense near-resonant monochromatic radiation with an N-level anharmonic oscillator. In particular, the phenomenon of multiple photon resonance, the process by which an N-level system resonantly absorbs two or more photons simultaneously, is investigated. Starting from the Schroedinger equation, diagrammatic techniques are developed that allow the resonant process to be analyzed quantitatively, in analogy with well-known two-level coherent phenomena. In addition, multiple photon Stark shifts of the resonances, shifts absent in two-level theory, are obtained from the diagrams. Insights into the nature of multiple photon resonances are gained by comparing the quantum mechanical system with classical coupled pendulums whose equations of motion possess identical eigenvalues and eigenvectors. In certain limiting cases, including that of the resonantly excited N-level harmonic oscillator and that of the equally spaced N-level system with equal matrix elements, analytic results are derived. The influence of population relaxation and phase-disrupting collisions on the multiple photon process are also analyzed, the latter by extension of the diagrammatic technique to the density matrix equations of motion. 11 figures

  11. Analytic results for planar three-loop integrals for massive form factors

    Energy Technology Data Exchange (ETDEWEB)

    Henn, Johannes M. [PRISMA Cluster of Excellence, Johannes Gutenberg Universität Mainz,55099 Mainz (Germany); Kavli Institute for Theoretical Physics, UC Santa Barbara,Santa Barbara (United States); Smirnov, Alexander V. [Research Computing Center, Moscow State University,119992 Moscow (Russian Federation); Smirnov, Vladimir A. [Skobeltsyn Institute of Nuclear Physics of Moscow State University,119992 Moscow (Russian Federation); Institut für Theoretische Teilchenphysik, Karlsruhe Institute of Technology (KIT),76128 Karlsruhe (Germany)

    2016-12-28

    We use the method of differential equations to analytically evaluate all planar three-loop Feynman integrals relevant for form factor calculations involving massive particles. Our results for ninety master integrals at general q² are expressed in terms of multiple polylogarithms, and results for fifty-one master integrals at the threshold q² = 4m² are expressed in terms of multiple polylogarithms of argument one, with indices equal to zero or to a sixth root of unity.

  12. Enabling analytics on sensitive medical data with secure multi-party computation

    NARCIS (Netherlands)

    M. Veeningen (Meilof); S. Chatterjea (Supriyo); A.Z. Horváth (Anna Zsófia); G. Spindler (Gerald); E. Boersma (Eric); P. van der Spek (Peter); O. van der Galiën (Onno); J. Gutteling (Job); W. Kraaij (Wessel); P.J.M. Veugen (Thijs)

    2018-01-01

    textabstractWhile there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multiparty computation can enable such data

  13. Sample diagnosis using indicator elements and non-analyte signals for inductively coupled plasma mass spectrometry

    International Nuclear Information System (INIS)

    Antler, Margaret; Ying Hai; Burns, David H.; Salin, Eric D.

    2003-01-01

    A sample diagnosis procedure that uses both non-analyte and analyte signals to estimate matrix effects in inductively coupled plasma-mass spectrometry is presented. Non-analyte signals are those of background species in the plasma (e.g. N⁺, ArO⁺), and changes in these signals can indicate changes in plasma conditions. Matrix effects of Al, Ba, Cs, K and Na on 19 non-analyte signals and 15 element signals were monitored. Multiple linear regression was used to build the prediction models, using a genetic algorithm for objective feature selection. Analyte elemental signals and non-analyte signals were compared for diagnosing matrix effects, and both were found to be suitable for estimating matrix effects. Individual analyte matrix effect estimation was compared with the overall matrix effect prediction, and models used to diagnose overall matrix effects were more accurate than individual analyte models. In previous work [Spectrochim. Acta Part B 57 (2002) 277], we tested models for analytical decision making. The current models were tested in the same way, and were able to successfully diagnose matrix effects with at least an 80% success rate.
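
    As a rough illustration of the modelling step described above, the sketch below fits an ordinary least-squares model that predicts a matrix-effect factor from a few plasma background signals. The signal names and all numbers are invented placeholders, and the genetic-algorithm feature selection is not shown.

        import numpy as np

        rng = np.random.default_rng(1)

        # Invented training data: rows are samples, columns are background-signal
        # intensities (placeholder stand-ins for species such as N+ or ArO+).
        X = rng.normal(loc=[1.0, 0.8, 0.5], scale=0.1, size=(30, 3))
        # Invented "true" matrix-effect factor for each sample (1.0 = no suppression).
        y = 1.0 - 0.4 * (X[:, 0] - 1.0) - 0.2 * (X[:, 1] - 0.8) + rng.normal(scale=0.01, size=30)

        # Ordinary least-squares fit with an intercept column.
        A = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)

        # Predict the matrix effect of a new sample from its background signals alone.
        new_sample = np.array([1.0, 1.05, 0.82, 0.49])   # leading 1.0 is the intercept term
        print("predicted matrix-effect factor:", new_sample @ coef)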

  14. Two-dimensional analytical solution for nodal calculation of nuclear reactors

    International Nuclear Information System (INIS)

    Silva, Adilson C.; Pessoa, Paulo O.; Silva, Fernando C.; Martinez, Aquilino S.

    2017-01-01

    Highlights: • A proposal for a coarse mesh nodal method is presented. • The proposal uses the analytical solution of the two-dimensional neutron diffusion equation. • The solution is performed on homogeneous nodes with dimensions of the fuel assembly. • The solution uses four average fluxes on the node surfaces as boundary conditions. • The results show good accuracy and efficiency. - Abstract: In this paper, the two-dimensional (2D) neutron diffusion equation is analytically solved for two energy groups (2G). The spatial domain of the reactor core is divided into a set of nodes with uniform nuclear parameters. To determine iteratively the multiplication factor and the neutron flux in the reactor, we combine the analytical solution of the neutron diffusion equation with an iterative method known as the power method. The analytical solution is obtained for the different types of regions that compose the reactor, such as the fuel and reflector regions. Four average fluxes on the node surfaces are used as boundary conditions for the analytical solution. Discontinuity factors on the node surfaces derived from the homogenization process are applied to maintain average reaction rates and the net current in the fuel assembly (FA). To validate the results obtained by the analytical solution, a relative power density distribution in the FAs is determined from the neutron flux distribution and compared with the reference values. The results show good accuracy and efficiency.
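
    At the heart of the iterative scheme mentioned above is the classical power iteration for the multiplication factor. A minimal matrix version is sketched below; the loss and fission operators are reduced to small invented matrices rather than the actual two-group nodal operators, so the numbers are illustrative only.

        import numpy as np

        # Toy operators standing in for the nodal problem: M is the loss
        # (leakage + absorption) operator, F the fission production operator.
        M = np.array([[0.12, -0.02],
                      [-0.02, 0.10]])
        F = np.diag([0.11, 0.09])

        phi = np.ones(2)   # initial flux guess
        k = 1.0            # initial multiplication factor guess

        for _ in range(200):
            source = F @ phi / k
            phi_new = np.linalg.solve(M, source)               # "diffusion solve" for the new flux
            k_new = k * np.sum(F @ phi_new) / np.sum(F @ phi)  # update the multiplication factor
            converged = abs(k_new - k) < 1e-8
            k, phi = k_new, phi_new / np.linalg.norm(phi_new)
            if converged:
                break

        print("multiplication factor k ≈", round(k, 5))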

  15. PROGRESSIVE DATA ANALYTICS IN HEALTH INFORMATICS USING AMAZON ELASTIC MAPREDUCE (EMR

    Directory of Open Access Journals (Sweden)

    J S Shyam Mohan

    2016-04-01

    Full Text Available Identifying, diagnosing and treating cancer involves a thorough investigation that relies on collecting data (big data) from multiple and diverse sources, which is helpful for making effective and quick decisions. Similarly, data analytics is used to find remedial actions for newly arriving diseases spread across multiple warehouses. Analytics can be performed on collected or available data from various data clusters that contain pieces of data. We provide a framework that enables effective decision making using Amazon EMR. Through various experiments done on different biological datasets, we reveal the advantages of the proposed model and present numerical results. These results indicate that the proposed framework can efficiently perform analytics over any biological dataset and obtain results in optimal time, thereby maintaining the quality of the result.

  16. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  17. GenoSets: visual analytic methods for comparative genomics.

    Directory of Open Access Journals (Sweden)

    Aurora A Cain

    Full Text Available Many important questions in biology are, fundamentally, comparative, and this extends to our analysis of a growing number of sequenced genomes. Existing genomic analysis tools are often organized around literal views of genomes as linear strings. Even when information is highly condensed, these views grow cumbersome as larger numbers of genomes are added. Data aggregation and summarization methods from the field of visual analytics can provide abstracted comparative views, suitable for sifting large multi-genome datasets to identify critical similarities and differences. We introduce a software system for visual analysis of comparative genomics data. The system automates the process of data integration, and provides the analysis platform to identify and explore features of interest within these large datasets. GenoSets borrows techniques from business intelligence and visual analytics to provide a rich interface of interactive visualizations supported by a multi-dimensional data warehouse. In GenoSets, visual analytic approaches are used to enable querying based on orthology, functional assignment, and taxonomic or user-defined groupings of genomes. GenoSets links this information together with coordinated, interactive visualizations for both detailed and high-level categorical analysis of summarized data. GenoSets has been designed to simplify the exploration of multiple genome datasets and to facilitate reasoning about genomic comparisons. Case examples are included showing the use of this system in the analysis of 12 Brucella genomes. GenoSets software and the case study dataset are freely available at http://genosets.uncc.edu. We demonstrate that the integration of genomic data using a coordinated multiple view approach can simplify the exploration of large comparative genomic data sets, and facilitate reasoning about comparisons and features of interest.

  18. Analyzing the Heterogeneous Hierarchy of Cultural Heritage Materials: Analytical Imaging.

    Science.gov (United States)

    Trentelman, Karen

    2017-06-12

    Objects of cultural heritage significance are created using a wide variety of materials, or mixtures of materials, and often exhibit heterogeneity on multiple length scales. The effective study of these complex constructions thus requires the use of a suite of complementary analytical technologies. Moreover, because of the importance and irreplaceability of most cultural heritage objects, researchers favor analytical techniques that can be employed noninvasively, i.e., without having to remove any material for analysis. As such, analytical imaging has emerged as an important approach for the study of cultural heritage. Imaging technologies commonly employed, from the macroscale through the micro- to nanoscale, are discussed with respect to how the information obtained helps us understand artists' materials and methods, the cultures in which the objects were created, how the objects may have changed over time, and importantly, how we may develop strategies for their preservation.

  19. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory.

    Science.gov (United States)

    Horowitz, Gary L; Zaman, Zahur; Blanckaert, Norbert J C; Chan, Daniel W; Dubois, Jeffrey A; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W; Nilsen, Olaug L; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality.

  20. Analytic formula of stopping power for high energy electrons in liquid media

    International Nuclear Information System (INIS)

    Scarlat, F.; Niculescu, V.I.R.

    1994-01-01

    This article is part of a series on the calculation of high energy electron dose using multiple scattering theory. In the current article we present an analytic formula obtained for the collision stopping power (S/ρ)_c and the radiative stopping power (S/ρ)_r for electrons with energies in the 1 MeV - 35 MeV range. For that purpose we used data given for electrons in water in NBS-IR-2550A. The analytical formulae approximate the data calculated by Berger and Seltzer to within 1-2%. (Author)

  1. BIG DATA ANALYTICS AND PRECISION ANIMAL AGRICULTURE SYMPOSIUM: Data to decisions.

    Science.gov (United States)

    White, B J; Amrine, D E; Larson, R L

    2018-04-14

    Big data are frequently used in many facets of business and agronomy to enhance knowledge needed to improve operational decisions. Livestock operations collect data of sufficient quantity to perform predictive analytics. Predictive analytics can be defined as a methodology and suite of data evaluation techniques to generate a prediction for specific target outcomes. The objective of this manuscript is to describe the process of using big data and the predictive analytic framework to create tools to drive decisions in livestock production, health, and welfare. The predictive analytic process involves selecting a target variable, managing the data, partitioning the data, then creating algorithms, refining algorithms, and finally comparing accuracy of the created classifiers. The partitioning of the datasets allows model building and refining to occur prior to testing the predictive accuracy of the model with naive data to evaluate overall accuracy. Many different classification algorithms are available for predictive use and testing multiple algorithms can lead to optimal results. Application of a systematic process for predictive analytics using data that is currently collected or that could be collected on livestock operations will facilitate precision animal management through enhanced livestock operational decisions.
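
    The partition-build-compare workflow outlined above can be illustrated with a short scikit-learn sketch; the data set, target variable, and the two algorithms below are placeholders chosen for brevity, not those evaluated in the symposium paper.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.linear_model import LogisticRegression
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import accuracy_score

        rng = np.random.default_rng(42)

        # Placeholder "operational" data: rows are animals, columns are covariates,
        # and the target is a binary health outcome. Real pipelines would draw
        # these from herd-management records.
        X = rng.normal(size=(500, 6))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=500) > 0).astype(int)

        # Partition: the training split is used to build and refine the classifiers;
        # the held-out split gives the naive-data estimate of predictive accuracy.
        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

        models = {
            "logistic regression": LogisticRegression(max_iter=1000),
            "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
        }
        for name, model in models.items():
            model.fit(X_train, y_train)
            print(name, "held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))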

  2. An analytical evaluation for the pressure drop characteristics of bottom nozzle flow holes

    International Nuclear Information System (INIS)

    Yang, S. G.; Kim, H. J.; Lim, H. T.; Park, E. J.; Jeon, K. L.

    2002-01-01

    An analytical evaluation of the bottom nozzle flow holes was performed to find the best design concept in terms of pressure drop. For this analysis, the Computational Fluid Dynamics (CFD) code FLUENT 5.5 was selected as the analytical evaluation tool. The applicability of the CFD code was verified by a benchmarking study against Vibration Investigation of Small-scale Test Assemblies (VISTA) test data for several flow conditions and a typical flow hole shape. From this verification, the analytical data agreed with the VISTA test data to within roughly 17%, and the overall trends under various flow conditions were very similar in both cases. Based on the evaluated results using the CFD code, it is concluded that the deburring and multiple-chamfer hole features at the leading edge are excellent design concepts for decreasing the pressure drop across the bottom nozzle plate. The deburring and multiple-chamfer hole features at the leading edge of the bottom nozzle plate provide 12% and 17% pressure drop benefits, respectively, relative to a single-chamfer hole feature. These design features are meaningful and applicable as a low-pressure-drop design concept for the bottom nozzle of a Pressurized Water Reactor (PWR) fuel assembly

  3. Perturbative analysis of multiple-field cosmological inflation

    International Nuclear Information System (INIS)

    Lahiri, Joydev; Bhattacharya, Gautam

    2006-01-01

    We develop a general formalism for analyzing linear perturbations in multiple-field cosmological inflation based on the gauge-ready approach. Our inflationary model consists of an arbitrary number of scalar fields with non-minimal kinetic terms. We solve the equations for scalar- and tensor-type perturbations during inflation to the first order in slow roll, and then obtain the super-horizon solutions for adiabatic and isocurvature perturbations after inflation. Analytic expressions for power-spectra and spectral indices arising from multiple-field inflation are presented

  4. Metabolite profiling of CHO cells: Molecular reflections of bioprocessing effectiveness

    NARCIS (Netherlands)

    Sellick, C.A.; Croxford, A.S.; Maqsood, A.R.; Stephens, G.M.; Westerhoff, H.V.; Goodacre, R.; Dickson, A.J.

    2015-01-01

    Whilst development of medium and feeds has provided major advances in recombinant protein production in CHO cells, the fundamental understanding is limited. We have applied metabolite profiling with established robust (GC-MS) analytics to define the molecular loci by which two yield-enhancing feeds

  5. An interlaboratory transfer of a multi-analyte assay between continents.

    Science.gov (United States)

    Georgiou, Alexandra; Dong, Kelly; Hughes, Stephen; Barfield, Matthew

    2015-01-01

    Alex has worked at GlaxoSmithKline for the past 15 years and currently works within the bioanalytical and toxicokinetic group in the United Kingdom. Alex's role in previous years has been the in-house support of preclinical and clinical bioanalysis, from method development through to sample analysis activities as well as acting as PI for GLP bioanalysis and toxicokinetics. For the past two years, Alex has applied this analytical and regulatory experience to focus on the outsourcing of preclinical bioanalysis, toxicokinetics and clinical bioanalysis, working closely with multiple bioanalytical and in-life CRO partners worldwide. Alex works to support DMPK and Safety Assessment outsourcing activities for GSK across multiple therapeutic areas, from the first GLP study through to late stage clinical PK studies. Transfer and cross-validation of an existing analytical assay between a laboratory providing current analytical support, and a laboratory needed for new or additional support, can present the bioanalyst with numerous challenges. These challenges can be technical or logistical in nature and may prove to be significant when transferring an assay between laboratories in different continents. Part of GlaxoSmithKline's strategy to improve confidence in providing quality data, is to cross-validate between laboratories. If the cross-validation fails predefined acceptance criteria, then a subsequent investigation would follow. This may also prove to be challenging. The importance of thorough planning and good communication throughout assay transfer, cross-validation and any subsequent investigations is illustrated in this case study.

  6. Anthropology and Multiple Modernities

    DEFF Research Database (Denmark)

    Thomassen, Bjørn

    “modernities” over the last 10 years, this paper wishes to address the analytical usefulness of this conceptual development. What is it about these concepts that makes them useful as we try to capture the World today? Rather than providing any substantial definitions as to what those modernities are about (or what they are not about), anthropologists have used ethnographies to demonstrate how modernities are lived and constructed differently in different cultural contexts. To a very large extent, anthropologists intend these multiple modernities to refer to the interplay between local and global configurations. However, if the current pluralizing of modernity ultimately serves to describe the variety of cultural forms that co-exist in the World today, the analytical value of the concept risks being watered down, and little is gained in perspective. Arguably, other concepts would have served the purpose...

  7. An improved multiple flame photometric detector for gas chromatography.

    Science.gov (United States)

    Clark, Adrian G; Thurbide, Kevin B

    2015-11-20

    An improved multiple flame photometric detector (mFPD) is introduced, based upon interconnecting fluidic channels within a planar stainless steel (SS) plate. Relative to the previous quartz tube mFPD prototype, the SS mFPD provides a 50% reduction in background emission levels, an orthogonal analytical flame, and easier, more sensitive operation. As a result, sulfur response in the SS mFPD spans 4 orders of magnitude, yields a minimum detectable limit near 9×10⁻¹² g S/s, and has a selectivity approaching 10⁴ over carbon. The device also exhibits exceptionally large resistance to hydrocarbon response quenching. Additionally, the SS mFPD uniquely allows analyte emission monitoring in the multiple worker flames for the first time. The findings suggest that this mode can potentially further improve upon the analytical flame response of sulfur (both linear HSO and quadratic S₂) and also phosphorus. Of note, the latter is nearly 20-fold stronger in S/N in the collective worker flames response and provides 6 orders of linearity with a detection limit of about 2.0×10⁻¹³ g P/s. Overall, the results indicate that this new SS design notably improves the analytical performance of the mFPD and can provide a versatile and beneficial monitoring tool for gas chromatography. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Analytical Chemistry Core Capability Assessment - Preliminary Report

    International Nuclear Information System (INIS)

    Barr, Mary E.; Farish, Thomas J.

    2012-01-01

    The concept of 'core capability' can be a nebulous one. Even at a fairly specific level, where core capability equals maintaining essential services, it is highly dependent upon the perspective of the requestor. Samples are submitted to analytical services because the requesters do not have the capability to conduct adequate analyses themselves. Some requests are for general chemical information in support of R and D, process control, or process improvement. Many analyses, however, are part of a product certification package and must comply with higher-level customer quality assurance requirements. So which services are essential to that customer - just those for product certification? Does the customer also (indirectly) need services that support process control and improvement? And what is the timeframe? Capability is often expressed in terms of the currently utilized procedures, and most programmatic customers can only plan a few years out, at best. But should core capability consider the long term where new technologies, aging facilities, and personnel replacements must be considered? These questions, and a multitude of others, explain why attempts to gain long-term consensus on the definition of core capability have consistently failed. This preliminary report will not try to define core capability for any specific program or set of programs. Instead, it will try to address the underlying concerns that drive the desire to determine core capability. Essentially, programmatic customers want to be able to call upon analytical chemistry services to provide all the assays they need, and they don't want to pay for analytical chemistry services they don't currently use (or use infrequently). This report will focus on explaining how the current analytical capabilities and methods evolved to serve a variety of needs with a focus on why some analytes have multiple analytical techniques, and what determines the infrastructure for these analyses. This information will be

  9. Single and multiple transverse fracture initiation from horizontal wells

    Energy Technology Data Exchange (ETDEWEB)

    Crosby, D.G.; Rahman, M.M.; Rahman, M.K.; Rahman, S.S. [School of Petroleum Engineering, The University of New South Wales, 2052 Sydney (Australia)

    2002-08-01

    The results of an analytical and experimental study of the initiation of transverse fractures from horizontal wells are presented. Analytical criteria for the initiation of a single hydraulic fracture are reviewed, and a criterion for the initiation of multiple hydraulic fractures was developed by modification of the existing Drucker and Prager criterion for single hydraulic fracture initiation. The developed criterion for multiple fracture initiation was validated by comparisons with actual hydraulic fracture initiation pressures, which were obtained from scaled laboratory experiments and numerical results from boundary element analysis. Other criteria are assessed against the experimental results. Experimentally obtained transverse fracture initiation pressures were found to be close to longitudinal fracture initiation pressures estimated from the maximum tensile stress criterion and the Hoek and Brown criterion. One possible explanation of this finding is presented. Results from Drucker and Prager criteria for single and multiple fracture initiation were, however, found to be closer to the experimental values. Therefore, these criteria could be useful to engineers involved with hydraulic fracturing for predicting transverse fracture initiation pressures from horizontal wells drilled parallel to the minimum horizontal in-situ stress.

  10. The Earth Data Analytic Services (EDAS) Framework

    Science.gov (United States)

    Maxwell, T. P.; Duffy, D.

    2017-12-01

    Faced with unprecedented growth in earth data volume and demand, NASA has developed the Earth Data Analytic Services (EDAS) framework, a high performance big data analytics framework built on Apache Spark. This framework enables scientists to execute data processing workflows combining common analysis operations close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted earth data analysis tools (ESMF, CDAT, NCO, etc.). EDAS utilizes a dynamic caching architecture, a custom distributed array framework, and a streaming parallel in-memory workflow for efficiently processing huge datasets within limited memory spaces with interactive response times. EDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. New analytic operations can be developed in Python, Java, or Scala (with support for other languages planned). Client packages in Python, Java/Scala, or JavaScript contain everything needed to build and submit EDAS requests. The EDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service enables decision makers to compare multiple reanalysis datasets and investigate trends, variability, and anomalies in earth system dynamics around the globe.

  11. High-performance multiple-reflection time-of-flight mass spectrometers for research with exotic nuclei and for analytical mass spectrometry

    Science.gov (United States)

    Plaß, Wolfgang R.; Dickel, Timo; Ayet San Andres, Samuel; Ebert, Jens; Greiner, Florian; Hornung, Christine; Jesch, Christian; Lang, Johannes; Lippert, Wayne; Majoros, Tamas; Short, Devin; Geissel, Hans; Haettner, Emma; Reiter, Moritz P.; Rink, Ann-Kathrin; Scheidenberger, Christoph; Yavor, Mikhail I.

    2015-11-01

    A class of multiple-reflection time-of-flight mass spectrometers (MR-TOF-MSs) has been developed for research with exotic nuclei at present and future accelerator facilities such as GSI and FAIR (Darmstadt), and TRIUMF (Vancouver). They can perform highly accurate mass measurements of exotic nuclei, serve as high-resolution, high-capacity mass separators and be employed as diagnostics devices to monitor the production, separation and manipulation of beams of exotic nuclei. In addition, a mobile high-resolution MR-TOF-MS has been developed for in situ applications in analytical mass spectrometry ranging from environmental research to medicine. Recently, the MR-TOF-MS for GSI and FAIR has been further developed. A novel RF quadrupole-based ion beam switchyard has been developed that allows merging and splitting of ion beams as well as transport of ions into different directions. It efficiently connects a test and reference ion source and an auxiliary detector to the system. Due to an increase in the kinetic energy of the ions in the time-of-flight analyzer of the MR-TOF-MS, a given mass resolving power is now achieved in less than half the time-of-flight. Conversely, depending on the time-of-flight, the mass resolving power has been increased by a factor of more than two.

  12. Analytic investigation of extended Heitler-Matthews model

    Energy Technology Data Exchange (ETDEWEB)

    Grimm, Stefan; Veberic, Darko; Engel, Ralph [KIT, IKP (Germany)

    2016-07-01

    Many features of extensive air showers are qualitatively well described by the Heitler cascade model and its extensions. The core of a shower is given by hadrons that interact with air nuclei. After each interaction some of these hadrons decay and feed the electromagnetic shower component. The most important parameters of such hadronic interactions are inelasticity, multiplicity, and the ratio of charged vs. neutral particles. However, in analytic considerations, approximations are needed to include the characteristics of hadron production. We discuss extensions of the simple cascade model to an analytic description of air showers that also includes the elasticity, and derive the number of produced muons. In a second step, we apply this model to calculate the dependence of the shower center of gravity on model parameters. The depth of the center of gravity is closely related to that of the shower maximum, which is a commonly-used composition-sensitive observable.
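
    For orientation, the muon number in the unmodified Heitler-Matthews picture is usually quoted in the form below, where E_0 is the primary energy, ε_c^π the pion critical energy, and n_ch, n_tot the charged and total multiplicities per interaction. This is the standard textbook result, not the elasticity-extended expression derived in the contribution above, and the numerical range of β is only the commonly quoted one.

        N_\mu \simeq \left( \frac{E_0}{\epsilon_c^{\pi}} \right)^{\beta},
        \qquad
        \beta = \frac{\ln n_{\mathrm{ch}}}{\ln n_{\mathrm{tot}}} \approx 0.85\text{--}0.95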

  13. Analytical chemistry instrumentation

    International Nuclear Information System (INIS)

    Laing, W.R.

    1986-01-01

    In nine sections, 48 chapters cover: 1) analytical chemistry and the environment; 2) environmental radiochemistry; 3) automated instrumentation; 4) advances in analytical mass spectrometry; 5) Fourier transform spectroscopy; 6) analytical chemistry of plutonium; 7) nuclear analytical chemistry; 8) chemometrics; and 9) nuclear fuel technology.

  14. Jet multiplicity distributions: medium dependence in MLLA

    International Nuclear Information System (INIS)

    Armesto, Nestor; Pajares, Carlos; Quiroga-Arias, Paloma

    2009-01-01

    We study the medium dependence of the multiplicity distributions in the modified leading logarithmic approximation. We focus on the enhancement in the number of branchings as the partons travel through a dense medium created in a heavy-ion collision. We study the effect of a higher number of splittings on some jet observables by introducing the medium as a constant (f_med) in the splitting functions. Taking as our ansatz for the gluon and quark jet mean multiplicities ⟨n_G⟩ = e^(γy) and ⟨n_Q⟩ = r⁻¹e^(γy), we study in an analytic approach the dependence on the medium (f_med) of the anomalous dimension (γ), the multiplicity ratio (r), and thus the mean multiplicities. We also obtain the higher-order moments of the multiplicity distribution, which allows us to study its dispersion. (orig.)

  15. Learning Analytics to Support Teachers During Synchronous CSCL: Balancing Between Overview and Overload

    NARCIS (Netherlands)

    van Leeuwen, A.

    2015-01-01

    Learning analytics (LA) are summaries, visualizations, and analyses of student data that could improve learning in multiple ways, for example by supporting teachers. However, not much research is available yet concerning how LA may support teachers to diagnose student progress and to intervene

  16. Collaborative Web-Enabled GeoAnalytics Applied to OECD Regional Data

    Science.gov (United States)

    Jern, Mikael

    Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on developing collaborative geovisual analytics (GeoAnalytics). In this paper, tools are introduced that help establish progress initiatives at international and sub-national levels aimed at measuring and collaborating, through statistical indicators, economic, social and environmental developments and to engage both statisticians and the public in such activities. Given this global dimension of such a task, the “dream” of building a repository of progress indicators, where experts and public users can use GeoAnalytics collaborative tools to compare situations for two or more countries, regions or local communities, could be accomplished. While the benefits of GeoAnalytics tools are many, it remains a challenge to adapt these dynamic visual tools to the Internet. For example, dynamic web-enabled animation that enables statisticians to explore temporal, spatial and multivariate demographics data from multiple perspectives, discover interesting relationships, share their incremental discoveries with colleagues and finally communicate selected relevant knowledge to the public. These discoveries often emerge through the diverse backgrounds and experiences of expert domains and are precious in a creative analytics reasoning process. In this context, we introduce a demonstrator “OECD eXplorer”, a customized tool for interactively analyzing, and collaborating gained insights and discoveries based on a novel story mechanism that capture, re-use and share task-related explorative events.

  17. Let's Talk... Analytics

    Science.gov (United States)

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  18. Complete equation of state for shocked liquid nitrogen: Analytical developments

    International Nuclear Information System (INIS)

    Winey, J. M.; Gupta, Y. M.

    2016-01-01

    The thermodynamic response of liquid nitrogen has been studied extensively, in part, due to the long-standing interest in the high pressure and high temperature dissociation of shocked molecular nitrogen. Previous equation of state (EOS) developments regarding shocked liquid nitrogen have focused mainly on the use of intermolecular pair potentials in atomistic calculations. Here, we present EOS developments for liquid nitrogen, incorporating analytical models, for use in continuum calculations of the shock compression response. The analytical models, together with available Hugoniot data, were used to extrapolate a low pressure reference EOS for molecular nitrogen [Span, et al., J. Phys. Chem. Ref. Data 29, 1361 (2000)] to high pressures and high temperatures. Using the EOS presented here, the calculated pressures and temperatures for single shock, double shock, and multiple shock compression of liquid nitrogen provide a good match to the measured results over a broad range of P-T space. Our calculations provide the first comparison of EOS developments with recently-measured P-T states under multiple shock compression. The present EOS developments are general and are expected to be useful for other liquids that have low pressure reference EOS information available.

  19. Analytics for Education

    Science.gov (United States)

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  20. Measuring multiple residual-stress components using the contour method and multiple cuts

    Energy Technology Data Exchange (ETDEWEB)

    Prime, Michael B [Los Alamos National Laboratory; Swenson, Hunter [Los Alamos National Laboratory; Pagliaro, Pierluigi [U. PALERMO; Zuccarello, Bernardo [U. PALERMO

    2009-01-01

    The conventional contour method determines one component of stress over the cross section of a part. The part is cut into two, the contour of the exposed surface is measured, and Bueckner's superposition principle is analytically applied to calculate stresses. In this paper, the contour method is extended to the measurement of multiple stress components by making multiple cuts with subsequent applications of superposition. The theory and limitations are described. The theory is experimentally tested on a 316L stainless steel disk with residual stresses induced by plastically indenting the central portion of the disk. The stress results are validated against independent measurements using neutron diffraction. The theory has implications beyond just multiple cuts. The contour method measurements and calculations for the first cut reveal how the residual stresses have changed throughout the part. Subsequent measurements of partially relaxed stresses by other techniques, such as laboratory x-rays, hole drilling, or neutron or synchrotron diffraction, can be superimposed back to the original state of the body.

  1. The Journal of Learning Analytics: Supporting and Promoting Learning Analytics Research

    OpenAIRE

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the Journal of Learning Analytics is identified. Analytics is the most significant new initiative of SoLAR.

  2. The "Journal of Learning Analytics": Supporting and Promoting Learning Analytics Research

    Science.gov (United States)

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the "Journal of Learning Analytics" is identified. Analytics is the most significant new initiative of SoLAR.

  3. Using the Analytic Hierarchy Process for Decision-Making in Ecosystem Management

    Science.gov (United States)

    Daniel L. Schmoldt; David L. Peterson

    1997-01-01

    Land management activities on public lands combine multiple objectives in order to create a plan of action over a finite time horizon. Because management activities are constrained by time and money, it is critical to make the best use of available agency resources. The Analytic Hierarchy Process (AHP) offers a structure for multi-objective decisionmaking so that...

  4. Approximate analytical methods for solving ordinary differential equations

    CERN Document Server

    Radhika, TSL; Rani, T Raja

    2015-01-01

    Approximate Analytical Methods for Solving Ordinary Differential Equations (ODEs) is the first book to present all of the available approximate methods for solving ODEs, eliminating the need to wade through multiple books and articles. It covers both well-established techniques and recently developed procedures, including the classical series solution method, diverse perturbation methods, pioneering asymptotic methods, and the latest homotopy methods. The book is suitable not only for mathematicians and engineers but also for biologists, physicists, and economists. It gives a complete descripti

  5. Big Data Analytics Solutions: The Implementation Challenges in the Financial Services Industry

    Science.gov (United States)

    Ojo, Michael O.

    2016-01-01

    The challenges of Big Data (BD) and Big Data Analytics (BDA) have attracted disproportionately less attention than the overwhelmingly espoused benefits and game-changing promises. While many studies have examined BD challenges across multiple industry verticals, very few have focused on the challenges of implementing BDA solutions. Fewer of these…

  6. Multiple analytical approaches reveal distinct gene-environment interactions in smokers and non smokers in lung cancer.

    Directory of Open Access Journals (Sweden)

    Rakhshan Ihsan

    Full Text Available Complex diseases such as cancer result from interactions of multiple genetic and environmental factors. Studying these factors in isolation cannot explain the underlying pathogenetic mechanism of the disease. A multi-analytical approach, including logistic regression (LR), classification and regression tree (CART) and multifactor dimensionality reduction (MDR), was applied in 188 lung cancer cases and 290 controls to explore high order interactions among xenobiotic metabolizing genes and environmental risk factors. Smoking was identified as the predominant risk factor by all three analytical approaches. Individually, the CYP1A1*2A polymorphism was significantly associated with increased lung cancer risk (OR = 1.69; 95%CI = 1.11-2.59, p = 0.01), whereas EPHX1 Tyr113His and SULT1A1 Arg213His conferred reduced risk (OR = 0.40; 95%CI = 0.25-0.65, p < 0.001 and OR = 0.51; 95%CI = 0.33-0.78, p = 0.002, respectively). In smokers, EPHX1 Tyr113His and SULT1A1 Arg213His polymorphisms reduced the risk of lung cancer, whereas CYP1A1*2A, CYP1A1*2C and GSTP1 Ile105Val imparted increased risk in non-smokers only. While exploring non-linear interactions through CART analysis, smokers carrying the combination of EPHX1 113TC (Tyr/His), SULT1A1 213GG (Arg/Arg) or AA (His/His), and GSTM1 null genotypes showed the highest risk for lung cancer (OR = 3.73; 95%CI = 1.33-10.55, p = 0.006), whereas the combined effect of CYP1A1*2A 6235CC or TC, SULT1A1 213GG (Arg/Arg) and betel quid chewing showed maximum risk in non-smokers (OR = 2.93; 95%CI = 1.15-7.51, p = 0.01). MDR analysis identified two distinct predictor models for the risk of lung cancer in smokers (tobacco chewing, EPHX1 Tyr113His, and SULT1A1 Arg213His) and non-smokers (CYP1A1*2A, GSTP1 Ile105Val and SULT1A1 Arg213His) with testing balance accuracies (TBA) of 0.6436 and 0.6677, respectively. Interaction entropy interpretations of MDR results showed non-additive interactions of tobacco chewing with
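
    As a minimal illustration of the first of the three approaches above (logistic regression with a gene-environment interaction term), the sketch below fits such a model on invented genotype and smoking data; the variable names, effect sizes, and sample size are placeholders, and the CART and MDR steps are not shown.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(7)
        n = 478   # illustrative sample size only (cases + controls)

        smoking = rng.integers(0, 2, size=n)    # 0 = non-smoker, 1 = smoker
        genotype = rng.integers(0, 2, size=n)   # 0 = wild type, 1 = variant carrier
        # Invented risk model containing a gene-environment interaction.
        logit = -1.0 + 0.8 * smoking + 0.2 * genotype + 0.6 * smoking * genotype
        case = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

        X = sm.add_constant(np.column_stack([smoking, genotype, smoking * genotype]))
        fit = sm.Logit(case, X).fit(disp=0)

        # Odds ratios for: intercept, smoking, genotype, smoking x genotype interaction.
        print("odds ratios:", np.exp(fit.params))
        print("p-values:   ", fit.pvalues)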

  7. Robotic platform for parallelized cultivation and monitoring of microbial growth parameters in microwell plates.

    Science.gov (United States)

    Knepper, Andreas; Heiser, Michael; Glauche, Florian; Neubauer, Peter

    2014-12-01

    The enormous number of possible bioprocess variations challenges process development to fix a commercial process within cost and time constraints. Although some cultivation systems and some devices for unit operations combine the latest technology on miniaturization, parallelization, and sensing, the degree of automation in upstream and downstream bioprocess development is still limited to single steps. We aim to face this challenge with an interdisciplinary approach that significantly shortens development times and costs. As a first step, we scaled down analytical assays to the microliter scale and created automated procedures for starting the cultivation and monitoring the optical density (OD), pH, concentrations of glucose and acetate in the culture medium, and product formation in fed-batch cultures in the 96-well format. Then, the separate measurements of pH, OD, and concentrations of acetate and glucose were combined into one method. This method enables automated process monitoring at dedicated intervals (e.g., also during the night). By this approach, we managed to increase the information content of cultivations in 96-microwell plates, thus turning them into a suitable tool for high-throughput bioprocess development. Here, we present the flowcharts as well as cultivation data of our automation approach. © 2014 Society for Laboratory Automation and Screening.

  8. Jet multiplicity distributions: medium dependence in MLLA

    Energy Technology Data Exchange (ETDEWEB)

    Armesto, Nestor; Pajares, Carlos; Quiroga-Arias, Paloma [Universidade de Santiago de Compostela, Departamento de Fisica de Particulas and IGFAE, Santiago de Compostela (Spain)

    2009-07-15

    We study the medium dependence of the multiplicity distributions in the modified leading logarithmic approximation. We focus on the enhancement in the number of branchings as the partons travel through a dense medium created in a heavy-ion collision. We study the effect of a higher number of splittings on some jet observables by introducing the medium as a constant (f_med) in the splitting functions. Taking as our ansatz for the gluon and quark jet mean multiplicities ⟨n_G⟩ = e^(γy) and ⟨n_Q⟩ = r⁻¹e^(γy), we study in an analytic approach the dependence on the medium (f_med) of the anomalous dimension (γ), the multiplicity ratio (r), and thus the mean multiplicities. We also obtain the higher-order moments of the multiplicity distribution, which allows us to study its dispersion. (orig.)

  9. Comparison of PSF maxima and minima of multiple annuli coded aperture (MACA) and complementary multiple annuli coded aperture (CMACA) systems

    Science.gov (United States)

    Ratnam, Challa; Lakshmana Rao, Vadlamudi; Lachaa Goud, Sivagouni

    2006-10-01

    In the present paper, and a series of papers to follow, the Fourier analytical properties of multiple annuli coded aperture (MACA) and complementary multiple annuli coded aperture (CMACA) systems are investigated. First, the transmission function for MACA and CMACA is derived using Fourier methods and, based on the Fresnel-Kirchhoff diffraction theory, the formulae for the point spread function are formulated. The PSF maxima and minima are calculated for both the MACA and CMACA systems. The dependence of these properties on the number of zones is studied and reported in this paper.

  10. Comparison of PSF maxima and minima of multiple annuli coded aperture (MACA) and complementary multiple annuli coded aperture (CMACA) systems

    International Nuclear Information System (INIS)

    Ratnam, Challa; Rao, Vadlamudi Lakshmana; Goud, Sivagouni Lachaa

    2006-01-01

    In the present paper, and a series of papers to follow, the Fourier analytical properties of multiple annuli coded aperture (MACA) and complementary multiple annuli coded aperture (CMACA) systems are investigated. First, the transmission function for MACA and CMACA is derived using Fourier methods and, based on the Fresnel-Kirchhoff diffraction theory, the formulae for the point spread function are formulated. The PSF maxima and minima are calculated for both the MACA and CMACA systems. The dependence of these properties on the number of zones is studied and reported in this paper.

  11. Development of a Framework for Sustainable Outsourcing: Analytic Balanced Scorecard Method (A-BSC

    Directory of Open Access Journals (Sweden)

    Fabio De Felice

    2015-06-01

    Full Text Available Nowadays, many enterprises choose to outsource their non-core business to other enterprises to reduce costs and increase efficiency. Many enterprises choose to outsource their supply chain management (SCM) and leave it to a third-party organization in order to improve their services. The paper proposes an integrated, multicriteria tool useful for monitoring and improving performance in an outsourced supply chain. The Analytic Balanced Scorecard method (A-BSC) is proposed as an effective method for analyzing strategic performance within an outsourced supply chain. The aim of the paper is to present the integration of two methodologies: the Balanced Scorecard, a multiple-perspective framework for performance assessment, and the Analytic Hierarchy Process, a decision-making tool used to prioritize multiple performance perspectives and to generate a unified metric. The framework is intended to provide a performance analysis that achieves better sustainability performance of the supply chain. A real case study concerning a typical value chain is presented.
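
    To make the AHP step concrete, the sketch below derives priority weights for four hypothetical performance perspectives from an invented pairwise comparison matrix using the principal-eigenvector method, together with Saaty's consistency ratio; it illustrates the prioritization idea, not the paper's actual judgments.

    import numpy as np

    # Invented 4x4 pairwise comparison matrix (e.g. financial, customer, internal process, learning).
    A = np.array([[1,   3,   5,   7],
                  [1/3, 1,   3,   5],
                  [1/5, 1/3, 1,   3],
                  [1/7, 1/5, 1/3, 1]], dtype=float)

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                            # priority vector (perspective weights)

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)    # consistency index
    cr = ci / 0.90                          # consistency ratio; 0.90 is the commonly tabulated random index for n = 4
    print("weights:", w.round(3), "CR:", round(cr, 3))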

  12. Continuous volatile fatty acid production from lignocellulosic biomass by a novel rumen-mimetic bioprocess.

    Science.gov (United States)

    Agematu, Hitosi; Takahashi, Takehiko; Hamano, Yoshio

    2017-11-01

    Lignocellulosic biomass is an attractive source of biofuels and biochemicals, being abundant in various plant sources. However, processing this type of biomass requires hydrolysis of cellulose. The proposed rumen-mimetic bioprocess consists of dry-pulverization of lignocellulosic biomass and pH-controlled continuous cultivation of ruminal bacteria using ammonium as a nitrogen source. In this study, ruminal bacteria were continuously cultivated for over 60 days and used to digest microcrystalline cellulose, rice straw, and Japanese cedar to produce volatile fatty acids (VFAs). The ruminal bacteria grew well in the chemically defined medium. The amounts of VFAs produced from 20 g of cellulose, rice straw, and Japanese cedar were 183 ± 29.7, 69.6 ± 12.2, and 21.8 ± 12.9 mmol, respectively. Each digestion completed within 24 h. The carbon yield was 60.6% when 180 mmol of VFAs was produced from 20 g of cellulose. During the cultivation, the bacteria were observed to form flocs that enfolded the feed particles. These flocs likely contain all of the bacterial species necessary to convert lignocellulosic biomass to VFAs and microbial protein symbiotically. Denaturing gradient gel electrophoresis (DGGE) analysis of PCR-amplified 16S rDNA fragments revealed that the bacterial community was relatively stable after 1 week in cultivation, though it was different from the original community structure. Furthermore, sequence analysis of the DGGE bands indicates that the microbial community includes a cellulolytic bacterium, a bacterium acting synergistically with cellulolytic bacteria, and a propionate-producing bacterium, as well as other anaerobic bacteria. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  13. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    Science.gov (United States)

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for a quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also relates it to product quality by design and process analytical technology (PAT).

  14. A multiscale analytical approach for bone remodeling simulations : linking scales from collagen to trabeculae

    NARCIS (Netherlands)

    Colloca, M.; Blanchard, R.; Hellmich, C.; Ito, K.; Rietbergen, van B.

    2014-01-01

    Bone is a dynamic and hierarchical porous material whose spatial and temporal mechanical properties can vary considerably due to differences in its microstructure and due to remodeling. Hence, a multiscale analytical approach, which combines bone structural information at multiple scales to the

  15. Group Decision Making with the Analytic Hierarchy Process in Benefit-Risk Assessment: A Tutorial

    NARCIS (Netherlands)

    Hummel, J. Marjan; Bridges, John; IJzerman, Maarten Joost

    2014-01-01

    The analytic hierarchy process (AHP) has been increasingly applied as a technique for multi-criteria decision analysis in healthcare. The AHP can aid decision makers in selecting the most valuable technology for patients, while taking into account multiple, and even conflicting, decision criteria.

  16. Multi-analytical Approaches Informing the Risk of Sepsis

    Science.gov (United States)

    Gwadry-Sridhar, Femida; Lewden, Benoit; Mequanint, Selam; Bauer, Michael

    Sepsis is a significant cause of mortality and morbidity and is often associated with increased hospital resource utilization, prolonged intensive care unit (ICU) and hospital stay. The economic burden associated with sepsis is huge. With advances in medicine, there are now aggressive goal oriented treatments that can be used to help these patients. If we were able to predict which patients may be at risk for sepsis we could start treatment early and potentially reduce the risk of mortality and morbidity. Analytic methods currently used in clinical research to determine the risk of a patient developing sepsis may be further enhanced by using multi-modal analytic methods that together could be used to provide greater precision. Researchers commonly use univariate and multivariate regressions to develop predictive models. We hypothesized that such models could be enhanced by using multiple analytic methods that together could be used to provide greater insight. In this paper, we analyze data about patients with and without sepsis using a decision tree approach and a cluster analysis approach. A comparison with a regression approach shows strong similarity among variables identified, though not an exact match. We compare the variables identified by the different approaches and draw conclusions about the respective predictive capabilities, while considering their clinical significance.

  17. A Table Lookup Method for Exact Analytical Solutions of Nonlinear Fractional Partial Differential Equations

    Directory of Open Access Journals (Sweden)

    Ji Juan-Juan

    2017-01-01

    Full Text Available A table lookup method for solving nonlinear fractional partial differential equations (fPDEs) is proposed in this paper. Looking up the corresponding tables, we can quickly obtain the exact analytical solutions of fPDEs by using this method. To illustrate the validity of the method, we apply it to construct the exact analytical solutions of four nonlinear fPDEs, namely, the time fractional simplified MCH equation, the space-time fractional combined KdV-mKdV equation, the (2+1)-dimensional time fractional Zoomeron equation, and the space-time fractional ZKBBM equation. As a result, many new types of exact analytical solutions are obtained, including triangular periodic solution, hyperbolic function solution, singular solution, multiple solitary wave solution, and Jacobi elliptic function solution.

  18. Strategic analytics: towards fully embedding evidence in healthcare decision-making.

    Science.gov (United States)

    Garay, Jason; Cartagena, Rosario; Esensoy, Ali Vahit; Handa, Kiren; Kane, Eli; Kaw, Neal; Sadat, Somayeh

    2015-01-01

    Cancer Care Ontario (CCO) has implemented multiple information technology solutions and collected health-system data to support its programs. There is now an opportunity to leverage these data and perform advanced end-to-end analytics that inform decisions around improving health-system performance. In 2014, CCO engaged in an extensive assessment of its current data capacity and capability, with the intent to drive increased use of data for evidence-based decision-making. The breadth and volume of data at CCO uniquely places the organization to contribute to not only system-wide operational reporting, but more advanced modelling of current and future state system management and planning. In 2012, CCO established a strategic analytics practice to assist the agency's programs contextualize and inform key business decisions and to provide support through innovative predictive analytics solutions. This paper describes the organizational structure, services and supporting operations that have enabled progress to date, and discusses the next steps towards the vision of embedding evidence fully into healthcare decision-making. Copyright © 2014 Longwoods Publishing.

  19. Comparison of three analytical methods for the determination of trace elements in whole blood

    International Nuclear Information System (INIS)

    Ward, N.I.; Stephens, R.; Ryan, D.E.

    1979-01-01

    Three different analytical techniques were compared in a study of the role of trace elements in multiple sclerosis. Data for eight elements (Cd, Co, Cr, Cu, Mg, Mn, Pb, Zn) from neutron activation, flame atomic absorption and electrothermal atomic absorption methods were compared and evaluated statistically. No difference (probability less than 0.001) was observed in the elemental values obtained. Comparison of data between suitably different analytical methods gives increased confidence in the results obtained and is of particular value when standard reference materials are not available. (Auth.)

  20. SPARTex: A Vertex-Centric Framework for RDF Data Analytics

    KAUST Repository

    Abdelaziz, Ibrahim

    2015-08-31

    A growing number of applications require combining SPARQL queries with generic graph search on RDF data. However, the lack of procedural capabilities in SPARQL makes it inappropriate for graph analytics. Moreover, RDF engines focus on SPARQL query evaluation whereas graph management frameworks perform only generic graph computations. In this work, we bridge the gap by introducing SPARTex, an RDF analytics framework based on the vertex-centric computation model. In SPARTex, user-defined vertex centric programs can be invoked from SPARQL as stored procedures. SPARTex allows the execution of a pipeline of graph algorithms without the need for multiple reads/writes of input data and intermediate results. We use a cost-based optimizer for minimizing the communication cost. SPARTex evaluates queries that combine SPARQL and generic graph computations orders of magnitude faster than existing RDF engines. We demonstrate a real system prototype of SPARTex running on a local cluster using real and synthetic datasets. SPARTex has a real-time graphical user interface that allows the participants to write regular SPARQL queries, use our proposed SPARQL extension to declaratively invoke graph algorithms or combine/pipeline both SPARQL querying and generic graph analytics.
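
    The sketch below is not SPARTex's API; it only illustrates, with rdflib and networkx on a toy RDF graph, the pattern the abstract describes: a declarative SPARQL selection feeding a generic graph computation (here PageRank).

    import networkx as nx
    from rdflib import Graph

    data = """
    @prefix ex: <http://example.org/> .
    ex:a ex:links ex:b . ex:b ex:links ex:c . ex:c ex:links ex:a . ex:a ex:links ex:c .
    """
    g = Graph()
    g.parse(data=data, format="turtle")

    # Declarative step: select the edges of interest with SPARQL.
    rows = g.query("SELECT ?s ?o WHERE { ?s <http://example.org/links> ?o }")

    # Procedural step: run a generic graph algorithm on the selected subgraph.
    nxg = nx.DiGraph()
    nxg.add_edges_from((str(s), str(o)) for s, o in rows)
    print(nx.pagerank(nxg))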

  1. An integrated bio-process for production of functional biomolecules utilizing raw and by-products from dairy and sugarcane industries.

    Science.gov (United States)

    Lata, Kusum; Sharma, Manisha; Patel, Satya Narayan; Sangwan, Rajender S; Singh, Sudhir P

    2018-04-21

    The study investigated an integrated bioprocessing of raw and by-products from sugarcane and dairy industries for production of non-digestible prebiotic and functional ingredients. The low-priced feedstocks (whey, molasses, table sugar, jaggery, etc.) were subjected to transglucosylation reactions catalyzed by dextransucrase from Leuconostoc mesenteroides MTCC 10508. HPLC analysis approximated production of about 11-14 g L⁻¹ trisaccharide, i.e., 2-α-D-glucopyranosyl-lactose (4-galactosyl-kojibiose), from the feedstock prepared from table sugar, jaggery, cane molasses and liquid whey, containing about 30 g L⁻¹ sucrose and lactose each. The trisaccharide was hydrolysed into the prebiotic disaccharide, kojibiose, by employing recombinant β-galactosidase from Escherichia coli. The enzyme β-galactosidase achieved about 90% conversion of 2-α-D-glucopyranosyl-lactose into kojibiose. The D-fructose generated by catalytic reactions of dextransucrase was targeted for catalytic transformation into the rare sugar D-allulose (or D-psicose) by treating the samples with Smt3-D-psicose 3-epimerase. The catalytic reactions resulted in the conversion of ~ 25% D-fructose to D-allulose. These bioactive compounds are known to exert a plethora of benefits to human health, and therefore, are preferred ingredients for making functional foods.

  2. Rapid Analysis of Carbohydrates in Bioprocess Samples: An Evaluation of the CarboPac SA10 for HPAE-PAD Analysis by Interlaboratory Comparison

    Energy Technology Data Exchange (ETDEWEB)

    Sevcik, R. S.; Hyman, D. A.; Basumallich, L.; Scarlata, C. J.; Rohrer, J.; Chambliss, C. K.

    2013-01-01

    A technique for carbohydrate analysis of bioprocess samples has been developed, reducing analysis time compared to current practice in the biofuels R&D community. The Thermo Fisher CarboPac SA10 anion-exchange column enables isocratic separation of monosaccharides, sucrose and cellobiose in approximately 7 minutes. Additionally, use of a low-volume (0.2 mL) injection valve in combination with a high-volume detection cell minimizes the extent of sample dilution required to bring sugar concentrations into the linear range of the pulsed amperometric detector (PAD). Three laboratories, representing academia, industry, and government, participated in an interlaboratory study which analyzed twenty-one opportunistic samples representing biomass pretreatment, enzymatic saccharification, and fermentation samples. The technique's robustness, linearity, and interlaboratory reproducibility were evaluated and showed excellent-to-acceptable characteristics. Additionally, quantitation by the CarboPac SA10/PAD was compared with the current-practice method utilizing an HPX-87P/RID. While these two methods showed good agreement, a statistical comparison found a significant quantitation difference between them, highlighting the difference between selective and universal detection modes.

  3. Analytic trigonometry

    CERN Document Server

    Bruce, William J; Maxwell, E A; Sneddon, I N

    1963-01-01

    Analytic Trigonometry details the fundamental concepts and underlying principle of analytic geometry. The title aims to address the shortcomings in the instruction of trigonometry by considering basic theories of learning and pedagogy. The text first covers the essential elements from elementary algebra, plane geometry, and analytic geometry. Next, the selection tackles the trigonometric functions of angles in general, basic identities, and solutions of equations. The text also deals with the trigonometric functions of real numbers. The fifth chapter details the inverse trigonometric functions

  4. Analytical transition-matrix treatment of electric multipole polarizabilities of hydrogen-like atoms

    International Nuclear Information System (INIS)

    Kharchenko, V.F.

    2015-01-01

    The direct transition-matrix approach to the description of the electric polarization of the quantum bound system of particles is used to determine the electric multipole polarizabilities of the hydrogen-like atoms. It is shown that in the case of the bound system formed by the Coulomb interaction the corresponding inhomogeneous integral equation determining an off-shell scattering function, which consistently describes virtual multiple scattering, can be solved exactly analytically for all electric multipole polarizabilities. Our method allows to reproduce the known Dalgarno–Lewis formula for electric multipole polarizabilities of the hydrogen atom in the ground state and can also be applied to determine the polarizability of the atom in excited bound states. - Highlights: • A new description for electric polarization of hydrogen-like atoms. • Expression for multipole polarizabilities in terms of off-shell scattering functions. • Derivation of integral equation determining the off-shell scattering function. • Rigorous analytic solving the integral equations both for ground and excited states. • Study of contributions of virtual multiple scattering to electric polarizabilities

  5. Analytic model for the long-term evolution of circular Earth satellite orbits including lunar node regression

    Science.gov (United States)

    Zhu, Ting-Lei; Zhao, Chang-Yin; Zhang, Ming-Jiang

    2017-04-01

    This paper aims to obtain an analytic approximation to the evolution of circular orbits governed by the Earth's J2 and the luni-solar gravitational perturbations. Assuming that the lunar orbital plane coincides with the ecliptic plane, Allan and Cook (Proc. R. Soc. A, Math. Phys. Eng. Sci. 280(1380):97, 1964) derived an analytic solution to the orbital plane evolution of circular orbits. Using their result as an intermediate solution, we establish an approximate analytic model with the lunar orbital inclination and its node regression taken into account. Finally, an approximate analytic expression is derived, which agrees well with numerical results except for the resonant cases when the period of the reference orbit approximately equals an integer multiple (especially 1 or 2 times) of the lunar node regression period.

  6. Analytic geometry

    CERN Document Server

    Burdette, A C

    1971-01-01

    Analytic Geometry covers several fundamental aspects of analytic geometry needed for advanced subjects, including calculus.This book is composed of 12 chapters that review the principles, concepts, and analytic proofs of geometric theorems, families of lines, the normal equation of the line, and related matters. Other chapters highlight the application of graphing, foci, directrices, eccentricity, and conic-related topics. The remaining chapters deal with the concept polar and rectangular coordinates, surfaces and curves, and planes.This book will prove useful to undergraduate trigonometric st

  7. Thermotolerant Kluyveromyces marxianus and Saccharomyces cerevisiae strains representing potentials for bioethanol production from Jerusalem artichoke by consolidated bioprocessing.

    Science.gov (United States)

    Hu, Nan; Yuan, Bo; Sun, Juan; Wang, Shi-An; Li, Fu-Li

    2012-09-01

    Thermotolerant inulin-utilizing yeast strains are desirable for ethanol production from Jerusalem artichoke tubers by consolidated bioprocessing (CBP). To obtain such strains, 21 naturally occurring yeast strains isolated by using an enrichment method and 65 previously isolated Saccharomyces cerevisiae strains were investigated in inulin utilization, extracellular inulinase activity, and ethanol fermentation from inulin and Jerusalem artichoke tuber flour at 40 °C. The strains Kluyveromyces marxianus PT-1 (CGMCC AS2.4515) and S. cerevisiae JZ1C (CGMCC AS2.3878) presented the highest extracellular inulinase activity and ethanol yield in this study. The highest ethanol concentration in Jerusalem artichoke tuber flour fermentation (200 g L(-1)) at 40 °C achieved by K. marxianus PT-1 and S. cerevisiae JZ1C was 73.6 and 65.2 g L(-1), which corresponded to the theoretical ethanol yield of 90.0 and 79.7 %, respectively. In the range of 30 to 40 °C, temperature did not have a significant effect on ethanol production for both strains. This study displayed the distinctive superiority of K. marxianus PT-1 and S. cerevisiae JZ1C in the thermotolerance and utilization of inulin-type oligosaccharides reserved in Jerusalem artichoke tubers. It is proposed that both K. marxianus and S. cerevisiae have considerable potential in ethanol production from Jerusalem artichoke tubers by a high temperature CBP.

  8. Thermotolerant Kluyveromyces marxianus and Saccharomyces cerevisiae strains representing potentials for bioethanol production from Jerusalem artichoke by consolidated bioprocessing

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Nan [Agricultural Univ., Qingdao, SD (China). College of Animal Science and Technology; Chinese Academy of Sciences, Qingdao, SD (China). Key Lab. of Biofuels; Yuan, Bo; Wang, Shi-An; Li, Fu-Li [Chinese Academy of Sciences, Qingdao, SD (China). Key Lab. of Biofuels; Sun, Juan [Agricultural Univ., Qingdao, SD (China). College of Animal Science and Technology

    2012-09-15

    Thermotolerant inulin-utilizing yeast strains are desirable for ethanol production from Jerusalem artichoke tubers by consolidated bioprocessing (CBP). To obtain such strains, 21 naturally occurring yeast strains isolated by using an enrichment method and 65 previously isolated Saccharomyces cerevisiae strains were investigated in inulin utilization, extracellular inulinase activity, and ethanol fermentation from inulin and Jerusalem artichoke tuber flour at 40 °C. The strains Kluyveromyces marxianus PT-1 (CGMCC AS2.4515) and S. cerevisiae JZ1C (CGMCC AS2.3878) presented the highest extracellular inulinase activity and ethanol yield in this study. The highest ethanol concentration in Jerusalem artichoke tuber flour fermentation (200 g L⁻¹) at 40 °C achieved by K. marxianus PT-1 and S. cerevisiae JZ1C was 73.6 and 65.2 g L⁻¹, which corresponded to the theoretical ethanol yield of 90.0 and 79.7 %, respectively. In the range of 30 to 40 °C, temperature did not have a significant effect on ethanol production for both strains. This study displayed the distinctive superiority of K. marxianus PT-1 and S. cerevisiae JZ1C in the thermotolerance and utilization of inulin-type oligosaccharides reserved in Jerusalem artichoke tubers. It is proposed that both K. marxianus and S. cerevisiae have considerable potential in ethanol production from Jerusalem artichoke tubers by a high temperature CBP. (orig.)

  9. Understanding Business Analytics

    Science.gov (United States)

    2015-01-05

    analytics have been used in organizations for a variety of reasons for quite some time; ranging from the simple (generating and understanding business analytics...process. How well these two components are orchestrated will determine the level of success an organization has in

  10. Analytical Modeling of Natural Convection in a Tall Rectangular Enclosure with Multiple Disconnected Partitions

    Directory of Open Access Journals (Sweden)

    Youngmin Bae

    2016-08-01

    Full Text Available In this study, laminar natural circulation and heat transfer in a tall rectangular enclosure with disconnected vertical partitions inside were investigated. Analytical expressions were developed to predict the circulation flow rate and the average Nusselt number in a partially partitioned enclosure with isothermal side walls at different temperatures and insulated top and bottom walls. The proposed formulas are then validated against numerical results for modified Rayleigh numbers of up to 10⁶. The impacts of the governing parameters are also examined along with a discussion of the heat transfer regimes.

  11. Analyticity without Differentiability

    Science.gov (United States)

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…

  12. Analytic free-form lens design for imaging applications with high aspect ratio

    Science.gov (United States)

    Duerr, Fabian; Benítez, Pablo; Miñano, Juan Carlos; Meuret, Youri; Thienpont, Hugo

    2012-10-01

    A new three-dimensional analytic optics design method is presented that enables the coupling of three ray sets with only two free-form lens surfaces. Closely related to the Simultaneous Multiple Surface method in three dimensions (SMS3D), it is derived directly from Fermat's principle, leading to multiple sets of functional differential equations. The general solution of these equations makes it possible to calculate more than 80 coefficients for each implicit surface function. Ray tracing simulations of these free-form lenses demonstrate superior imaging performance for applications with high aspect ratio, compared to conventional rotational symmetric systems.

  13. Analytical theory of Doppler reflectometry in slab plasma model

    Energy Technology Data Exchange (ETDEWEB)

    Gusakov, E.Z.; Surkov, A.V. [Ioffe Institute, Politekhnicheskaya 26, St. Petersburg (Russian Federation)

    2004-07-01

    Doppler reflectometry is considered in a slab plasma model within the framework of analytical theory. The locality of the diagnostic is analyzed for both regimes: linear and nonlinear in turbulence amplitude. Toroidal antenna focusing of the probing beam to the cut-off is proposed and discussed as a method to increase the spatial resolution of the diagnostic. It is shown that even in the nonlinear regime of multiple scattering, the diagnostic can be used to estimate (with certain accuracy) the plasma poloidal rotation profile. (authors)

  14. Power-law Exponent in Multiplicative Langevin Equation with Temporally Correlated Noise

    Science.gov (United States)

    Morita, Satoru

    2018-05-01

    Power-law distributions are ubiquitous in nature. Random multiplicative processes are a basic model for the generation of power-law distributions. For discrete-time systems, the power-law exponent is known to decrease as the autocorrelation time of the multiplier increases. However, for continuous-time systems, it is not yet clear how the temporal correlation affects the power-law behavior. Herein, we analytically investigated a multiplicative Langevin equation with colored noise. We show that the power-law exponent depends on the details of the multiplicative noise, in contrast to the case of discrete-time systems.
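
    As a minimal numerical companion, the sketch below simulates a multiplicative (geometric) Langevin equation with white noise and a reflecting floor and recovers the corresponding power-law exponent; the colored-noise dependence that is the paper's main point is not reproduced, and all parameter values are arbitrary.

    # Euler-Maruyama on y = ln x for dx = -mu*x dt + sigma*x dW (Ito), reflected at x = 1.
    # For white noise the stationary density is p(x) ~ x^-(1 + 2*mu_eff/sigma^2).
    import numpy as np

    rng = np.random.default_rng(1)
    mu, sigma, dt, steps = 0.5, 1.0, 1e-3, 1_000_000
    mu_eff = mu + 0.5 * sigma**2            # drift of ln x implied by the Ito SDE above
    y, samples = 0.0, []
    for i in range(steps):
        y += -mu_eff * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        y = abs(y)                          # reflecting floor at ln x = 0, i.e. x = 1
        if i % 50 == 0 and i > 50_000:      # thinned samples after a burn-in
            samples.append(np.exp(y))

    x = np.sort(samples)
    ccdf = 1.0 - np.arange(x.size) / x.size
    mask = x > 2.0
    slope = np.polyfit(np.log(x[mask]), np.log(ccdf[mask]), 1)[0]
    print("estimated density exponent:", 1 - slope,
          "(white-noise prediction:", 1 + 2 * mu_eff / sigma**2, ")")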

  15. Analytical solutions of nonlocal Poisson dielectric models with multiple point charges inside a dielectric sphere

    Science.gov (United States)

    Xie, Dexuan; Volkmer, Hans W.; Ying, Jinyong

    2016-04-01

    The nonlocal dielectric approach has led to new models and solvers for predicting electrostatics of proteins (or other biomolecules), but how to validate and compare them remains a challenge. To promote such a study, in this paper, two typical nonlocal dielectric models are revisited. Their analytical solutions are then found in the expressions of simple series for a dielectric sphere containing any number of point charges. As a special case, the analytical solution of the corresponding Poisson dielectric model is also derived in simple series, which significantly improves the well known Kirkwood's double series expansion. Furthermore, a convolution of one nonlocal dielectric solution with a commonly used nonlocal kernel function is obtained, along with the reaction parts of these local and nonlocal solutions. To turn these new series solutions into a valuable research tool, they are programed as a free fortran software package, which can input point charge data directly from a protein data bank file. Consequently, different validation tests can be quickly done on different proteins. Finally, a test example for a protein with 488 atomic charges is reported to demonstrate the differences between the local and nonlocal models as well as the importance of using the reaction parts to develop local and nonlocal dielectric solvers.

  16. Holistic rubric vs. analytic rubric for measuring clinical performance levels in medical students.

    Science.gov (United States)

    Yune, So Jung; Lee, Sang Yeoup; Im, Sun Ju; Kam, Bee Sung; Baek, Sun Yong

    2018-06-05

    Task-specific checklists, holistic rubrics, and analytic rubrics are often used for performance assessments. We examined what factors evaluators consider important in holistic scoring of clinical performance assessment, and compared the usefulness of applying holistic and analytic rubrics respectively, and analytic rubrics in addition to task-specific checklists based on traditional standards. We compared the usefulness of a holistic rubric versus an analytic rubric in effectively measuring the clinical skill performances of 126 third-year medical students who participated in a clinical performance assessment conducted by Pusan National University School of Medicine. We conducted a questionnaire survey of 37 evaluators who used all three evaluation methods-holistic rubric, analytic rubric, and task-specific checklist-for each student. The relationship between the scores on the three evaluation methods was analyzed using Pearson's correlation. Inter-rater agreement was analyzed by Kappa index. The effect of holistic and analytic rubric scores on the task-specific checklist score was analyzed using multiple regression analysis. Evaluators perceived accuracy and proficiency to be major factors in objective structured clinical examinations evaluation, and history taking and physical examination to be major factors in clinical performance examinations evaluation. Holistic rubric scores were highly related to the scores of the task-specific checklist and analytic rubric. Relatively low agreement was found in clinical performance examinations compared to objective structured clinical examinations. Meanwhile, the holistic and analytic rubric scores explained 59.1% of the task-specific checklist score in objective structured clinical examinations and 51.6% in clinical performance examinations. The results show the usefulness of holistic and analytic rubrics in clinical performance assessment, which can be used in conjunction with task-specific checklists for more efficient
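
    A small sketch of the reported statistical workflow (Pearson correlation, Cohen's kappa for agreement, and the variance in checklist scores explained by the two rubric scores) on invented score data; it mirrors the analysis steps, not the study's actual scores.

    import numpy as np
    from scipy.stats import pearsonr
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import cohen_kappa_score

    rng = np.random.default_rng(0)
    n = 126
    holistic = rng.normal(70, 10, n)
    analytic = holistic + rng.normal(0, 5, n)
    checklist = 0.5 * holistic + 0.3 * analytic + rng.normal(0, 5, n)

    r, p = pearsonr(holistic, checklist)
    kappa = cohen_kappa_score((holistic > 70).astype(int), (analytic > 70).astype(int))  # toy categorical agreement
    X = np.column_stack([holistic, analytic])
    r2 = LinearRegression().fit(X, checklist).score(X, checklist)  # analogous to the reported 59.1% / 51.6%
    print(f"r={r:.2f} (p={p:.3g}), kappa={kappa:.2f}, R^2={r2:.2f}")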

  17. Analysis of (n, 2n) multiplication in lead

    International Nuclear Information System (INIS)

    Segev, M.

    1984-01-01

    Lead is being considered as a possible amplifier of neutrons for fusion blankets. A simple one-group model of neutron multiplication in Pb is presented. Given the 14 MeV neutron cross section on Pb, the model predicts the multiplication. Given measured multiplications, the model enables the determination of the (n, 2n) and transport cross sections. Required for the model are P, the collision probability for source neutrons in the Pb body, and W, an average collision probability for non-virgin, non-degraded neutrons. In simple geometries, such as a source in the center of a spherical shell, P and an approximate W can be expressed analytically in terms of shell dimensions and the Pb transport cross section. The model was applied to Takahashi's measured multiplications in Pb shells in order to understand the apparent very high multiplicative power of Pb. The results of the analysis are not consistent with basic energy-balance and cross section magnitude constraints in neutron interaction theory. (author)
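
    The abstract does not give the model's formula; the toy one-group cascade below is only a plausible sketch of the same flavour (collision probabilities P and W, an (n,2n) branching probability per collision, and a geometric sum over generations), not necessarily Segev's exact expression, and all numbers are illustrative.

    # Toy one-group estimate: each source neutron collides with probability P; a collision
    # is an (n,2n) event with probability p = sigma_n2n / sigma_tot, adding one neutron;
    # subsequent neutrons collide with probability W. Absorption and degraded neutrons
    # below the (n,2n) threshold are ignored.
    def multiplication(P, W, sigma_n2n, sigma_tot):
        p = sigma_n2n / sigma_tot
        return 1.0 + P * p / (1.0 - W * p)   # one source neutron plus the geometric (n,2n) cascade

    # Hypothetical numbers, for illustration only:
    print(multiplication(P=0.8, W=0.6, sigma_n2n=2.1, sigma_tot=5.4))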

  18. Analytical solution for multi-species contaminant transport in finite media with time-varying boundary conditions

    Science.gov (United States)

    Most analytical solutions available for the equations governing the advective-dispersive transport of multiple solutes undergoing sequential first-order decay reactions have been developed for infinite or semi-infinite spatial domains and steady-state boundary conditions. In this work we present an ...
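
    For orientation, the governing system commonly solved in this literature (written here without retardation or sorption, and not necessarily the paper's exact formulation) is the coupled advection-dispersion chain with sequential first-order decay:

    % Standard advection-dispersion system with a sequential first-order decay chain,
    % i = 1..N, with c_0 \equiv 0; a common form, not necessarily the paper's exact model.
    \frac{\partial c_i}{\partial t}
      = D \frac{\partial^2 c_i}{\partial x^2}
      - v \frac{\partial c_i}{\partial x}
      - k_i c_i + k_{i-1} c_{i-1},
    \qquad 0 < x < L,\; t > 0 .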

  19. Accurate Analytical Multiple-Access Performance of Time-Hopping Biorthogonal PPM IR-UWB Systems

    Directory of Open Access Journals (Sweden)

    SVEDEK, T.

    2011-05-01

    Full Text Available In this paper, the characteristic function (CF) method is used to derive the symbol error rate (SER) expression for time-hopping impulse radio ultra-wideband (TH-IR-UWB) systems with a biorthogonal pulse position modulation (BPPM) scheme in the presence of multi-user interference (MUI). The derived expression is validated with Monte-Carlo simulation and compared with orthogonal PPM. Moreover, the analytical results are compared with the Gaussian approximation (GA) of MUI, which is shown to be inaccurate for medium and large signal-to-noise ratios (SNRs). It is also shown that the BPPM scheme outperforms the PPM scheme for all SNRs. Finally, the influence of different system parameters on the BPPM performance is analyzed.

  20. Canonical correlation analysis of multiple sensory directed metabolomics data blocks reveals corresponding parts between data blocks.

    NARCIS (Netherlands)

    Doeswijk, T. G.; Hageman, J.A.; Westerhuis, J.A.; Tikunov, Y.; Bovy, A.; van Eeuwijk, F.A.

    2011-01-01

    Multiple analytical platforms are frequently used in metabolomics studies. The resulting multiple data blocks contain, in general, similar parts of information which can be disclosed by chemometric methods. The metabolites of interest, however, are usually just a minor part of the complete data
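
    A hedged sketch of the chemometric idea using scikit-learn's CCA on two synthetic data blocks that share one latent factor; the platforms, dimensions and data are invented and unrelated to the study's measurements.

    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(0)
    n = 100
    latent = rng.normal(size=(n, 1))                                          # shared information
    X = latent @ rng.normal(size=(1, 20)) + 0.5 * rng.normal(size=(n, 20))    # platform 1 (e.g. LC-MS)
    Y = latent @ rng.normal(size=(1, 15)) + 0.5 * rng.normal(size=(n, 15))    # platform 2 (e.g. GC-MS)

    cca = CCA(n_components=2)
    Xc, Yc = cca.fit_transform(X, Y)
    corr = [np.corrcoef(Xc[:, k], Yc[:, k])[0, 1] for k in range(2)]
    print("canonical correlations:", np.round(corr, 2))  # first one high, second near the noise level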

  1. Benchmark Comparison of Cloud Analytics Methods Applied to Earth Observations

    Science.gov (United States)

    Lynnes, Chris; Little, Mike; Huang, Thomas; Jacob, Joseph; Yang, Phil; Kuo, Kwo-Sen

    2016-01-01

    Cloud computing has the potential to bring high performance computing capabilities to the average science researcher. However, in order to take full advantage of cloud capabilities, the science data used in the analysis must often be reorganized. This typically involves sharding the data across multiple nodes to enable relatively fine-grained parallelism. This can be either via cloud-based file systems or cloud-enabled databases such as Cassandra, Rasdaman or SciDB. Since storing an extra copy of data leads to increased cost and data management complexity, NASA is interested in determining the benefits and costs of various cloud analytics methods for real Earth Observation cases. Accordingly, NASA's Earth Science Technology Office and Earth Science Data and Information Systems project have teamed with cloud analytics practitioners to run a benchmark comparison on cloud analytics methods using the same input data and analysis algorithms. We have particularly looked at analysis algorithms that work over long time series, because these are particularly intractable for many Earth Observation datasets which typically store data with one or just a few time steps per file. This post will present side-by-side cost and performance results for several common Earth observation analysis operations.

  2. Benchmark Comparison of Cloud Analytics Methods Applied to Earth Observations

    Science.gov (United States)

    Lynnes, C.; Little, M. M.; Huang, T.; Jacob, J. C.; Yang, C. P.; Kuo, K. S.

    2016-12-01

    Cloud computing has the potential to bring high performance computing capabilities to the average science researcher. However, in order to take full advantage of cloud capabilities, the science data used in the analysis must often be reorganized. This typically involves sharding the data across multiple nodes to enable relatively fine-grained parallelism. This can be either via cloud-based filesystems or cloud-enabled databases such as Cassandra, Rasdaman or SciDB. Since storing an extra copy of data leads to increased cost and data management complexity, NASA is interested in determining the benefits and costs of various cloud analytics methods for real Earth Observation cases. Accordingly, NASA's Earth Science Technology Office and Earth Science Data and Information Systems project have teamed with cloud analytics practitioners to run a benchmark comparison on cloud analytics methods using the same input data and analysis algorithms. We have particularly looked at analysis algorithms that work over long time series, because these are particularly intractable for many Earth Observation datasets which typically store data with one or just a few time steps per file. This post will present side-by-side cost and performance results for several common Earth observation analysis operations.

  3. Three lessons for genetic toxicology from baseball analytics.

    Science.gov (United States)

    Dertinger, Stephen D

    2017-07-01

    In many respects the evolution of baseball statistics mirrors advances made in the field of genetic toxicology. From its inception, baseball and statistics have been inextricably linked. Generations of players and fans have used a number of relatively simple measurements to describe team and individual player's current performance, as well as for historical record-keeping purposes. Over the years, baseball analytics has progressed in several important ways. Early advances were based on deriving more meaningful metrics from simpler forerunners. Now, technological innovations are delivering much deeper insights. Videography, radar, and other advances that include automatic player recognition capabilities provide the means to measure more complex and useful factors. Fielders' reaction times, efficiency of the route taken to reach a batted ball, and pitch-framing effectiveness come to mind. With the current availability of complex measurements from multiple data streams, multifactorial analyses occurring via machine learning algorithms have become necessary to make sense of the terabytes of data that are now being captured in every Major League Baseball game. Collectively, these advances have transformed baseball statistics from being largely descriptive in nature to serving data-driven, predictive roles. Whereas genetic toxicology has charted a somewhat parallel course, a case can be made that greater utilization of baseball's mindset and strategies would serve our scientific field well. This paper describes three useful lessons for genetic toxicology, courtesy of the field of baseball analytics: seek objective knowledge; incorporate multiple data streams; and embrace machine learning. Environ. Mol. Mutagen. 58:390-397, 2017. © 2017 Wiley Periodicals, Inc.

  4. An analytical turn-on power loss model for 650-V GaN eHEMTs

    DEFF Research Database (Denmark)

    Shen, Yanfeng; Wang, Huai; Shen, Zhan

    2018-01-01

    This paper proposes an improved analytical turn-on power loss model for 650-V GaN eHEMTs. The static characteristics, i.e., the parasitic capacitances and transconductance, are first modeled. The turn-on process is then divided into multiple stages and analyzed in detail; as a result, the time-d...

  5. Multiple equilibria of divertor plasmas

    International Nuclear Information System (INIS)

    Vu, H.X.; Prinja, A.K.

    1993-01-01

    A one-dimensional, two-fluid transport model with a temperature-dependent neutral recycling coefficient is shown to give rise to multiple equilibria of divertor plasmas (bifurcation). Numerical techniques for obtaining these multiple equilibria and for examining their stability are presented. Although these numerical techniques have been well known to the scientific community, this is the first time they have been applied to divertor plasma modeling to show the existence of multiple equilibria as well as the stability of these solutions. Numerical and approximate analytical solutions of the present one-dimensional transport model both indicate that there exist three steady-state solutions corresponding to (1) a high-temperature, low-density equilibrium, (2) a low-temperature, high-density equilibrium, and (3) an intermediate-temperature equilibrium. While both the low-temperature and the high-temperature equilibria are stable with respect to small perturbations in the plasma conditions, the intermediate-temperature equilibrium is physically unstable; i.e., any small perturbation about this equilibrium will cause a transition toward either the high-temperature or the low-temperature equilibrium.

  6. Metaphor, Multiplicative Meaning and the Semiotic Construction of Scientific Knowledge

    Science.gov (United States)

    Liu, Yu; Owyong, Yuet See Monica

    2011-01-01

    Scientific discourse is characterized by multi-semiotic construction and the resultant semantic expansions. To date, there remains a lack of analytical methods to explicate the multiplicative nature of meaning. Drawing on the theories of systemic functional linguistics, this article examines the meaning-making processes across language and…

  7. 3D-MICE: integration of cross-sectional and longitudinal imputation for multi-analyte longitudinal clinical data.

    Science.gov (United States)

    Luo, Yuan; Szolovits, Peter; Dighe, Anand S; Baron, Jason M

    2018-06-01

    A key challenge in clinical data mining is that most clinical datasets contain missing data. Since many commonly used machine learning algorithms require complete datasets (no missing data), clinical analytic approaches often entail an imputation procedure to "fill in" missing data. However, although most clinical datasets contain a temporal component, most commonly used imputation methods do not adequately accommodate longitudinal time-based data. We sought to develop a new imputation algorithm, 3-dimensional multiple imputation with chained equations (3D-MICE), that can perform accurate imputation of missing clinical time series data. We extracted clinical laboratory test results for 13 commonly measured analytes (clinical laboratory tests). We imputed missing test results for the 13 analytes using 3 imputation methods: multiple imputation with chained equations (MICE), Gaussian process (GP), and 3D-MICE. 3D-MICE utilizes both MICE and GP imputation to integrate cross-sectional and longitudinal information. To evaluate imputation method performance, we randomly masked selected test results and imputed these masked results alongside results missing from our original data. We compared predicted results to measured results for masked data points. 3D-MICE performed significantly better than MICE and GP-based imputation in a composite of all 13 analytes, predicting missing results with a normalized root-mean-square error of 0.342, compared to 0.373 for MICE alone and 0.358 for GP alone. 3D-MICE offers a novel and practical approach to imputing clinical laboratory time series data. 3D-MICE may provide an additional tool for use as a foundation in clinical predictive analytics and intelligent clinical decision support.
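
    The sketch below is not the authors' 3D-MICE algorithm; it only illustrates the underlying idea with scikit-learn components: impute a patient's time-by-analyte matrix once cross-sectionally (IterativeImputer, a MICE-style imputer) and once longitudinally (a Gaussian process per analyte over time), then blend the two estimates. It assumes each analyte is observed at least once.

    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    def impute_panel(values, times):
        """values: (n_times, n_analytes) float array with NaNs; times: (n_times,) measurement times."""
        cross = IterativeImputer(random_state=0).fit_transform(values)      # cross-sectional, MICE-like
        longi = values.copy()
        for j in range(values.shape[1]):                                    # longitudinal: one GP per analyte
            obs = ~np.isnan(values[:, j])
            missing = ~obs
            if not missing.any():
                continue
            if obs.sum() >= 2:
                gp = GaussianProcessRegressor(kernel=RBF(10.0) + WhiteKernel(0.1), normalize_y=True)
                gp.fit(times[obs, None], values[obs, j])
                longi[missing, j] = gp.predict(times[missing, None])
            else:
                longi[missing, j] = cross[missing, j]
        out = values.copy()
        mask = np.isnan(values)
        out[mask] = 0.5 * (cross[mask] + longi[mask])                       # simple blend of the two estimates
        return out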

  8. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management.

    Science.gov (United States)

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-12-15

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to inference analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.

  9. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management

    Science.gov (United States)

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-01-01

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM3 ontology to capture ERM objectives and to inference analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713

  10. Quality Measures in Pre-Analytical Phase of Tissue Processing: Understanding Its Value in Histopathology.

    Science.gov (United States)

    Rao, Shalinee; Masilamani, Suresh; Sundaram, Sandhya; Duvuru, Prathiba; Swaminathan, Rajendiran

    2016-01-01

    Quality monitoring in histopathology unit is categorized into three phases, pre-analytical, analytical and post-analytical, to cover various steps in the entire test cycle. Review of literature on quality evaluation studies pertaining to histopathology revealed that earlier reports were mainly focused on analytical aspects with limited studies on assessment of pre-analytical phase. Pre-analytical phase encompasses several processing steps and handling of specimen/sample by multiple individuals, thus allowing enough scope for errors. Due to its critical nature and limited studies in the past to assess quality in pre-analytical phase, it deserves more attention. This study was undertaken to analyse and assess the quality parameters in pre-analytical phase in a histopathology laboratory. This was a retrospective study done on pre-analytical parameters in histopathology laboratory of a tertiary care centre on 18,626 tissue specimens received in 34 months. Registers and records were checked for efficiency and errors for pre-analytical quality variables: specimen identification, specimen in appropriate fixatives, lost specimens, daily internal quality control performance on staining, performance in inter-laboratory quality assessment program {External quality assurance program (EQAS)} and evaluation of internal non-conformities (NC) for other errors. The study revealed incorrect specimen labelling in 0.04%, 0.01% and 0.01% in 2007, 2008 and 2009 respectively. About 0.04%, 0.07% and 0.18% specimens were not sent in fixatives in 2007, 2008 and 2009 respectively. There was no incidence of specimen lost. A total of 113 non-conformities were identified out of which 92.9% belonged to the pre-analytical phase. The predominant NC (any deviation from normal standard which may generate an error and result in compromising with quality standards) identified was wrong labelling of slides. Performance in EQAS for pre-analytical phase was satisfactory in 6 of 9 cycles. A low incidence

  11. Croatian Analytical Terminology

    Directory of Open Access Journals (Sweden)

    Kastelan-Macan; M.

    2008-04-01

    Full Text Available Results of analytical research are necessary in all human activities. They are inevitable in making decisions in environmental chemistry, agriculture, forestry, veterinary medicine, the pharmaceutical industry, and biochemistry. Without analytical measurements the quality of materials and products cannot be assessed, so that analytical chemistry is an essential part of technical sciences and disciplines. The language of Croatian science, and analytical chemistry within it, was one of the goals of our predecessors. Due to the political situation, they did not succeed entirely, but for the scientists in independent Croatia this is a duty, because language is one of the most important features of the Croatian identity. The awareness of the need to introduce Croatian terminology was systematically developed in the second half of the 19th century, along with the founding of scientific societies and the wish of scientists to write their scientific works in Croatian, so that the results of their research may be applied in the economy. Many authors of textbooks from the 19th and the first half of the 20th century contributed to Croatian analytical terminology (F. Rački, B. Šulek, P. Žulić, G. Pexidr, J. Domac, G. Janeček, F. Bubanović, V. Njegovan and others). M. Deželić published the first systematic chemical terminology in 1940, adjusted to the IUPAC recommendations. In the second half of the 20th century, textbooks in classic analytical chemistry were written by V. Marjanović-Krajovan, M. Gyiketta-Ogrizek, S. Žilić and others. I. Filipović wrote the General and Inorganic Chemistry textbook and the Laboratory Handbook (in collaboration with P. Sabioncello) and contributed greatly to establishing the terminology in instrumental analytical methods. The sources of Croatian nomenclature in modern analytical chemistry today are translated textbooks by Skoog, West and Holler, as well as by Günzler and Gremlich, and original textbooks by S. Turina, Z.

  12. Computer simulation of FT-NMR multiple pulse experiment

    Science.gov (United States)

    Allouche, A.; Pouzard, G.

    1989-04-01

    Using the product operator formalism in its real form, SIMULDENS expands the density matrix of a scalar-coupled nuclear spin system and simulates analytically a large variety of FT-NMR multiple pulse experiments. The observable transverse magnetizations are stored and can be combined to represent signal accumulation. The programming language is VAX PASCAL, but a Macintosh Turbo Pascal version is also available.
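
    SIMULDENS itself handles scalar-coupled spin systems in the product operator formalism; the far simpler density-matrix sketch below only illustrates the basic machinery for a single uncoupled spin-1/2 (90° pulse, free precession, Fourier transform of the recorded transverse magnetization).

    import numpy as np
    from scipy.linalg import expm

    Ix = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
    Iy = 0.5 * np.array([[0, -1j], [1j, 0]])
    Iz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)

    rho = Iz.copy()                                    # thermal equilibrium (up to constants)
    pulse = expm(-1j * (np.pi / 2) * Ix)               # 90-degree pulse about x
    rho = pulse @ rho @ pulse.conj().T

    omega, dt, n = 2 * np.pi * 100.0, 1e-4, 512        # 100 Hz offset, dwell time, number of points
    U = expm(-1j * omega * Iz * dt)                    # free-precession propagator per dwell time
    fid = np.empty(n, dtype=complex)
    for k in range(n):
        fid[k] = np.trace(rho @ (Ix + 1j * Iy))        # complex transverse magnetization
        rho = U @ rho @ U.conj().T

    spectrum = np.fft.fftshift(np.fft.fft(fid))        # a peak appears at the 100 Hz offset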

  13. Improving the signal visibility of optical-disk-drive sensors by analyte patterning and frequency-domain analysis

    International Nuclear Information System (INIS)

    Schaefer, S; Chau, K J

    2011-01-01

    One limitation of using compact disks (CDs) and optical disk drives for sensing and imaging of analytes placed on a CD is the fluctuations in the voltage signal from the disk drive generated while reading the data on the CD. In this study, we develop a simple, low-cost strategy for sensing and identification using CDs and optical disk drives that spectrally separates contributions to the voltage signal caused by an analyte intentionally placed onto the CD and that caused by the underlying data on the CD. Analytes are printed onto a CD surface with fixed spatial periodicity. As the laser beam in an optical disk drive scans over the section of the CD containing the analyte pattern, the intensity of the laser beam incident onto the photodiode integrated into the disk drive is modulated at a frequency dependent on the spatial periodicity of the analyte pattern and the speed of the optical-disk-drive motor. Fourier transformation of the voltage signal from the optical disk drive yields peaks in the frequency spectrum with amplitudes and locations that enable analyte sensing and identification, respectively. We study the influence of analyte area coverage, pattern periodicity, and CD rotational frequency on the peaks in the frequency spectrum associated with the patterned analyte. We apply this technique to discriminate differently-colored analytes, perform trigger-free detection of multiple analytes distributed on a single CD, and detect at least two different, overlapped analyte patterns on a single CD. The extension of this technique for sensing and identification of colorimetric chemical reagents is discussed
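
    A minimal numerical sketch of the readout principle: an analyte pattern repeated N times per revolution modulates the photodiode voltage at N times the disk rotation frequency, so an FFT separates it from broadband data-readout fluctuations; all numbers are invented.

    import numpy as np

    fs, f_rot, n_pattern, t_total = 50_000, 20.0, 8, 2.0      # sample rate (Hz), rev/s, spots per track, seconds
    t = np.arange(0, t_total, 1 / fs)
    analyte = 0.05 * np.sin(2 * np.pi * n_pattern * f_rot * t)            # periodic absorption by the pattern
    data_noise = 0.5 * np.random.default_rng(0).standard_normal(t.size)   # fluctuations from reading the CD data
    v = 1.0 + analyte + data_noise

    spec = np.abs(np.fft.rfft(v)) / t.size
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    peak = freqs[np.argmax(spec[1:]) + 1]                      # ignore the DC bin
    print(f"strongest non-DC component at {peak:.1f} Hz (expected {n_pattern * f_rot:.1f} Hz)")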

  14. Interstitial integrals in the multiple-scattering model

    International Nuclear Information System (INIS)

    Swanson, J.R.; Dill, D.

    1982-01-01

    We present an efficient method for the evaluation of integrals involving multiple-scattering wave functions over the interstitial region. Transformation of the multicenter interstitial wave functions to a single center representation followed by a geometric projection reduces the integrals to products of analytic angular integrals and numerical radial integrals. The projection function, which has the value 1 in the interstitial region and 0 elsewhere, has a closed-form partial-wave expansion. The method is tested by comparing its results with exact normalization and dipole integrals; the differences are 2% at worst and typically less than 1%. By providing an efficient means of calculating Coulomb integrals, the method allows treatment of electron correlations using a multiple scattering basis set

  15. Global Connections: Multiple Modernities and Postsecular Societies

    DEFF Research Database (Denmark)

    Thomassen, Bjørn

    2013-01-01

    For some time now, the concept of multiple modernities has been a key paradigm in the social and political sciences, not least via the work of Shmuel Eisenstadt. More recently, the notion of ‘postsecularity’ has likewise gained terrain, championed by a whole series of flagship figures, including...... of course Jürgen Habermas. This edited volume brings together these two crucial debates. It does so by, first, identifying and engaging with a series of analytical dimensions pertaining to the post-secularity/modernity nexus, programmatically outlined in the introduction by the two editors, Massimo Rosati...... and Kristina Stoeckl, and, second, by following through with the multiple modernities/post-secularity discussion in the country-based case studies that follow....

  16. MASTERS OF ANALYTICAL TRADECRAFT: CERTIFYING THE STANDARDS AND ANALYTIC RIGOR OF INTELLIGENCE PRODUCTS

    Science.gov (United States)

    2016-04-01

    AU/ACSC/2016 AIR COMMAND AND STAFF COLLEGE AIR UNIVERSITY MASTERS OF ANALYTICAL TRADECRAFT: CERTIFYING THE STANDARDS AND ANALYTIC RIGOR OF...establishing unit level certified Masters of Analytic Tradecraft (MAT) analysts to be trained and entrusted to evaluate and rate the standards and...cues) ideally should meet or exceed effective rigor (based on analytical process).4 To accomplish this, decision makers should not be left to their

  17. The perceptions of the meaning and value of analytics in New Zealand higher education institutions

    Directory of Open Access Journals (Sweden)

    Hamidreza Mahroeian

    2017-10-01

    Full Text Available This article presents the current perceptions on the value of analytics and their possible contribution to the higher education sector in New Zealand. Seven out of eight research-intensive public universities in New Zealand took part in the study. Participants included senior management and those who have some role associated with decision-making within higher education (N = 82). The study found inconsistent understanding of the meaning of analytics across participants. In particular, three forms of perceptions of analytics were identified: structural, functional, and structural-functional. It was evident that some participants viewed analytics in terms of its structural elements such as statistics, metrics, trends, numbers, graphs, and any relevant information/data to enhance decision-making, whereas other participants perceived the notion of analytics in terms of its functional aspect: as a means to an end, a process of using the data to gain insights and take action on complex problems; yet a third group viewed analytics from both structural and functional perspectives. These kinds of perceptions have to a large extent influenced participants’ views on the value of analytics in shaping policy and practice. Also, the literature has addressed a number of possible challenges associated with the large-scale institutional implementation of analytics. These challenges were: difficulties in extracting data from multiple databases, maintaining data quality, ethical and privacy issues, and lack of professional development opportunities. This article aims to broadly contribute to a better understanding of the current perception and value of analytics in higher education, and in particular within the New Zealand context.

  18. Aptamer/quantum dot-based simultaneous electrochemical detection of multiple small molecules

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Haixia [Key Laboratory on Luminescence and Real-Time Analysis, Ministry of Education, School of Chemistry and Chemical Engineering, Southwest University, Chongqing 400715 (China); Jiang Bingying [School of Chemistry and Chemical Engineering, Chongqing University of Technology, Chongqing 400040 (China); Xiang Yun, E-mail: yunatswu@swu.edu.cn [Key Laboratory on Luminescence and Real-Time Analysis, Ministry of Education, School of Chemistry and Chemical Engineering, Southwest University, Chongqing 400715 (China); Zhang Yuyong; Chai Yaqin [Key Laboratory on Luminescence and Real-Time Analysis, Ministry of Education, School of Chemistry and Chemical Engineering, Southwest University, Chongqing 400715 (China); Yuan Ruo, E-mail: yuanruo@swu.edu.cn [Key Laboratory on Luminescence and Real-Time Analysis, Ministry of Education, School of Chemistry and Chemical Engineering, Southwest University, Chongqing 400715 (China)

    2011-03-04

    A novel strategy for 'signal on' and sensitive one-spot simultaneous detection of multiple small molecular analytes based on electrochemically encoded barcode quantum dot (QD) tags is described. The target analytes, adenosine triphosphate (ATP) and cocaine, respectively, are sandwiched between the corresponding set of surface-immobilized primary binding aptamers and the secondary binding aptamer/QD bioconjugates. The captured QDs yield distinct electrochemical signatures after acid dissolution, whose position and size reflect the identity and level, respectively, of the corresponding target analytes. Due to the inherent amplification feature of the QD labels and the 'signal on' detection scheme, as well as the sensitive monitoring of the metal ions released upon acid dissolution of the QD labels, low detection limits of 30 nM and 50 nM were obtained for ATP and cocaine, respectively, in our assays. Our multi-analyte sensing system also shows high specificity to target analytes and promising applicability to complex sample matrix, which makes the proposed assay protocol an attractive route for screening of small molecules in clinical diagnosis.

  19. On application of analytical transformation system using a computer for Feynman integral calculation

    International Nuclear Information System (INIS)

    Gerdt, V.P.

    1978-01-01

    Various systems of analytic transformations for the calculation of Feynman integrals using computers are discussed. The hyperspheric technique, which is used to calculate Feynman integrals, makes it possible to perform the angular integration for a set of diagrams, thus reducing the multiplicity of the integral. All calculations based on this method are made with the ASHMEDAL program. Feynman integrals are calculated in Euclidean space using integration by parts and some differential identities. Analytic calculation of Feynman integrals is performed by the MACSYMA system. A dispersion method of integral calculation is implemented in the SCHOONSCHIP system, and calculations based on features of the Nielsen functions are made using the efficient SINAC and RSIN programs. A table of basic Feynman integral parameters calculated using the above techniques is given

  20. A Novel Analytic Technique for the Service Station Reliability in a Discrete-Time Repairable Queue

    Directory of Open Access Journals (Sweden)

    Renbin Liu

    2013-01-01

    Full Text Available This paper presents a decomposition technique for the service station reliability in a discrete-time repairable GeomX/G/1 queueing system, in which the server takes exhaustive service and multiple adaptive delayed vacation discipline. Using such a novel analytic technique, some important reliability indices and reliability relation equations of the service station are derived. Furthermore, the structures of the service station indices are also found. Finally, special cases and numerical examples validate the derived results and show that our analytic technique is applicable to reliability analysis of some complex discrete-time repairable bulk arrival queueing systems.

  1. NATO Advanced Study Institute on Dynamics of Complex Interconnected Biosensor Systems: Networks and Bioprocesses, Geilo, Norway, 11-21 Apr 2005

    CERN Document Server

    Skjeltorp, Arne T

    2006-01-01

    The book reviews the synergism between various fields of research that are confronted with networks, such as genetic and metabolic networks, social networks, the Internet and ecological systems. In many cases, the interacting networks manifest so-called emergent properties that are not possessed by any of the individual components. This means that the detailed knowledge of the components is insufficient to describe the whole system. Recent work has indicated that networks in nature have so-called scale-free characteristics, and the associated dynamic network modelling shows unexpected results such as an amazing robustness against accidental failures. Modelling the signal transduction networks in bioprocesses as in living cells is a challenging interdisciplinary research area. It is now realized that the many features of molecular interaction networks within a cell are shared to a large degree by the other complex systems mentioned above, such as the Internet, computer chips and society. Thus knowledge gained ...

  2. Sourcing and bioprocessing of brown seaweed for maximizing glucose release

    DEFF Research Database (Denmark)

    Manns, Dirk Martin

    maximum levels of glucose. The first requirement was to develop a robust methodology, including acid hydrolysis and analytical composition analysis, to quantitatively estimate the carbohydrate composition of the brown seaweeds. The monosaccharide composition of four different samples of brown seaweeds...... with lower enzyme loading. Simple application of only the cellulase preparation enabled the release of only half of the present glucose after 8 h. Analysis after the enzymatic treatment indicated a potential extraction of proteins from the solid residue and the sulfated polysaccharide fucoidan solubilized...

  3. Suppression of Growth by Multiplicative White Noise in a Parametric Resonant System

    Science.gov (United States)

    Ishihara, Masamichi

    2015-02-01

    The growth of the amplitude in a Mathieu-like equation with multiplicative white noise is studied. To obtain an approximate analytical expression for the exponent at the extremum on parametric resonance regions, a time-interval width is introduced. To determine the exponents numerically, the stochastic differential equations are solved by a symplectic numerical method. The Mathieu-like equation contains a parameter α determined by the intensity of noise and the strength of the coupling between the variable and noise; without loss of generality, only non-negative α need be considered. The exponent is shown to decrease with α, reach a minimum, and increase after that. The minimum exponent is obtained analytically and numerically. As a function of α, the minimum occurs at α ≠ 0 within the parametric resonance regions of the noise-free (α = 0) case. This minimum indicates suppression of growth by multiplicative white noise.
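
    The abstract gives neither the exact form of the equation nor the numerical scheme, so the following sketch only illustrates the general setup it describes: a Mathieu oscillator whose parametric term is perturbed by multiplicative white noise, integrated here with a plain Euler-Maruyama step (not the symplectic method used in the paper), with the growth exponent estimated from the log of the amplitude. All parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def growth_exponent(alpha, omega0=1.0, eps=0.3, Omega=2.0, dt=1e-3, t_end=200.0):
    """Crude growth-rate estimate for
        x'' + (omega0**2 + eps*cos(Omega*t) + alpha*xi(t)) * x = 0,
    with xi(t) Gaussian white noise, using an Euler-Maruyama step and periodic
    renormalisation of the state (standard Lyapunov-exponent trick) to avoid overflow."""
    n_steps = int(t_end / dt)
    x, v = 1.0, 0.0
    log_growth = 0.0
    for k in range(n_steps):
        t = k * dt
        dW = np.sqrt(dt) * rng.standard_normal()
        dv = -(omega0**2 + eps * np.cos(Omega * t)) * x * dt - alpha * x * dW
        x, v = x + v * dt, v + dv
        if k % 1000 == 999:                       # renormalise every 1000 steps
            norm = np.hypot(x, v)
            log_growth += np.log(norm)
            x, v = x / norm, v / norm
    return log_growth / t_end

for alpha in (0.0, 0.2, 0.5, 1.0):
    print(f"alpha = {alpha:.1f}  estimated growth exponent = {growth_exponent(alpha):+.3f}")
```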

  4. A European multicenter study on the analytical performance of the VERIS HBV assay.

    Science.gov (United States)

    Braun, Patrick; Delgado, Rafael; Drago, Monica; Fanti, Diana; Fleury, Hervé; Izopet, Jacques; Lombardi, Alessandra; Mancon, Alessandro; Marcos, Maria Angeles; Sauné, Karine; O Shea, Siobhan; Pérez-Rivilla, Alfredo; Ramble, John; Trimoulet, Pascale; Vila, Jordi; Whittaker, Duncan; Artus, Alain; Rhodes, Daniel

    Hepatitis B viral load monitoring is an essential part of managing patients with chronic hepatitis B infection. Beckman Coulter has developed the VERIS HBV Assay for use on the fully automated Beckman Coulter DxN VERIS Molecular Diagnostics System. OBJECTIVES: To evaluate the analytical performance of the VERIS HBV Assay at multiple European virology laboratories. Precision, analytical sensitivity, negative sample performance, linearity and performance with major HBV genotypes/subtypes for the VERIS HBV Assay were evaluated. Precision showed an SD of 0.15 log10 IU/mL or less for each level tested. Analytical sensitivity determined by probit analysis was between 6.8 and 8.0 IU/mL. Clinical specificity on 90 unique patient samples was 100.0%. Performance with 754 negative samples demonstrated 100.0% not detected results, and a carryover study showed no cross contamination. Linearity using clinical samples was shown from 1.23-8.23 log10 IU/mL and the assay detected and showed linearity with major HBV genotypes/subtypes. The VERIS HBV Assay demonstrated comparable analytical performance to other currently marketed assays for HBV DNA monitoring. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. ClimateSpark: An in-memory distributed computing framework for big climate data analytics

    Science.gov (United States)

    Hu, Fei; Yang, Chaowei; Schnase, John L.; Duffy, Daniel Q.; Xu, Mengchao; Bowen, Michael K.; Lee, Tsengdar; Song, Weiwei

    2018-06-01

    The unprecedented growth of climate data creates new opportunities for climate studies, and yet big climate data pose a grand challenge to climatologists to efficiently manage and analyze big data. The complexity of climate data content and analytical algorithms increases the difficulty of implementing algorithms on high performance computing systems. This paper proposes an in-memory, distributed computing framework, ClimateSpark, to facilitate complex big data analytics and time-consuming computational tasks. Chunking data structure improves parallel I/O efficiency, while a spatiotemporal index is built for the chunks to avoid unnecessary data reading and preprocessing. An integrated, multi-dimensional, array-based data model (ClimateRDD) and ETL operations are developed to address big climate data variety by integrating the processing components of the climate data lifecycle. ClimateSpark utilizes Spark SQL and Apache Zeppelin to develop a web portal to facilitate the interaction among climatologists, climate data, analytic operations and computing resources (e.g., using SQL query and Scala/Python notebook). Experimental results show that ClimateSpark conducts different spatiotemporal data queries/analytics with high efficiency and data locality. ClimateSpark is easily adaptable to other big multiple-dimensional, array-based datasets in various geoscience domains.
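
    ClimateSpark itself couples Spark SQL with an HBase-backed, array-oriented data model (ClimateRDD), which cannot be reproduced here; the minimal sketch below only illustrates the Spark SQL style of spatiotemporal query the abstract refers to, run against a tiny in-memory toy table with fabricated values. It assumes a local pyspark installation.

```python
from pyspark.sql import SparkSession

# Tiny, self-contained illustration of a Spark SQL spatiotemporal aggregation;
# it uses an in-memory toy table, not ClimateSpark's HBase-backed ClimateRDD
# or any real reanalysis data.
spark = SparkSession.builder.appName("climate-sql-sketch").getOrCreate()

rows = [
    # (year, month, lat, lon, temperature in kelvin) -- fabricated values
    (2000, 1, 40.0, -105.0, 271.3),
    (2000, 1, 40.0, -104.0, 272.1),
    (2000, 7, 40.0, -105.0, 295.4),
    (2001, 7, 40.0, -105.0, 296.0),
]
df = spark.createDataFrame(rows, ["year", "month", "lat", "lon", "t_kelvin"])
df.createOrReplaceTempView("reanalysis")

# Area-averaged July temperature per year over a small bounding box.
query = """
    SELECT year, AVG(t_kelvin) AS mean_t
    FROM reanalysis
    WHERE month = 7 AND lat BETWEEN 35 AND 45 AND lon BETWEEN -110 AND -100
    GROUP BY year
    ORDER BY year
"""
spark.sql(query).show()
spark.stop()
```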

  6. Anisotropic Multishell Analytical Modeling of an Intervertebral Disk Subjected to Axial Compression.

    Science.gov (United States)

    Demers, Sébastien; Nadeau, Sylvie; Bouzid, Abdel-Hakim

    2016-04-01

    Studies on intervertebral disk (IVD) response to various loads and postures are essential to understand disk's mechanical functions and to suggest preventive and corrective actions in the workplace. The experimental and finite-element (FE) approaches are well-suited for these studies, but validating their findings is difficult, partly due to the lack of alternative methods. Analytical modeling could allow methodological triangulation and help validation of FE models. This paper presents an analytical method based on thin-shell, beam-on-elastic-foundation and composite materials theories to evaluate the stresses in the anulus fibrosus (AF) of an axisymmetric disk composed of multiple thin lamellae. Large deformations of the soft tissues are accounted for using an iterative method and the anisotropic material properties are derived from a published biaxial experiment. The results are compared to those obtained by FE modeling. The results demonstrate the capability of the analytical model to evaluate the stresses at any location of the simplified AF. It also demonstrates that anisotropy reduces stresses in the lamellae. This novel model is a preliminary step in developing valuable analytical models of IVDs, and represents a distinctive groundwork that is able to sustain future refinements. This paper suggests important features that may be included to improve model realism.

  7. PLE in the analysis of plant compounds. Part II: One-cycle PLE in determining total amount of analyte in plant material.

    Science.gov (United States)

    Dawidowicz, Andrzej L; Wianowska, Dorota

    2005-04-29

    Pressurised liquid extraction (PLE) is recognised as one of the most effective sample preparation methods. Despite the enhanced extraction power of PLE, the full recovery of an analyte from plant material may require multiple extractions of the same sample. The presented investigations show the possibility of estimating the true concentration value of an analyte in plant material employing one-cycle PLE in which plant samples of different weight are used. The performed experiments show a linear dependence between the reciprocal value of the analyte amount (E*) extracted in single-step PLE from a plant matrix and the ratio of plant material mass to extractant volume (m_p/V_s). Hence, time-consuming multi-step PLE can be replaced by a few single-step PLEs performed at different m_p/V_s ratios. The concentrations of rutin in Sambucus nigra L. and caffeine in tea and coffee estimated by means of the tested procedure are almost the same as their concentrations estimated by multiple PLE.
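
    The reported linearity of 1/E* in m_p/V_s means that a handful of single-step runs at different ratios can be extrapolated to m_p/V_s → 0, and the intercept then corresponds to the fully recoverable analyte content. The sketch below fits that line to made-up numbers purely to show the arithmetic; it is not data from the paper.

```python
import numpy as np

# Hypothetical single-step PLE results: analyte amount extracted per gram of plant
# material (E*, mg/g) at several plant-mass-to-solvent-volume ratios (m_p/V_s, g/mL).
ratio = np.array([0.01, 0.02, 0.04, 0.08, 0.16])      # m_p / V_s
e_star = np.array([4.90, 4.78, 4.57, 4.20, 3.60])      # E*, mg/g (made-up values)

# The abstract reports 1/E* linear in m_p/V_s; fit that line and extrapolate to zero ratio.
slope, intercept = np.polyfit(ratio, 1.0 / e_star, 1)
e_total = 1.0 / intercept

print(f"fitted line: 1/E* = {intercept:.4f} + {slope:.4f} * (m_p/V_s)")
print(f"extrapolated analyte content at m_p/V_s -> 0: {e_total:.2f} mg/g")
```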

  8. Business analytics a practitioner's guide

    CERN Document Server

    Saxena, Rahul

    2013-01-01

    This book provides a guide to businesses on how to use analytics to help drive from ideas to execution. Analytics used in this way provides "full lifecycle support" for business and helps during all stages of management decision-making and execution.The framework presented in the book enables the effective interplay of business, analytics, and information technology (business intelligence) both to leverage analytics for competitive advantage and to embed the use of business analytics into the business culture. It lays out an approach for analytics, describes the processes used, and provides gu

  9. eAnalytics: Dynamic Web-based Analytics for the Energy Industry

    Directory of Open Access Journals (Sweden)

    Paul Govan

    2016-11-01

    Full Text Available eAnalytics is a web application built on top of R that provides dynamic data analytics to energy industry stakeholders. The application allows users to dynamically manipulate chart data and style through the Shiny package’s reactive framework. eAnalytics currently supports a number of features including interactive datatables, dynamic charting capabilities, and the ability to save, download, or export information for further use. Going forward, the goal for this project is that it will serve as a research hub for discovering new relationships in the data. The application is illustrated with a simple tutorial of the user interface design.

  10. Computer simulation of FT-NMR multiple pulse experiment

    International Nuclear Information System (INIS)

    Allouche, A.; Pouzard, G.

    1989-01-01

    Using the product operator formalism in its real form, SIMULDENS expands the density matrix of a scalar coupled nuclear spin system and simulates analytically a large variety of FT-NMR multiple pulse experiments. The observable transverse magnetizations are stored and can be combined to represent signal accumulation. The programming language is VAX PASCAL, but a Macintosh Turbo Pascal version is also available. (orig.)

  11. Extending Climate Analytics as a Service to the Earth System Grid Federation Progress Report on the Reanalysis Ensemble Service

    Science.gov (United States)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.

    2016-12-01

    We are extending climate analytics-as-a-service, including: (1) A high-performance Virtual Real-Time Analytics Testbed supporting six major reanalysis data sets using advanced technologies like the Cloudera Impala-based SQL and Hadoop-based MapReduce analytics over native NetCDF files. (2) A Reanalysis Ensemble Service (RES) that offers a basic set of commonly used operations over the reanalysis collections that are accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib. (3) An Open Geospatial Consortium (OGC) WPS-compliant Web service interface to CDSlib to accommodate ESGF's Web service endpoints. This presentation will report on the overall progress of this effort, with special attention to recent enhancements that have been made to the Reanalysis Ensemble Service, including the following:
    - A CDSlib Python library that supports full temporal, spatial, and grid-based resolution services
    - A new reanalysis collections reference model to enable operator design and implementation
    - An enhanced library of sample queries to demonstrate and develop use case scenarios
    - Extended operators that enable single- and multiple-reanalysis area average, vertical average, re-gridding, and trend, climatology, and anomaly computations
    - Full support for the MERRA-2 reanalysis and the initial integration of two additional reanalyses
    - A prototype Jupyter notebook-based distribution mechanism that combines CDSlib documentation with interactive use case scenarios and personalized project management
    - Prototyped uncertainty quantification services that combine ensemble products with comparative observational products
    - Convenient, one-stop shopping for commonly used data products from multiple reanalyses, including basic subsetting and arithmetic operations over the data and extractions of trends, climatologies, and anomalies
    - The ability to compute and visualize multiple reanalysis intercomparisons

  12. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management

    Directory of Open Access Journals (Sweden)

    Yan Li

    2016-12-01

    Full Text Available With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM3 ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.

  13. A review of simple multiple criteria decision making analytic procedures which are implementable on spreadsheet packages

    Directory of Open Access Journals (Sweden)

    T.J. Stewart

    2003-12-01

    Full Text Available A number of modern multi-criteria decision making aids for the discrete choice problem are reviewed, with particular emphasis on those which can be implemented on standard commercial spreadsheet packages. Three broad classes of procedures are discussed, namely the analytic hierarchy process, reference point methods, and outranking methods. The broad principles are summarised in a consistent framework, and on a spreadsheet. LOTUS spreadsheets implementing these are available from the author.
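
    Of the three classes mentioned, the reference point idea is perhaps the simplest to reproduce outside a spreadsheet: alternatives are scored by their weighted distance to an ideal (reference) point over normalised criteria. The sketch below is a generic Python illustration of that principle with invented criteria, weights and scores; it is not a reproduction of the LOTUS templates described in the review.

```python
import numpy as np

# Rows: alternatives, columns: criteria (all to be maximised in this toy example).
scores = np.array([
    [250.0, 0.80, 3.0],   # alternative A
    [310.0, 0.60, 4.0],   # alternative B
    [280.0, 0.75, 2.0],   # alternative C
])
weights = np.array([0.5, 0.3, 0.2])   # assumed criterion weights, sum to 1

# Normalise each criterion to [0, 1] and measure the weighted distance to the
# ideal (reference) point, i.e. the best observed value on every criterion.
lo, hi = scores.min(axis=0), scores.max(axis=0)
norm = (scores - lo) / (hi - lo)
ideal = norm.max(axis=0)                       # reference point = all ones here
distance = np.sqrt(((weights * (ideal - norm)) ** 2).sum(axis=1))

for name, d in zip("ABC", distance):
    print(f"alternative {name}: weighted distance to reference point = {d:.3f}")
print("recommended:", "ABC"[int(np.argmin(distance))])
```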

  14. A novel analytical description of periodic volume coil geometries in MRI

    Science.gov (United States)

    Koh, D.; Felder, J.; Shah, N. J.

    2018-03-01

    MRI volume coils can be represented by equivalent lumped element circuits and for a variety of these circuit configurations analytical design equations have been presented. The unification of several volume coil topologies results in a two-dimensional gridded equivalent lumped element circuit which comprises the birdcage resonator and its multiple-endring derivatives, but also novel structures like the capacitive coupled ring resonator. The theory section analyzes a general two-dimensional circuit by noting that its current distribution can be decomposed into a longitudinal and an azimuthal dependency. This can be exploited to compare the current distribution with a transfer function of filter circuits along one direction. The resonances of the transfer function coincide with the resonances of the volume resonator and the simple analytical solution can be used as a design equation. The proposed framework is verified experimentally against a novel capacitive coupled ring structure which was derived from the general circuit formulation and is proven to exhibit a dominant homogeneous mode. In conclusion, a unified analytical framework is presented that allows determining the resonance frequency of any volume resonator that can be represented by a two-dimensional meshed equivalent circuit.

  15. AmO2 Analysis for Analytical Method Testing and Assessment: Analysis Support for AmO2 Production

    Energy Technology Data Exchange (ETDEWEB)

    Kuhn, Kevin John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bland, Galey Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Fulwyler, James Brent [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Garduno, Katherine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Keller, Russell C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Klundt, Dylan James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lujan, Elmer J. W [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mark, Zoe Francoise Elise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mathew, Kattathu Joseph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ortega, Laura Claire [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ottenfeld, Chelsea Faith [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Porterfield, Donivan R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rearick, Michael Sean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rim, Jung Ho [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Schake, Ann Rene [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Schappert, Michael Francis [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Stanley, Floyd E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Thomas, Mariam R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wylie, Ernest Miller II [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Xu, Ning [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Tandon, Lav [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-08

    Americium oxide samples will be measured for various analytes to support AmO2 production. The key analytes that are currently requested by the Am production customer at LANL include total Am content, Am isotopics, Pu assay, Pu isotopics, and trace element content including 237Np content. Multiple analytical methods will be utilized depending on the sensitivity, accuracy and precision needs of the Am matrix. Traceability to the National Institute of Standards and Technology (NIST) will be achieved, where applicable, by running NIST traceable quality control materials. This is necessary given that there are no suitable AmO2 reference materials currently available for the requested analytes. The primary objective is to demonstrate the suitability of actinide analytical chemistry methods to support AmO2 production operations.

  16. Log-Normal Distribution in a Growing System with Weighted and Multiplicatively Interacting Particles

    Science.gov (United States)

    Fujihara, Akihiro; Tanimoto, Satoshi; Yamamoto, Hiroshi; Ohtsuki, Toshiya

    2018-03-01

    A growing system with weighted and multiplicatively interacting particles is investigated. Each particle has a quantity that changes multiplicatively after a binary interaction, with its growth rate controlled by a weight parameter in a homogeneous symmetric kernel. We consider the system using moment inequalities and analytically derive the log-normal-type tail in the probability distribution function of quantities when the parameter is negative, which is different from the result for single-body multiplicative processes. We also find that the system approaches a winner-take-all state when the parameter is positive.

  17. Characterization of dilation-analytic operators

    Energy Technology Data Exchange (ETDEWEB)

    Balslev, E; Grossmann, A; Paul, T

    1986-01-01

    Dilation analytic vectors and operators are characterized in a new representation of quantum mechanical states through functions analytic on the upper half-plane. In this space H_0-bounded operators are integral operators and criteria for dilation analyticity are given in terms of analytic continuation outside of the half-plane for functions and for kernels. A sufficient condition is given for an integral operator in momentum space to be dilation-analytic.

  18. Biosimilars advancements: Moving on to the future.

    Science.gov (United States)

    Tsuruta, Lilian Rumi; Lopes dos Santos, Mariana; Moro, Ana Maria

    2015-01-01

    Many patents for the first biologicals derived from recombinant technology and, more recently, monoclonal antibodies (mAbs) are expiring. Naturally, biosimilars are becoming an increasingly important area of interest for the pharmaceutical industry worldwide, not only for emerging countries that need to import biologic products. This review shows the evolution of biosimilar development regarding regulatory aspects, manufacturing bioprocesses, comparability, and marketing. The regulatory landscape is evolving globally, whereas analytical structural and functional analyses provide the foundation of a biosimilar development program. The challenges to develop and demonstrate biosimilarity should overcome the inherent differences in the bioprocess manufacturing and physicochemical and biological characterization of a biosimilar compared to several lots of the reference product. The implementation of approaches, such as Quality by Design (QbD), will provide products with defined specifications in relation to quality, purity, safety, and efficacy that were not possible when the reference product was developed. In fact, the biosimilar industry's need to prove comparability to the reference product has increased knowledge about the product and the associated production process through the use of powerful analytical tools. The technological challenges of making copies of biologic products while meeting regulatory and market demands are expected to foster innovation in the direction of attaining more productive manufacturing processes. © 2015 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers.

  19. A real-time data acquisition and processing system for the analytical laboratory automation of a HTR spent fuel reprocessing facility

    International Nuclear Information System (INIS)

    Watzlawik, K.H.

    1979-12-01

    A real-time data acquisition and processing system for the analytical laboratory of an experimental HTR spent fuel reprocessing facility is presented. The on-line open-loop system combines in-line and off-line analytical measurement procedures including data acquisition and evaluation as well as analytical laboratory organisation under the control of a computer-supported laboratory automation system. In-line measurements are performed for density, volume and temperature in process tanks and registration of samples for off-line measurements. Off-line computer-coupled experiments are potentiometric titration, gas chromatography and X-ray fluorescence analysis. Organisational sections like sample registration, magazining, distribution and identification, multiple data assignment and especially calibrations of analytical devices are performed by the data processing system. (orig.) [de]

  20. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    Science.gov (United States)

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams can be used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, colored from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. Analysis of System-Wide Investment in the National Airspace System: A Portfolio Analytical Framework and an Example

    Science.gov (United States)

    Bhadra, Dipasis; Morser, Frederick R.

    2006-01-01

    In this paper, the authors review the FAA's current program investments and lay out a preliminary analytical framework to undertake projects that may address some of the noted deficiencies. By drawing upon the well-developed theories from corporate finance, an analytical framework is offered that can be used for choosing the FAA's investments taking into account risk, expected returns and inherent dependencies across NAS programs. The framework can be expanded to take in multiple assets and realistic parameter values in drawing an efficient risk-return frontier for the entire set of FAA investment programs.
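
    The portfolio analogy can be made concrete with a toy mean-variance calculation: each candidate program gets an assumed expected return, the programs co-vary, and candidate weightings trade expected return against portfolio risk. The sketch below samples random long-only portfolios over three hypothetical programs; every number in it is fabricated for illustration and has no relation to actual FAA investments.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed expected returns and covariance for three hypothetical NAS investment programs.
mu = np.array([0.06, 0.09, 0.12])
cov = np.array([
    [0.010, 0.002, 0.001],
    [0.002, 0.020, 0.006],
    [0.001, 0.006, 0.040],
])

# Sample random long-only portfolios and keep the (risk, return) pairs;
# the upper-left envelope of this cloud approximates the efficient frontier.
n = 5000
w = rng.dirichlet(np.ones(len(mu)), size=n)      # weights sum to 1, all >= 0
port_ret = w @ mu
port_risk = np.sqrt(np.einsum("ij,jk,ik->i", w, cov, w))

best = port_ret / port_risk                      # naive return-to-risk ratio
i = int(np.argmax(best))
print(f"best sampled portfolio: weights = {np.round(w[i], 2)}, "
      f"expected return = {port_ret[i]:.3f}, risk (std) = {port_risk[i]:.3f}")
```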

  2. Google analytics integrations

    CERN Document Server

    Waisberg, Daniel

    2015-01-01

    A roadmap for turning Google Analytics into a centralized marketing analysis platform With Google Analytics Integrations, expert author Daniel Waisberg shows you how to gain a more meaningful, complete view of customers that can drive growth opportunities. This in-depth guide shows not only how to use Google Analytics, but also how to turn this powerful data collection and analysis tool into a central marketing analysis platform for your company. Taking a hands-on approach, this resource explores the integration and analysis of a host of common data sources, including Google AdWords, AdSens

  3. Mediation Analysis with Multiple Mediators.

    Science.gov (United States)

    VanderWeele, T J; Vansteelandt, S

    2014-01-01

    Recent advances in the causal inference literature on mediation have extended traditional approaches to direct and indirect effects to settings that allow for interactions and non-linearities. In this paper, these approaches from causal inference are further extended to settings in which multiple mediators may be of interest. Two analytic approaches, one based on regression and one based on weighting, are proposed to estimate the effect mediated through multiple mediators and the effects through other pathways. The approaches proposed here accommodate exposure-mediator interactions and, to a certain extent, mediator-mediator interactions as well. The methods handle binary or continuous mediators and binary, continuous or count outcomes. When the mediators affect one another, the strategy of trying to assess direct and indirect effects one mediator at a time will in general fail; the approach given in this paper can still be used. A characterization is moreover given as to when the sum of the mediated effects for multiple mediators considered separately will be equal to the mediated effect of all of the mediators considered jointly. The approach proposed in this paper is robust to unmeasured common causes of two or more mediators.
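
    In the purely linear, no-interaction special case, the regression-based logic reduces to comparing a total-effect model with an outcome model that conditions on all mediators jointly, which is exactly the situation where one-mediator-at-a-time decompositions break down but the joint indirect effect is still well defined. The sketch below simulates such a case with two mediators that affect one another; it deliberately omits the interactions, weighting estimator and non-continuous outcomes that the paper also covers, and all data are simulated.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

# Simulated linear system: exposure A, two mediators M1, M2, outcome Y.
a = rng.standard_normal(n)
m1 = 0.5 * a + rng.standard_normal(n)
m2 = 0.3 * a + 0.4 * m1 + rng.standard_normal(n)      # mediators may affect one another
y = 0.2 * a + 0.6 * m1 + 0.8 * m2 + rng.standard_normal(n)

def ols(y_, regressors):
    """Least-squares coefficients with an intercept prepended."""
    X = np.column_stack([np.ones(len(y_)), *regressors])
    return np.linalg.lstsq(X, y_, rcond=None)[0]

# Outcome model Y ~ A + M1 + M2: coefficient on A is the direct effect.
direct = ols(y, [a, m1, m2])[1]

# Total effect: Y ~ A alone.
total = ols(y, [a])[1]

# Joint indirect effect through (M1, M2) in this linear setting = total - direct.
print(f"direct effect   ~ {direct:.3f}  (simulated truth 0.2)")
print(f"indirect effect ~ {total - direct:.3f}  "
      f"(truth 0.5*0.6 + (0.3 + 0.5*0.4)*0.8 = {0.5*0.6 + (0.3 + 0.5*0.4)*0.8:.3f})")
```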

  4. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    Directory of Open Access Journals (Sweden)

    Zhenlong Li

    Full Text Available Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework techniques are proposed by leveraging cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. And service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists.

  5. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    Science.gov (United States)

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework techniques are proposed by leveraging cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. And service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists.

  6. Enabling Big Geoscience Data Analytics with a Cloud-Based, MapReduce-Enabled and Service-Oriented Workflow Framework

    Science.gov (United States)

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework techniques are proposed by leveraging cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. And service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists. PMID:25742012

  7. On Multiple Appearances

    DEFF Research Database (Denmark)

    Bork Petersen, Franziska

    2012-01-01

    reduction and epoché to focus on how dancing bodies appear in a stage context. To test these tools’ ability to explore dancing bodies from a third-person perspective, I analyse the Danish choreographer Kitt Johnson’s solo performance Drift (2011) - focussing on her shifting physical appearance. While...... phenomenology helps me to describe the multiple and radically different guises that Johnson assumes in her piece, my analysis, ultimately, does not aim to distil a truer, more real being from her appearances as is often the case in phenomenological philosophy. I complement my analytical approach...... with the Deleuzian notion of becoming animal and suggest that Johnson stages what could, in Judith Butler’s terms, be called a critical contingency of bodily appearance....

  8. On the performance of dual-hop systems with multiple antennas: Effects of spatial correlation, keyhole, and co-channel interference

    KAUST Repository

    Yang, Liang

    2012-12-01

    In this paper, taking into account realistic propagation conditions, namely, spatial correlation, keyhole channels, and unequal-power co-channel interference, we investigate the performance of a wireless relay network where all the nodes are equipped with multiple antennas. Considering a channel state information assisted amplify-and-forward protocol, we present analytical expressions for the symbol error rate (SER) and outage probability. More specifically, we first derive the SER expressions of a relay system with orthogonal space-time block coding (OSTBC) over correlated/keyhole fading channels. We also analyze the outage probability of interference corrupted relay systems with maximal ratio combining (MRC) at the receiver as well as multiple-input multiple-output MRC (MIMO MRC). Numerical results are given to illustrate and verify the analytical results. © 2012 IEEE.
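
    Analytical SER and outage expressions of this kind are routinely cross-checked by Monte Carlo simulation. The sketch below does this only for the simplest baseline such analyses build on: a single-antenna, CSI-assisted amplify-and-forward dual hop over independent Rayleigh fading, with no spatial correlation, keyhole or interference (unlike the paper), using the standard end-to-end SNR γ1γ2/(γ1+γ2+1). The average SNRs and the outage threshold are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

avg_snr1_db, avg_snr2_db = 10.0, 12.0   # assumed average per-hop SNRs
threshold_db = 5.0                      # assumed outage threshold
omega1 = 10 ** (avg_snr1_db / 10)
omega2 = 10 ** (avg_snr2_db / 10)
gamma_th = 10 ** (threshold_db / 10)

# Rayleigh fading: instantaneous per-hop SNR is exponentially distributed.
g1 = rng.exponential(omega1, n)
g2 = rng.exponential(omega2, n)

# Standard end-to-end SNR of a CSI-assisted amplify-and-forward dual hop.
gamma_eq = g1 * g2 / (g1 + g2 + 1)
p_out = np.mean(gamma_eq < gamma_th)
print(f"simulated outage probability: {p_out:.4f}")

# Since gamma_eq <= min(g1, g2), the outage of the minimum is a closed-form lower bound.
p_lower = 1 - np.exp(-gamma_th / omega1) * np.exp(-gamma_th / omega2)
print(f"closed-form lower bound (outage of min(g1, g2)): {p_lower:.4f}")
```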

  9. On the performance of dual-hop systems with multiple antennas: Effects of spatial correlation, keyhole, and co-channel interference

    KAUST Repository

    Yang, Liang; Alouini, Mohamed-Slim; Qaraqe, Khalid A.; Liu, Weiping

    2012-01-01

    In this paper, taking into account realistic propagation conditions, namely, spatial correlation, keyhole channels, and unequal-power co-channel interference, we investigate the performance of a wireless relay network where all the nodes are equipped with multiple antennas. Considering a channel state information assisted amplify-and-forward protocol, we present analytical expressions for the symbol error rate (SER) and outage probability. More specifically, we first derive the SER expressions of a relay system with orthogonal space-time block coding (OSTBC) over correlated/keyhole fading channels. We also analyze the outage probability of interference corrupted relay systems with maximal ratio combining (MRC) at the receiver as well as multiple-input multiple-output MRC (MIMO MRC). Numerical results are given to illustrate and verify the analytical results. © 2012 IEEE.

  10. Cantilever piezoelectric energy harvester with multiple cavities

    International Nuclear Information System (INIS)

    S Srinivasulu Raju; M Umapathy; G Uma

    2015-01-01

    Energy harvesting employing piezoelectric materials in mechanical structures such as cantilever beams, plates, diaphragms, etc, has been an emerging area of research in recent years. The research in this area is also focused on structural tailoring to improve the harvested power from the energy harvesters. Towards this aim, this paper presents a method for improving the harvested power from a cantilever piezoelectric energy harvester by introducing multiple rectangular cavities. A generalized model for a piezoelectric energy harvester with multiple rectangular cavities at a single section and two sections is developed. A method is suggested to optimize the thickness of the cavities and the number of cavities required to generate a higher output voltage for a given cantilever beam structure. The performance of the optimized energy harvesters is evaluated analytically and through experimentation. The simulation and experimental results show that the performance of the energy harvester can be increased with multiple cavities compared to the harvester with a single cavity. (paper)

  11. Process analytical technology (PAT) in insect and mammalian cell culture processes: dielectric spectroscopy and focused beam reflectance measurement (FBRM).

    Science.gov (United States)

    Druzinec, Damir; Weiss, Katja; Elseberg, Christiane; Salzig, Denise; Kraume, Matthias; Pörtner, Ralf; Czermak, Peter

    2014-01-01

    Modern bioprocesses demand a careful definition of the critical process parameters (CPPs) already during the early stages of process development in order to ensure high-quality products and satisfactory yields. In this context, online monitoring tools can be applied to recognize unfavorable changes of CPPs during the production processes and to allow for early interventions in order to prevent losses of production batches due to quality issues. Process analytical technologies such as dielectric spectroscopy or focused beam reflectance measurement (FBRM) are possible online monitoring tools, which can be applied to monitor cell growth as well as morphological changes. Since dielectric spectroscopy only captures cells with intact cell membranes, even information about dead cells with ruptured or leaking cell membranes can be derived. The following chapter describes the application of dielectric spectroscopy to various virus-infected and non-infected cell lines, covering adherent as well as suspension cultures in common stirred tank reactors. The adherent mammalian cell lines Vero (African green monkey kidney cells) and hMSC-TERT (telomerase-immortalized human mesenchymal stem cells) are thereby cultured on microcarriers, which provide the required growth surface and allow the cultivation of these cells even in dynamic culture systems. In turn, the insect-derived cell lines S2 and Sf21 are used as examples of cells typically cultured in suspension. Moreover, the FBRM technology as a further monitoring tool for cell culture applications has been included in this chapter using the example of Drosophila S2 insect cells.

  12. SRL online Analytical Development

    International Nuclear Information System (INIS)

    Jenkins, C.W.

    1991-01-01

    The Savannah River Site is operated by the Westinghouse Savannah River Co. for the Department of Energy to produce special nuclear materials for defense. R&D support for site programs is provided by the Savannah River Laboratory, which I represent. The site is known primarily for its nuclear reactors, but actually three fourths of the efforts at the site are devoted to fuel/target fabrication, fuel/target reprocessing, and waste management. All of these operations rely heavily on chemical processes. The site is therefore a large chemical plant. There are then many potential applications for process analytical chemistry at SRS. The Savannah River Laboratory (SRL) has an Analytical Development Section of roughly 65 personnel that perform analyses for R&D efforts at the lab, act as backup to the site Analytical Laboratories Department and develop analytical methods and instruments. I manage a subgroup of the Analytical Development Section called the Process Control & Analyzer Development Group. The prime mission of this group is to develop online/at-line analytical systems for site applications.

  13. Analytical mechanics

    CERN Document Server

    Lemos, Nivaldo A

    2018-01-01

    Analytical mechanics is the foundation of many areas of theoretical physics including quantum theory and statistical mechanics, and has wide-ranging applications in engineering and celestial mechanics. This introduction to the basic principles and methods of analytical mechanics covers Lagrangian and Hamiltonian dynamics, rigid bodies, small oscillations, canonical transformations and Hamilton–Jacobi theory. This fully up-to-date textbook includes detailed mathematical appendices and addresses a number of advanced topics, some of them of a geometric or topological character. These include Bertrand's theorem, proof that action is least, spontaneous symmetry breakdown, constrained Hamiltonian systems, non-integrability criteria, KAM theory, classical field theory, Lyapunov functions, geometric phases and Poisson manifolds. Providing worked examples, end-of-chapter problems, and discussion of ongoing research in the field, it is suitable for advanced undergraduate students and graduate students studying analyt...

  14. Analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-15

    This book gives explanations of analytical chemistry in ten chapters, which deal with the development of analytical chemistry; the theory of error, with definitions and classification; samples and their treatment; gravimetry, covering the general process of gravimetry in aqueous and non-aqueous solution; precipitation titration, covering precipitation reactions and their types; complexometry, with a summary and complex compounds; oxidation-reduction equilibrium, covering electrode potentials and potentiometric titration; solvent extraction and chromatography; and experiments, with basic operations for the chemical laboratory.

  15. Analytical chemistry

    International Nuclear Information System (INIS)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-01

    This book gives explanations of analytical chemistry in ten chapters, which deal with the development of analytical chemistry; the theory of error, with definitions and classification; samples and their treatment; gravimetry, covering the general process of gravimetry in aqueous and non-aqueous solution; precipitation titration, covering precipitation reactions and their types; complexometry, with a summary and complex compounds; oxidation-reduction equilibrium, covering electrode potentials and potentiometric titration; solvent extraction and chromatography; and experiments, with basic operations for the chemical laboratory.

  16. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns Across Heterogenous Space-Time Data

    Science.gov (United States)

    Morton, A.; Stewart, R.; Held, E.; Piburn, J.; Allen, M. R.; McManamay, R.; Sanyal, J.; Sorokine, A.; Bhaduri, B. L.

    2017-12-01

    Spatiotemporal (ST) analytics applied to major spatio-temporal data sources from major vendors such as USGS, NOAA, World Bank and World Health Organization have tremendous value in shedding light on the evolution of physical, cultural, and geopolitical landscapes on a local and global level. Especially powerful is the integration of these physical and cultural datasets across multiple and disparate formats, facilitating new interdisciplinary analytics and insights. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantical differences, changing attributes, and content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at the Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 16000+ attributes covering 200+ countries for over 50 years from over 30 major sources and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We report on these advances, provide an illustrative case study, and inform how others may freely access the tool.

  17. Application of holographic sub-wavelength diffraction gratings for monitoring of kinetics of bioprocesses

    International Nuclear Information System (INIS)

    Tamulevičius, Tomas; Šeperys, Rimas; Andrulevičius, Mindaugas; Kopustinskas, Vitoldas; Meškinis, Šarūnas; Tamulevičius, Sigitas; Mikalayeva, Valeryia; Daugelavičius, Rimantas

    2012-01-01

    Highlights:
    ► Refractive index sensor based on a DLC holographic sub-wavelength period grating.
    ► Spectroscopic analysis of polarized white light reflected from the grating.
    ► Control of critical wavelength shift and reflectivity changes.
    ► Testing of model liquid analyte materials.
    ► Evaluation of the interaction between B. subtilis cells and lysozyme.
    Abstract: In this work we present a refractive index (RI) sensor based on a sub-wavelength holographic diffraction grating. The sensor chip was fabricated by dry etching of the finely spaced (d = 428 nm) diffraction grating in a SiO_x-doped diamond-like carbon (DLC) film. It is shown that, employing the fabricated sensor chip and using the proposed method of data analysis, one can inspect the kinetics of processes in liquids occurring in the vicinity of the grating surface. The method is based on the spectral composition analysis of polarized polychromatic light reflected from the sub-wavelength diffraction grating. The RI measurement system was tested with different model liquid analytes, including 25 wt.% and 50 wt.% sugar-water solutions and distilled water at 10 °C and 50 °C, as well as the interaction of the Gram-positive bacterium Bacillus subtilis with the channel-forming antibiotic gramicidin D and the murolytic enzyme lysozyme. Analysis of the data set of specular reflection spectra enabled us to follow the kinetics of the RI changes in the analyte with millisecond resolution. Detectable changes in the effective RI were not worse than Δn = 10⁻⁴.

  18. Second International Workshop on Teaching Analytics

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Reimann, Peter; Halb, Wolfgang

    2013-01-01

    Teaching Analytics is conceived as a subfield of learning analytics that focuses on the design, development, evaluation, and education of visual analytics methods and tools for teachers in primary, secondary, and tertiary educational settings. The Second International Workshop on Teaching Analytics...... (IWTA) 2013 seeks to bring together researchers and practitioners in the fields of education, learning sciences, learning analytics, and visual analytics to investigate the design, development, use, evaluation, and impact of visual analytical methods and tools for teachers’ dynamic diagnostic decision...

  19. Modeling Group Perceptions Using Stochastic Simulation: Scaling Issues in the Multiplicative AHP

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn; van den Honert, Robin; Salling, Kim Bang

    2016-01-01

    This paper proposes a new decision support approach for applying stochastic simulation to the multiplicative analytic hierarchy process (AHP) in order to deal with issues concerning the scale parameter. The paper suggests a new approach that captures the influence from the scale parameter by maki...
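
    A minimal way to see what the scale parameter does in a multiplicative AHP is to map verbal gradations δ_ij to ratios exp(γ·δ_ij), derive priorities from row geometric means, and let a Monte Carlo draw γ from a distribution instead of fixing it. The sketch below is only a schematic reconstruction of that general idea; the judgment matrix, the gradation values and the γ distribution are all assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(11)

# Assumed integer gradations delta[i, j]: how strongly alternative i is preferred
# to j on a verbal scale (antisymmetric, zeros on the diagonal).
delta = np.array([
    [ 0,  2,  4],
    [-2,  0,  1],
    [-4, -1,  0],
], dtype=float)

def priorities(gamma):
    """Multiplicative-AHP-style priorities for a given scale parameter gamma."""
    ratios = np.exp(gamma * delta)                        # pairwise ratio matrix
    gm = np.prod(ratios, axis=1) ** (1 / len(ratios))     # row geometric means
    return gm / gm.sum()

# Monte Carlo over the scale parameter instead of committing to a single value.
draws = np.array([priorities(g) for g in rng.uniform(0.3, 1.0, size=2000)])
mean, std = draws.mean(axis=0), draws.std(axis=0)
for k, (m, s) in enumerate(zip(mean, std)):
    print(f"alternative {k + 1}: priority = {m:.3f} +/- {s:.3f}")
```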

  20. Solution-based analysis of multiple analytes by a sensor array: toward the development of an electronic tongue

    Science.gov (United States)

    Savoy, Steven M.; Lavigne, John J.; Yoo, J. S.; Wright, John; Rodriguez, Marc; Goodey, Adrian; McDoniel, Bridget; McDevitt, John T.; Anslyn, Eric V.; Shear, Jason B.; Ellington, Andrew D.; Neikirk, Dean P.

    1998-12-01

    A micromachined sensor array has been developed for the rapid characterization of multi-component mixtures in aqueous media. The sensor functions in a manner analogous to that of the mammalian tongue, using an array composed of individually immobilized polystyrene-polyethylene glycol composite microspheres selectively arranged in micromachined etch cavities localized on silicon wafers. Sensing occurs via colorimetric or fluorometric changes to indicator molecules that are covalently bound to amine termination sites on the polymeric microspheres. The hybrid micromachined structure has been interfaced directly to a charge-coupled device that is used for the simultaneous acquisition of the optical data from the individually addressable 'taste bud' elements. With the miniature sensor array, data streams composed of red, green, and blue color patterns distinctive for the analytes in the solution are rapidly acquired. The unique combination of carefully chosen reporter molecules with water-permeable microspheres allows the simultaneous detection and quantification of a variety of analytes. The fabrication of the sensor structures and the initial colorimetric and fluorescent responses for pH, Ca2+, Ce3+, and sugar are reported. Interfacing to microfluidic components should also be possible, producing a complete sampling/sensing system.

  1. Statistical methods for the time-to-event analysis of individual participant data from multiple epidemiological studies

    DEFF Research Database (Denmark)

    Thompson, Simon; Kaptoge, Stephen; White, Ian

    2010-01-01

    Meta-analysis of individual participant time-to-event data from multiple prospective epidemiological studies enables detailed investigation of exposure-risk relationships, but involves a number of analytical challenges....

  2. Post-analytical stability of 23 common chemistry and immunochemistry analytes in incurred samples

    DEFF Research Database (Denmark)

    Nielsen, Betina Klint; Frederiksen, Tina; Friis-Hansen, Lennart

    2017-01-01

    BACKGROUND: Storage of blood samples after centrifugation, decapping and initial sampling allows ordering of additional blood tests. The pre-analytical stability of biochemistry and immunochemistry analytes has been studied in detail, but little is known about the post-analytical stability in incurred samples. METHODS: We examined the stability of 23 routine analytes on the Dimension Vista® (Siemens Healthineers, Denmark): 42-60 routine samples in lithium-heparin gel tubes (Vacutainer, BD, USA) were centrifuged at 3000×g for 10 min. Immediately after centrifugation, the initial concentrations of the analytes were measured in duplicate (t=0). The tubes were stored decapped at room temperature and re-analyzed after 2, 4, 6, 8 and 10 h in singletons. The concentrations from reanalysis were normalized to the initial concentration (t=0). Internal acceptance criteria for bias and total error were used...
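
    A minimal sketch of the normalization step described above: each re-measurement is expressed as a percentage of its own t = 0 value and checked against an acceptance limit. The data and the ±10% limit below are placeholders, not the study's internal acceptance criteria.

```python
import numpy as np

# Hypothetical duplicate-mean concentrations: rows = samples, columns = storage times.
times_h = np.array([0, 2, 4, 6, 8, 10])
conc = np.array([[5.2, 5.1, 5.2, 5.0, 4.9, 4.6],   # sample 1
                 [3.8, 3.8, 3.7, 3.7, 3.6, 3.5]])  # sample 2

normalized = 100.0 * conc / conc[:, [0]]           # % of initial concentration
mean_bias = normalized.mean(axis=0) - 100.0        # mean deviation from t = 0, in %

acceptance_limit = 10.0                            # placeholder criterion, %
for t, b in zip(times_h, mean_bias):
    status = "within" if abs(b) <= acceptance_limit else "outside"
    print(f"t = {t:2d} h: bias = {b:+5.1f} %  ({status} limit)")
```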

  3. Analytic Moufang-transformations

    International Nuclear Information System (INIS)

    Paal, Eh.N.

    1988-01-01

    This paper is intended as an introduction to the concept of an analytic birepresentation (S,T) of an analytic Moufang loop. To describe the deviation of (S,T) from associativity, associators of (S,T) are defined, and certain constraints on them, called the minimality conditions of (S,T), are established.

  4. Analytical thermal modelling of multilayered active embedded chips into high density electronic board

    Directory of Open Access Journals (Sweden)

    Monier-Vinard Eric

    2013-01-01

    The recent Printed Wiring Board embedding technology is an attractive packaging alternative that allows a very high degree of miniaturization by stacking multiple layers of embedded chips. This disruptive technology will further increase the thermal management challenges by concentrating heat dissipation at the heart of the organic substrate structure. In order to allow the electronic designer to analyze, at an early stage, the limits of the power dissipation, depending on the embedded chip location inside the board as well as the thermal interactions with other buried chips or surface-mounted electronic components, an analytical thermal modelling approach was established. The presented work describes the comparison of the analytical model results with numerical models of various embedded-chip configurations. The thermal behaviour predictions of the analytical model, found to be within ±10% relative error, demonstrate its relevance for modelling high-density electronic boards. In addition, the approach offers a practical solution for studying the potential gain of conducting a part of the heat flow from the components towards a set of localized cooled board pads.

  5. Measuring myokines with cardiovascular functions: pre-analytical variables affecting the analytical output.

    Science.gov (United States)

    Lombardi, Giovanni; Sansoni, Veronica; Banfi, Giuseppe

    2017-08-01

    In the last few years, a growing number of molecules have been linked to an endocrine function of skeletal muscle. Circulating myokine levels, in turn, have been associated with several pathophysiological conditions, including cardiovascular ones. However, data from different studies are often not fully comparable or even discordant. This is due, at least in part, to the whole set of conditions related to patient preparation prior to blood sampling, the blood sampling procedure, and sample processing and/or storage. This entire process constitutes the pre-analytical phase. The importance of the pre-analytical phase is often overlooked, yet in routine diagnostics about 70% of errors arise in this phase. Moreover, errors made during the pre-analytical phase are carried over into the analytical phase and affect the final output. In research, for example, when samples are collected over a long period and by different laboratories, the need for standardized procedures for sample collection and correct sample storage is acknowledged. In this review, we discuss the pre-analytical variables potentially affecting the measurement of myokines with cardiovascular functions.

  6. Analytic nuclear scattering theories

    International Nuclear Information System (INIS)

    Di Marzio, F.; University of Melbourne, Parkville, VIC

    1999-01-01

    A wide range of nuclear reactions are examined in an analytical version of the usual distorted-wave Born approximation. This new approach provides either semi-analytic or fully analytic descriptions of the nuclear scattering processes. The resulting computational simplifications, when used within the limits of validity, allow very detailed tests of both nuclear interaction models and large-basis models of nuclear structure to be performed.

  7. Analytical, numerical and experimental investigations of transverse fracture propagation from horizontal wells

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, M.M.; Hossain, M.M.; Crosby, D.G.; Rahman, M.K.; Rahman, S.S. [School of Petroleum Engineering, The University of New South Wales, 2052 Sydney (Australia)

    2002-08-01

    This paper presents results of a comprehensive study involving analytical, numerical and experimental investigations into transverse fracture propagation from horizontal wells. The propagation of transverse hydraulic fractures from horizontal wells is simulated and investigated in the laboratory using carefully designed experimental setups. Closed-form analytical theories for Mode I (opening) stress intensity factors for idealized fracture geometries are reviewed, and a boundary element-based model is used herein to investigate non-planar propagation of fractures. Using the mixed mode fracture propagation criterion of the model, a reasonable agreement is found with respect to fracture geometry, net fracture pressures and fracture propagation paths between the modeled fractures and the laboratory tested fractures. These results suggest that the propagation of multiple fractures requires higher net pressures than a single fracture, the underlying reason of which is theoretically justified on the basis of local stress distribution.

  8. Stationary Size Distributions of Growing Cells with Binary and Multiple Cell Division

    Science.gov (United States)

    Rading, M. M.; Engel, T. A.; Lipowsky, R.; Valleriani, A.

    2011-10-01

    Populations of unicellular organisms that grow under constant environmental conditions are considered theoretically. The size distribution of these cells is calculated analytically, both for the usual process of binary division, in which one mother cell always produces two daughter cells, and for the more complex process of multiple division, in which one mother cell can produce 2^n daughter cells with n=1,2,3,… . The latter mode of division is inspired by the unicellular alga Chlamydomonas reinhardtii. The uniform response of the whole population to different environmental conditions is encoded in the individual rates of growth and division of the cells. The analytical treatment of the problem is based on size-dependent rules for cell growth and stochastic transition processes for cell division. The comparison between binary and multiple division shows that these different division processes lead to qualitatively different results for the size distribution and the population growth rates.
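
    To make the contrast between binary and multiple division concrete, a toy stochastic simulation is sketched below: cells grow exponentially and divide with a size-dependent hazard into 2 or 2^n equal daughters. The growth law, division rule, and parameters are illustrative assumptions and not the model analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_sizes(multiple_division, n_steps=300, dt=0.05, max_cells=5000):
    """Toy model: single-cell size grows as ds/dt = s; a cell divides with a
    size-dependent hazard into 2**n equal daughters (n = 1 for binary division,
    n drawn from {1, 2, 3} for multiple division)."""
    sizes = rng.uniform(0.5, 1.0, size=200)
    for _ in range(n_steps):
        sizes = sizes * np.exp(dt)                                   # growth
        hazard = 1.0 - np.exp(-dt * np.clip(sizes - 1.0, 0.0, None))
        dividing = rng.random(sizes.size) < hazard
        n = (rng.integers(1, 4, size=dividing.sum()) if multiple_division
             else np.ones(dividing.sum(), dtype=int))
        daughters = np.repeat(sizes[dividing] / 2**n, 2**n)          # equal split
        sizes = np.concatenate([sizes[~dividing], daughters])
        if sizes.size > max_cells:                                   # subsample
            sizes = rng.choice(sizes, max_cells, replace=False)
    return sizes

for label, flag in (("binary", False), ("multiple", True)):
    s = simulate_sizes(multiple_division=flag)
    print(f"{label:8s} division: mean size {s.mean():.2f}, CV {s.std() / s.mean():.2f}")
```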

  9. Analytical, Practical and Emotional Intelligence and Line Manager Competencies

    Directory of Open Access Journals (Sweden)

    Anna Baczyńska

    2015-12-01

    Purpose: The research objective was to examine to what extent line manager competencies are linked to intelligence, and more specifically, three types of intelligence: analytical (fluid), practical and emotional. Methodology: The research was carried out with line managers (N=98) who took part in 12 Assessment Centre sessions and completed tests measuring analytical, practical and emotional intelligence. The adopted hypotheses were tested using multiple regression. In each regression model, the dependent variable was a managerial competency (management and striving for results, social skills, openness to change, problem solving, employee development) and the explanatory variables were the three types of intelligence. Five models, each for a separate management competency, were tested in this way. Findings: In the study, it was hypothesized that practical intelligence relates to procedural tacit knowledge and is the strongest indicator of managerial competency. Analysis of the study results testing this hypothesis indicated that practical intelligence largely accounts for the level of competency used in managerial work (from 21% to 38%). The study findings suggest that practical intelligence is a better indicator of managerial competencies among line managers than traditionally measured IQ or emotional intelligence. Originality: This research fills an important gap in the literature on the subject, indicating the links between major contemporary selection indicators (i.e., analytical, practical and emotional intelligence) and managerial competencies presented in realistic work simulations measured using the Assessment Centre process.

  10. Quo vadis, analytical chemistry?

    Science.gov (United States)

    Valcárcel, Miguel

    2016-01-01

    This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists, because the future of our scientific discipline is in their hands. A description of not entirely accurate overall conceptions of our discipline, both past and present, that should be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, and the more accurate general and specific ones that can be expected to provide the framework for our discipline in the coming years, are then described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.

  11. Application of holographic sub-wavelength diffraction gratings for monitoring of kinetics of bioprocesses

    Energy Technology Data Exchange (ETDEWEB)

    Tamulevicius, Tomas, E-mail: tomas.tamulevicius@ktu.lt [Institute of Materials Science of Kaunas University of Technology, Savanoriu Ave. 271, LT-50131, Kaunas (Lithuania); Seperys, Rimas; Andrulevicius, Mindaugas; Kopustinskas, Vitoldas; Meskinis, Sarunas; Tamulevicius, Sigitas [Institute of Materials Science of Kaunas University of Technology, Savanoriu Ave. 271, LT-50131, Kaunas (Lithuania); Mikalayeva, Valeryia; Daugelavicius, Rimantas [Department of Biochemistry and Biotechnologies of Vytautas Magnus University, Vileikos St. 8, LT-44404 Kaunas (Lithuania)

    2012-09-15

    Highlights: ► Refractive index sensor based on a DLC holographic sub-wavelength period grating. ► Spectroscopic analysis of polarized white light reflected from the grating. ► Control of critical wavelength shift and reflectivity changes. ► Testing of model liquid analyte materials. ► Evaluation of the interaction between B. subtilis cells and lysozyme. - Abstract: In this work we present a refractive index (RI) sensor based on a sub-wavelength holographic diffraction grating. The sensor chip was fabricated by dry etching of the finely spaced (d = 428 nm) diffraction grating in a SiOx-doped diamond-like carbon (DLC) film. It is shown that, employing the fabricated sensor chip and the proposed data-analysis method, one can inspect the kinetics of processes in liquids occurring in the vicinity of the grating surface. The method is based on analysis of the spectral composition of polarized polychromatic light reflected from the sub-wavelength diffraction grating. The RI measurement system was tested with different model liquid analytes, including 25 wt.% and 50 wt.% sugar-water solutions, distilled water at 10 °C and 50 °C, and the interaction of the Gram-positive bacterium Bacillus subtilis with the ion-permeable-channel-forming antibiotic gramicidin D and the murolytic enzyme lysozyme. Analysis of the set of specular reflection spectra enabled us to follow the kinetics of RI changes in the analyte with millisecond resolution. Detectable changes in the effective RI were not worse than Δn = 10^-4.

  12. Analytic manifolds in uniform algebras

    International Nuclear Information System (INIS)

    Tonev, T.V.

    1988-12-01

    Here we extend Bear-Hile's result concerning a version of the famous Bishop theorem for one-dimensional analytic structures in two directions: to n-dimensional complex analytic manifolds, n>1, and to generalized analytic manifolds. 14 refs.

  13. Framework for pedagogical learning analytics

    OpenAIRE

    Heilala, Ville

    2018-01-01

    Learning analytics is an emergent technological practice and a multidisciplinary scientific discipline whose goal is to facilitate effective learning and knowledge of learning. In this design science research, I combine the knowledge discovery process, a concept of pedagogical knowledge, the ethics of learning analytics, and microservice architecture. The result is a framework for pedagogical learning analytics. The framework is applied and evaluated in the context of agency analytics. The framework ...

  14. Comprehension of complex biological processes by analytical methods: how far can we go using mass spectrometry?

    International Nuclear Information System (INIS)

    Gerner, C.

    2013-01-01

    Comprehensive understanding of complex biological processes is the basis for many biomedical issues of great relevance for modern society including risk assessment, drug development, quality control of industrial products and many more. Screening methods provide means for investigating biological samples without research hypothesis. However, the first boom of analytical screening efforts has passed and we again need to ask whether and how to apply screening methods. Mass spectrometry is a modern tool with unrivalled analytical capacities. This applies to all relevant characteristics of analytical methods such as specificity, sensitivity, accuracy, multiplicity and diversity of applications. Indeed, mass spectrometry qualifies to deal with complexity. Chronic inflammation is a common feature of almost all relevant diseases challenging our modern society; these diseases are apparently highly diverse and include arteriosclerosis, cancer, back pain, neurodegenerative diseases, depression and other. The complexity of mechanisms regulating chronic inflammation is the reason for the practical challenge to deal with it. The presentation shall give an overview of capabilities and limitations of the application of this analytical tool to solve critical questions with great relevance for our society. (author)

  15. Oxygen transport enhancement by functionalized magnetic nanoparticles (FMP) in bioprocesses

    Science.gov (United States)

    Ataide, Filipe Andre Prata

    The enhancement of fluid properties, namely thermal conductivity and mass diffusivity, for a wide range of applications through the use of suspensions of nanosized particles has been gathering increasing interest in the scientific community. In previous studies, Olle et al. (2006) showed an enhancement of up to 6-fold in oxygen absorption into aqueous solutions through the use of functionalized nanosized magnetic particles with an oleic acid coating. Krishnamurthy et al. (2006) showed a remarkable 26-fold enhancement in dye diffusion in water. These two publications are landmarks in mass transfer enhancement in chemical systems through the use of nanoparticles. The central goal of this Ph.D. thesis was to develop functionalized magnetic nanoparticles to enhance oxygen transport in bioprocesses. The experimental protocol for magnetic nanoparticle synthesis and purification adopted in this thesis is a modification of that reported by Olle et al. (2006): twice the quantity of ammonia is employed, added at a slower rate, and the final nanoparticle solution is filtered in a cross-flow filtration module against 55 volumes of distilled water. This modification of the protocol resulted in improved magnetic nanoparticles with measurably higher mass transfer enhancement. Magnetic nanoparticles with oleic acid and Hitenol-BC coatings were screened for oxygen transfer enhancement, since these particles are relatively inexpensive and easy to synthesize. A glass 0.5-liter reactor was custom manufactured specifically for oxygen transport studies in magnetic nanoparticle suspensions. The reactor geometry, baffles and Rushton impeller are of standard dimensions. Mass transfer tests were conducted using the sulphite oxidation method with iodometric back-titration. A 3-factor central composite circumscribed design (CCD) was adopted for the design of experiments in order to generate sufficiently informative data to model the effect of magnetic
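
    The abstract above breaks off, but it names a 3-factor central composite circumscribed (CCC) design for the design of experiments. The sketch below generates the coded design matrix for such a design; the rotatable axial distance and the number of centre points are generic textbook defaults, not the settings actually used in the thesis.

```python
import itertools
import numpy as np

def central_composite_circumscribed(k=3, n_center=6):
    """Coded design matrix of a rotatable CCC design for k factors:
    2**k factorial corner points, 2*k axial (star) points at +/- alpha,
    and replicated centre points, with alpha = (2**k)**0.25."""
    alpha = (2 ** k) ** 0.25
    factorial = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))
    axial = np.zeros((2 * k, k))
    for i in range(k):
        axial[2 * i, i] = -alpha
        axial[2 * i + 1, i] = alpha
    center = np.zeros((n_center, k))
    return np.vstack([factorial, axial, center])

design = central_composite_circumscribed()
print(design.round(3))                                        # 8 + 6 + 6 = 20 coded runs
print("axial distance alpha =", round((2 ** 3) ** 0.25, 3))   # about 1.682
```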

  16. The industrial applicability of purified cellulase complex indigenously produced by Trichoderma viride through solid-state bio-processing of agro-industrial and municipal paper wastes

    Directory of Open Access Journals (Sweden)

    Muhammad Irshad

    2013-02-01

    An indigenous strain of Trichoderma viride produced high titers of cellulase complex in solid-state bio-processing of agro-industrial orange peel waste, which was used as the growth-supporting substrate. When the conditions of the SSF medium containing 15 g orange peel (50% w/w moisture) inoculated with 5 mL of inoculum were optimal, the maximum production of endoglucanase (655 ± 5.5 U/mL), exoglucanase (412 ± 4.3 U/mL), and β-glucosidase (515 ± 3.7 U/mL) was recorded after 4 days of incubation at pH 5 and 35 °C. The enzyme with maximum activity (endoglucanase) was purified by ammonium sulfate fractionation and Sephadex G-100 column gel filtration chromatography. Endoglucanase was purified 5.5-fold, with a specific activity of 498 U/mg in comparison to the crude enzyme. The enzyme was shown to have a molecular weight of 58 kDa by sodium dodecyl sulphate polyacrylamide gel electrophoresis (SDS-PAGE). The shelf-life profile revealed that the enzyme could be stored at room temperature (30 °C) for up to 45 days without losing much of its activity.

  17. Data Intensive Architecture for Scalable Cyber Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Olsen, Bryan K.; Johnson, John R.; Critchlow, Terence J.

    2011-11-15

    events, we utilized multidimensional OLAP data cubes. The data cube structure supports interactive analysis of summary data across multiple dimensions, such as location, time, and protocol. Cube technology also allows the analyst to drill down into the underlying data set when events of interest are identified and detailed analysis is required. Unfortunately, when creating these cubes, we ran into significant performance issues with our initial architecture, caused by a combination of the data volume and attribute characteristics. Overcoming these issues required us to develop a novel, data-intensive computing infrastructure. In particular, we ended up combining a Netezza TwinFin data warehouse appliance, a solid-state Fusion-io ioDrive, and the Tableau Desktop business intelligence analytic software. Using this architecture, we were able to analyze a month's worth of flow records comprising 4.9B records, totaling approximately 600 GB of data. This paper describes our architecture, the challenges that we encountered, and the work that remains to deploy a fully generalized cyber analytical infrastructure.
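
    The cube-style analysis described above (summaries over dimensions such as location, time, and protocol, with drill-down to raw flow records) can be imitated at toy scale with an in-memory group-by, as sketched below. The column names and the drill-down filter are hypothetical and do not reflect the schema of the actual flow archive or the Netezza/Tableau stack used in the paper.

```python
import pandas as pd

# Hypothetical flow records (the real data set held billions of rows).
flows = pd.DataFrame({
    "location": ["site-A", "site-A", "site-B", "site-B", "site-B"],
    "hour":     ["00", "01", "00", "00", "01"],
    "protocol": ["tcp", "udp", "tcp", "tcp", "icmp"],
    "bytes":    [1200, 300, 5400, 800, 90],
})

# Cube-like view: summary measures across the three dimensions.
cube = (flows.groupby(["location", "hour", "protocol"])
             .agg(flow_count=("bytes", "size"), total_bytes=("bytes", "sum"))
             .reset_index())
print(cube)

# Drill-down: once an interesting cell is spotted, pull back the raw records.
detail = flows[(flows["location"] == "site-B") & (flows["protocol"] == "tcp")]
print(detail)
```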

  18. Quantum correlation approach to criticality in the XX spin chain with multiple interaction

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, W.W., E-mail: weien.cheng@gmail.com [Institute of Signal Processing and Transmission, Nanjing University of Posts and Telecommunication, Nanjing 210003 (China); Department of Physics, Hubei Normal University, Huangshi 435002 (China); Key Lab of Broadband Wireless Communication and Sensor Network Technology, Ministry of Education (China); Shan, C.J. [Department of Physics, Hubei Normal University, Huangshi 435002 (China); Sheng, Y.B.; Gong, L.Y.; Zhao, S.M. [Institute of Signal Processing and Transmission, Nanjing University of Posts and Telecommunication, Nanjing 210003 (China); Key Lab of Broadband Wireless Communication and Sensor Network Technology, Ministry of Education (China)

    2012-09-01

    We investigate the quantum critical behavior in the XX spin chain with an XZY-YZX type multiple interaction by means of quantum correlations (concurrence C, quantum discord D_Q and geometric discord D_G). Around the critical point, the values of these quantum correlations and their derivatives are investigated numerically and analytically. The results show that the non-analyticity of the concurrence cannot signal the quantum phase transition well, but both the quantum discord and the geometric discord can characterize the critical behavior in this model exactly.
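
    Of the three measures named in the abstract, the concurrence C has a closed form for two qubits (Wootters' formula), sketched below for an arbitrary two-qubit density matrix. The example Werner state is arbitrary and unrelated to the spin-chain ground states studied in the paper.

```python
import numpy as np

def concurrence(rho):
    """Wootters concurrence of a two-qubit density matrix rho:
    C = max(0, l1 - l2 - l3 - l4), where the l_i are the square roots, in
    decreasing order, of the eigenvalues of rho (sy x sy) rho* (sy x sy)."""
    sy = np.array([[0.0, -1.0j], [1.0j, 0.0]])
    flip = np.kron(sy, sy)
    rho_tilde = flip @ rho.conj() @ flip
    eigs = np.linalg.eigvals(rho @ rho_tilde)
    lam = np.sqrt(np.sort(np.abs(eigs.real))[::-1])
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

# Werner state p|Phi+><Phi+| + (1 - p) I/4; entangled (C > 0) only for p > 1/3.
phi_plus = np.zeros((4, 1)); phi_plus[0, 0] = phi_plus[3, 0] = 1 / np.sqrt(2)
for p in (0.2, 0.5, 0.9):
    rho = p * (phi_plus @ phi_plus.T) + (1 - p) * np.eye(4) / 4
    print(f"p = {p}: C = {concurrence(rho):.3f}")
```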

  19. Review of nonconventional bioreactor technology

    Energy Technology Data Exchange (ETDEWEB)

    Turick, C.E.; Mcllwain, M.E.

    1993-09-01

    Biotechnology will significantly affect many industrial sectors in the future, including the pharmaceutical, chemical, fuel, agricultural, and environmental remediation sectors. Future research is needed to improve bioprocessing efficiency and cost-effectiveness in order to compete with traditional technologies. This report describes recent advances in bioprocess technologies and bioreactor designs and relates them to problems encountered in many industrial bioprocessing operations. The primary focus is directed towards increasing gas and vapor transfer for enhanced bioprocess kinetics, as well as improved by-product separation and removal. The advantages and disadvantages of various conceptual designs such as hollow-fiber, gas-phase, hyperbaric/hypobaric, and electrochemical bioreactors are also discussed. Specific applications intended for improved bioprocesses include coal desulfurization, coal liquefaction, soil bioremediation, biomass conversion to marketable chemicals, biomining, and biohydrometallurgy, as well as bioprocessing of gases and vapors.

  20. Analytical approach to landside system dynamics at airport passenger terminals: departmentalization and holistic view

    OpenAIRE

    Montesinos Ferrer, Marti

    2016-01-01

    The airport landside system is complex, with multiple interrelations. Currently, each facility is managed locally without a systemic view. This study analyzes the impact of different resource management policies on the overall system performance (embarking direction). The results are derived from an analytical approach, based on queueing theory, which allows different time-varying resource allocation policies at each processing facility, and their impact on system dynamics, to be investigated.
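
    The queueing-theoretic view in this abstract can be illustrated with the steady-state M/M/c (Erlang C) waiting time evaluated for different server allocations at a single facility. The arrival and service rates below are made-up numbers, and the actual study considers time-varying demand across several interrelated facilities rather than a single stationary queue.

```python
import math

def erlang_c_wait(lam, mu, c):
    """Mean time in queue for an M/M/c system with arrival rate lam,
    per-server service rate mu and c servers (requires lam < c * mu)."""
    a = lam / mu                                   # offered load in Erlangs
    rho = a / c
    if rho >= 1.0:
        return float("inf")
    p0_inv = sum(a**k / math.factorial(k) for k in range(c)) \
             + a**c / (math.factorial(c) * (1.0 - rho))
    p_wait = (a**c / (math.factorial(c) * (1.0 - rho))) / p0_inv   # Erlang C
    return p_wait / (c * mu - lam)

lam = 300 / 60.0        # hypothetical: 300 passengers per hour, in passengers/min
mu = 1 / 1.5            # one desk serves a passenger every 1.5 minutes
for c in (8, 9, 10, 12):
    print(f"{c:2d} desks: mean queueing time = {erlang_c_wait(lam, mu, c):5.2f} min")
```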

  1. Limitless Analytic Elements

    Science.gov (United States)

    Strack, O. D. L.

    2018-02-01

    We present equations for new limitless analytic line elements. These elements possess a virtually unlimited number of degrees of freedom. We apply these new limitless analytic elements to head-specified boundaries and to problems with inhomogeneities in hydraulic conductivity. Applications of these new analytic elements to practical problems involving head-specified boundaries require the solution of a very large number of equations. To make the new elements useful in practice, an efficient iterative scheme is required. We present an improved version of the scheme presented by Bandilla et al. (2007), based on the application of Cauchy integrals. The limitless analytic elements are useful when modeling strings of elements, rivers for example, where local conditions are difficult to model, e.g., when a well is close to a river. The solution of such problems is facilitated by increasing the order of the elements to obtain a good solution. This makes it unnecessary to resort to dividing the element in question into many smaller elements to obtain a satisfactory solution.

  2. Hyphenated analytical techniques for materials characterisation

    International Nuclear Information System (INIS)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-01-01

    This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surface, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectroscopy and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that combining complementary techniques for chemical analysis were among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical properties including physical, mechanical, electrical and thermal, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the

  3. Hyphenated analytical techniques for materials characterisation

    Science.gov (United States)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-09-01

    This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surface, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectroscopy and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that combining complementary techniques for chemical analysis were among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical properties including physical, mechanical, electrical and thermal, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the

  4. Analytical derivation: An epistemic game for solving mathematically based physics problems

    Science.gov (United States)

    Bajracharya, Rabindra R.; Thompson, John R.

    2016-06-01

    Problem solving, which often involves multiple steps, is an integral part of physics learning and teaching. Using the perspective of the epistemic game, we documented a specific game that is commonly pursued by students while solving mathematically based physics problems: the analytical derivation game. This game involves deriving an equation through symbolic manipulations and routine mathematical operations, usually without any physical interpretation of the processes. This game often creates cognitive obstacles in students, preventing them from using alternative resources or better approaches during problem solving. We conducted hour-long, semi-structured, individual interviews with fourteen introductory physics students. Students were asked to solve four "pseudophysics" problems containing algebraic and graphical representations. The problems required the application of the fundamental theorem of calculus (FTC), which is one of the most frequently used mathematical concepts in physics problem solving. We show that the analytical derivation game is necessary, but not sufficient, to solve mathematically based physics problems, specifically those involving graphical representations.

  5. Learning Analytics: drivers, developments and challenges

    Directory of Open Access Journals (Sweden)

    Rebecca Ferguson

    2014-12-01

    Learning analytics is a significant area of Technology-Enhanced Learning (TEL) that has emerged during the last decade. This review of the field begins with an examination of the technological, educational and political factors that have driven the development of analytics in educational settings. It goes on to chart the emergence of learning analytics, including their origins in the 20th century, the development of data-driven analytics, the rise of learning-focused perspectives and the influence of national economic concerns. It next focuses on the relationships between learning analytics, educational data mining and academic analytics. Finally, it examines developing areas of learning analytics research, and identifies a series of future challenges.

  6. Analytical mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    1990-01-01

    This 43rd Annual Summer Symposium on Analytical Chemistry was held July 24-27, 1990 at Oak Ridge, TN and contained sessions on the following topics: Fundamentals of Analytical Mass Spectrometry (MS), MS in the National Laboratories, Lasers and Fourier Transform Methods, Future of MS, New Ionization and LC/MS Methods, and an extra session. (WET)

  7. Analytical quadrics

    CERN Document Server

    Spain, Barry; Ulam, S; Stark, M

    1960-01-01

    Analytical Quadrics focuses on the analytical geometry of three dimensions. The book first discusses the theory of the plane, sphere, cone, cylinder, straight line, and central quadrics in their standard forms. The idea of the plane at infinity is introduced through homogeneous Cartesian coordinates and applied to the nature of the intersection of three planes and to the circular sections of quadrics. The text also focuses on the paraboloid, including polar properties, the center of a section, axes of a plane section, and generators of the hyperbolic paraboloid. The book also touches on homogeneous coordi

  8. Parallel steady state studies on a milliliter scale accelerate fed-batch bioprocess design for recombinant protein production with Escherichia coli.

    Science.gov (United States)

    Schmideder, Andreas; Cremer, Johannes H; Weuster-Botz, Dirk

    2016-11-01

    In general, fed-batch processes are applied for recombinant protein production with Escherichia coli (E. coli). However, state-of-the-art methods for identifying suitable reaction conditions suffer from severe drawbacks, i.e. direct transfer of process information from parallel batch studies is often defective and sequential fed-batch studies are time-consuming and cost-intensive. In this study, continuously operated stirred-tank reactors on a milliliter scale were applied to identify suitable reaction conditions for fed-batch processes. Isopropyl β-d-1-thiogalactopyranoside (IPTG) induction strategies were varied in parallel-operated stirred-tank bioreactors to study the effects on the continuous production of the recombinant protein photoactivatable mCherry (PAmCherry) with E. coli. The best-performing induction strategies were transferred from the continuous processes on a milliliter scale to liter-scale fed-batch processes. Inducing recombinant protein expression by dynamically increasing the IPTG concentration to 100 µM led to an increase in the product concentration of 21% (8.4 g/L) compared to an implemented high-performance production process with the most frequently applied induction strategy of a single addition of 1000 µM IPTG. Thus, identifying feasible reaction conditions for fed-batch processes in parallel continuous studies on a milliliter scale was shown to be a powerful, novel method to accelerate bioprocess design in a cost-reducing manner. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1426-1435, 2016. © 2016 American Institute of Chemical Engineers.

  9. Comparison of solid-state and submerged-state fermentation for the bioprocessing of switchgrass to ethanol and acetate by Clostridium phytofermentans.

    Science.gov (United States)

    Jain, Abhiney; Morlok, Charles K; Henson, J Michael

    2013-01-01

    The conversion of sustainable energy crops to biofuels and bioproducts using microbiological fermentation typically uses submerged-state processes. Alternatively, solid-state fermentation processes have several advantages when compared to typical submerged-state processes. This study compares the use of solid-state versus submerged-state fermentation by the mesophilic anaerobic bacterium Clostridium phytofermentans for the conversion of switchgrass to the end products ethanol, acetate, and hydrogen. A shift in the ratio of metabolic products towards more acetate and hydrogen production and less ethanol production was observed when C. phytofermentans was grown under solid-state conditions as compared to submerged-state conditions. The results indicated that the end product concentrations (in millimolar) obtained using solid-state fermentation were higher than those using submerged-state fermentation. In contrast, the total fermentation products (in weight of product per weight of carbohydrates consumed) and switchgrass conversion were higher for submerged-state fermentation. The conversion of xylan was greater than that of glucan under both fermentation conditions. An initial pH of 7 and a moisture content of 80% resulted in maximum end product formation. A scanning electron microscopy study showed the presence of a biofilm formed by C. phytofermentans growing on switchgrass under submerged-state fermentation, whereas bacterial cells attached to the surface and no apparent biofilm was observed when grown under solid-state fermentation. To our knowledge, this is the first study reporting consolidated bioprocessing of a lignocellulosic substrate by a mesophilic anaerobic bacterium under solid-state fermentation conditions.

  10. Analytic Provenance Datasets: A Data Repository of Human Analysis Activity and Interaction Logs

    OpenAIRE

    Mohseni, Sina; Pachuilo, Andrew; Nirjhar, Ehsanul Haque; Linder, Rhema; Pena, Alyssa; Ragan, Eric D.

    2018-01-01

    We present an analytic provenance data repository that can be used to study human analysis activity, thought processes, and software interaction with visual analysis tools during exploratory data analysis. We conducted a series of user studies involving exploratory data analysis scenarios with textual and cyber security data. Interaction logs, think-alouds, videos and all coded data from this study are available online for research purposes. Analysis sessions are segmented into multiple sub-task ...

  11. Prognostic risk estimates of patients with multiple sclerosis and their physicians: comparison to an online analytical risk counseling tool.

    Directory of Open Access Journals (Sweden)

    Christoph Heesen

    Full Text Available BACKGROUND: Prognostic counseling in multiple sclerosis (MS is difficult because of the high variability of disease progression. Simultaneously, patients and physicians are increasingly confronted with making treatment decisions at an early stage, which requires taking individual prognoses into account to strike a good balance between benefits and harms of treatments. It is therefore important to understand how patients and physicians estimate prognostic risk, and whether and how these estimates can be improved. An online analytical processing (OLAP tool based on pooled data from placebo cohorts of clinical trials offers short-term prognostic estimates that can be used for individual risk counseling. OBJECTIVE: The aim of this study was to clarify if personalized prognostic information as presented by the OLAP tool is considered useful and meaningful by patients. Furthermore, we used the OLAP tool to evaluate patients' and physicians' risk estimates. Within this evaluation process we assessed short-time prognostic risk estimates of patients with MS (final n = 110 and their physicians (n = 6 and compared them with the estimates of OLAP. RESULTS: Patients rated the OLAP tool as understandable and acceptable, but to be only of moderate interest. It turned out that patients, physicians, and the OLAP tool ranked patients similarly regarding their risk of disease progression. Both patients' and physicians' estimates correlated most strongly with those disease covariates that the OLAP tool's estimates also correlated with most strongly. Exposure to the OLAP tool did not change patients' risk estimates. CONCLUSION: While the OLAP tool was rated understandable and acceptable, it was only of modest interest and did not change patients' prognostic estimates. The results suggest, however, that patients had some idea regarding their prognosis and which factors were most important in this regard. Future work with OLAP should assess long-term prognostic

  12. Prognostic risk estimates of patients with multiple sclerosis and their physicians: comparison to an online analytical risk counseling tool.

    Science.gov (United States)

    Heesen, Christoph; Gaissmaier, Wolfgang; Nguyen, Franziska; Stellmann, Jan-Patrick; Kasper, Jürgen; Köpke, Sascha; Lederer, Christian; Neuhaus, Anneke; Daumer, Martin

    2013-01-01

    Prognostic counseling in multiple sclerosis (MS) is difficult because of the high variability of disease progression. Simultaneously, patients and physicians are increasingly confronted with making treatment decisions at an early stage, which requires taking individual prognoses into account to strike a good balance between benefits and harms of treatments. It is therefore important to understand how patients and physicians estimate prognostic risk, and whether and how these estimates can be improved. An online analytical processing (OLAP) tool based on pooled data from placebo cohorts of clinical trials offers short-term prognostic estimates that can be used for individual risk counseling. The aim of this study was to clarify if personalized prognostic information as presented by the OLAP tool is considered useful and meaningful by patients. Furthermore, we used the OLAP tool to evaluate patients' and physicians' risk estimates. Within this evaluation process we assessed short-time prognostic risk estimates of patients with MS (final n = 110) and their physicians (n = 6) and compared them with the estimates of OLAP. Patients rated the OLAP tool as understandable and acceptable, but to be only of moderate interest. It turned out that patients, physicians, and the OLAP tool ranked patients similarly regarding their risk of disease progression. Both patients' and physicians' estimates correlated most strongly with those disease covariates that the OLAP tool's estimates also correlated with most strongly. Exposure to the OLAP tool did not change patients' risk estimates. While the OLAP tool was rated understandable and acceptable, it was only of modest interest and did not change patients' prognostic estimates. The results suggest, however, that patients had some idea regarding their prognosis and which factors were most important in this regard. Future work with OLAP should assess long-term prognostic estimates and clarify its usefulness for patients and physicians

  13. Design of a Ku band miniature multiple beam klystron

    Energy Technology Data Exchange (ETDEWEB)

    Bandyopadhyay, Ayan Kumar, E-mail: ayan.bandyopadhyay@gmail.com; Pal, Debasish; Kant, Deepender [Microwave Tubes Division, CSIR-CEERI, Pilani, Rajasthan-333031 (India); Saini, Anil; Saha, Sukalyan; Joshi, Lalit Mohan

    2016-03-09

    The design of a miniature multiple beam klystron (MBK) working in the Ku-band frequency range is presented in this article. Starting from the main design parameters, the designs of the electron gun, the input and output couplers, and the radio-frequency (RF) section are presented. The design methodology, using state-of-the-art commercial electromagnetic design tools, analytical formulae, and noncommercial design tools, is briefly described.

  14. FEASIBILITY OF INVESTMENT IN BUSINESS ANALYTICS

    Directory of Open Access Journals (Sweden)

    Mladen Varga

    2007-12-01

    Trends in data processing for decision support show that business users need business analytics, i.e. analytical applications which incorporate a variety of business-oriented data analysis techniques and task-specific knowledge. The paper discusses the feasibility of investment in two models of implementing business analytics: custom development and packaged analytical applications. The consequences of both models are illustrated with two examples of business analytics implementation in Croatia.

  15. A characterization of dilation-analytic operators

    International Nuclear Information System (INIS)

    Balslev, E.; Grossmann, A.; Paul, T.

    1986-01-01

    Dilation-analytic vectors and operators are characterized in a new representation of quantum mechanical states through functions analytic on the upper half-plane. In this space, H_0-bounded operators are integral operators, and criteria for dilation analyticity are given in terms of analytic continuation outside of the half-plane for functions and for kernels. A sufficient condition is given for an integral operator in momentum space to be dilation-analytic.

  16. Matisse: A Visual Analytics System for Exploring Emotion Trends in Social Media Text Streams

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; Drouhard, Margaret MEG G [ORNL; Beaver, Justin M [ORNL; Pyle, Joshua M [ORNL; BogenII, Paul L. [Google Inc.

    2015-01-01

    Dynamically mining textual information streams to gain real-time situational awareness is especially challenging with social media systems, where throughput and velocity properties push the limits of a static analytical approach. In this paper, we describe an interactive visual analytics system, called Matisse, that aids with the discovery and investigation of trends in streaming text. Matisse addresses the challenges inherent to text stream mining through the following technical contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) interactive coordinated visualizations, and (4) a flexible drill-down interaction scheme that accesses multiple levels of detail. In addition to positive/negative sentiment prediction, Matisse provides fine-grained emotion classification based on Valence, Arousal, and Dominance dimensions and a novel machine learning process. Information from the sentiment/emotion analytics is fused with raw data and summary information to feed temporal, geospatial, term frequency, and scatterplot visualizations using a multi-scale, coordinated interaction model. After describing these techniques, we conclude with a practical case study focused on analyzing the Twitter sample stream during the week of the 2013 Boston Marathon bombings. The case study demonstrates the effectiveness of Matisse at providing guided situational awareness of significant trends in social media streams by orchestrating computational power and human cognition.

  17. A Bioprocessed Polysaccharide from Lentinus edodes Mycelia Cultures with Turmeric Protects Chicks from a Lethal Challenge of Salmonella Gallinarum.

    Science.gov (United States)

    Han, Dalmuri; Lee, Hyung Tae; Lee, June Bong; Kim, Yongbaek; Lee, Sang Jong; Yoon, Jang Won

    2017-02-01

    Our previous studies demonstrated that a bioprocessed polysaccharide (BPP) isolated from Lentinus edodes mushroom mycelia cultures supplemented with black rice bran can protect mice against Salmonella lipopolysaccharide-induced endotoxemia and reduce the mortality from Salmonella Typhimurium infection through upregulated T-helper 1 immunity. Here, we report that a BPP from L. edodes mushroom mycelia liquid cultures supplemented with turmeric (referred to as BPP-turmeric) alters chicken macrophage responses against avian-adapted Salmonella Gallinarum and protects chicks against a lethal challenge from Salmonella Gallinarum. In vitro analyses revealed that the water extract of BPP-turmeric (i) changed the protein expression or secretion profile of Salmonella Gallinarum, although it was not bactericidal, (ii) reduced the phagocytic activity of the chicken-derived macrophage cell line HD-11 when infected with Salmonella Gallinarum, and (iii) significantly activated the transcription expression of interleukin (IL)-1β, IL-10, tumor necrosis factor α, and inducible nitric oxide synthase in response to various Salmonella infections, whereas it repressed that of IL-4, IL-6, interferon-β, and interferon-γ. We also found that BPP-turmeric (0.1 g/kg of feed) as a feed additive provided significant protection to 1-day-old chicks infected with a lethal dose of Salmonella Gallinarum. Collectively, these results imply that BPP-turmeric contains biologically active component(s) that protect chicks against Salmonella Gallinarum infection, possibly by regulating macrophage immune responses. Further studies are needed to evaluate the potential efficacy of BPP-turmeric as a livestock feed additive for the preharvest control of fowl typhoid or foodborne salmonellosis.

  18. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention

    Directory of Open Access Journals (Sweden)

    Samar Al-Hajj

    2017-09-01

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  19. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.

    Science.gov (United States)

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-09-12

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology-group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  20. Selection of power market structure using the analytic hierarchy process

    International Nuclear Information System (INIS)

    Subhes Bhattacharyya; Prasanta Kumar Dey

    2003-01-01

    Selection of a power market structure from the available alternatives is an important activity within an overall power sector reform program. The evaluation criteria for selection are both subjective as well as objective in nature and the selection of alternatives is characterised by their conflicting nature. This study demonstrates a methodology for power market structure selection using the analytic hierarchy process, a multiple attribute decision-making technique, to model the selection methodology with the active participation of relevant stakeholders in a workshop environment. The methodology is applied to a hypothetical case of a State Electricity Board reform in India. (author)
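
    A compact sketch of the AHP step at the heart of such a selection exercise: a pairwise comparison matrix on Saaty's 1-9 scale is reduced to a priority vector via its principal eigenvector, together with a consistency check. The judgment matrix below is a made-up example, not the workshop's actual comparisons.

```python
import numpy as np

# Hypothetical pairwise comparisons of three market structures on one criterion.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                         # priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)             # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]              # Saaty's random index
print("priorities:", weights.round(3))
print("consistency ratio:", round(ci / ri, 3))   # below ~0.1 is conventionally acceptable
```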

  1. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    Science.gov (United States)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models that have emerged in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: for example, they do not support real-time co-design, track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists, or capture and retrieve collaboration knowledge on workflow design (discussions that lead to a design). To address the aforementioned challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists in collaboratively designing and composing data analytics workflows through the Internet. Instead of reinventing the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  2. Advanced business analytics

    CERN Document Server

    Lev, Benjamin

    2015-01-01

    The book describes advanced business analytics and shows how to apply them to many different professional areas of engineering and management. Each chapter of the book is contributed by a different author and covers a different area of business analytics. The book connects the analytic principles with business practice and provides an interface between the main disciplines of engineering/technology and the organizational, administrative and planning abilities of management. It also refers to other disciplines such as economy, finance, marketing, behavioral economics and risk analysis. This book is of special interest to engineers, economists and researchers who are developing new advances in engineering management but also to practitioners working on this subject.

  3. Learning Analytics: Readiness and Rewards

    Science.gov (United States)

    Friesen, Norm

    2013-01-01

    This position paper introduces the relatively new field of learning analytics, first by considering the relevant meanings of both "learning" and "analytics," and then by looking at two main levels at which learning analytics can be or has been implemented in educational organizations. Although integrated turnkey systems or…

  4. ON THE POLAR CAP CASCADE PAIR MULTIPLICITY OF YOUNG PULSARS

    Energy Technology Data Exchange (ETDEWEB)

    Timokhin, A. N.; Harding, A. K., E-mail: andrey.timokhin@nasa.gov [Astrophysics Science Division, NASA/Goddard Space Flight Center, Greenbelt, MD 20771 (United States)

    2015-09-10

    We study the efficiency of pair production in polar caps of young pulsars under a variety of conditions to estimate the maximum possible multiplicity of pair plasma in pulsar magnetospheres. We develop a semi-analytic model for the calculation of cascade multiplicity that allows efficient exploration of the parameter space and corroborate it with direct numerical simulations. Pair creation processes are considered separately from particle acceleration in order to assess different factors affecting cascade efficiency, with acceleration of primary particles described by a recent self-consistent non-stationary model of pair cascades. We argue that the most efficient cascades operate in the curvature radiation/synchrotron regime and that the maximum multiplicity of pair plasma in pulsar magnetospheres is ∼ a few × 10⁵. The multiplicity of pair plasma in magnetospheres of young energetic pulsars weakly depends on the strength of the magnetic field and the radius of curvature of magnetic field lines and has a stronger dependence on pulsar inclination angle. This result questions assumptions about very high pair plasma multiplicity in theories of pulsar wind nebulae.

  5. Steady State Analysis of Stochastic Systems with Multiple Time Delays

    Science.gov (United States)

    Xu, W.; Sun, C. Y.; Zhang, H. Q.

    In this paper, attention is focused on the steady state analysis of a class of nonlinear dynamic systems with multi-delayed feedbacks driven by multiplicative correlated Gaussian white noises. The Fokker-Planck equations for delayed variables are at first derived by Novikov's theorem. Then, under small delay assumption, the approximate stationary solutions are obtained by the probability density approach. As a special case, the effects of multidelay feedbacks and the correlated additive and multiplicative Gaussian white noises on the response of a bistable system are considered. It is shown that the obtained analytical results are in good agreement with experimental results in Monte Carlo simulations.
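
    As a rough illustration of the kind of Monte Carlo check mentioned above, the sketch below integrates a generic bistable system with two delayed feedback terms and additive plus multiplicative Gaussian noise using an Euler-Maruyama scheme. The model form, parameters, and noise treatment are simplified assumptions for illustration only and do not reproduce the authors' equations.

        # Euler-Maruyama simulation of a bistable system with two feedback delays
        # (illustrative model only; not the system studied in the paper).
        import numpy as np

        dt, T = 1e-3, 50.0
        tau1, tau2 = 0.1, 0.3              # feedback delays
        g1, g2 = 0.2, 0.1                  # feedback gains
        D_add, D_mult = 0.05, 0.02         # additive / multiplicative noise intensities

        n = int(T / dt)
        d1, d2 = int(tau1 / dt), int(tau2 / dt)
        x = np.zeros(n)
        x[:max(d1, d2) + 1] = 1.0          # constant history on the delay interval
        rng = np.random.default_rng(0)

        for i in range(max(d1, d2), n - 1):
            drift = x[i] - x[i] ** 3 + g1 * x[i - d1] + g2 * x[i - d2]
            diffusion = np.sqrt(2 * D_add) + np.sqrt(2 * D_mult) * x[i]
            x[i + 1] = x[i] + drift * dt + diffusion * np.sqrt(dt) * rng.normal()

        # A histogram of the second half of the trajectory approximates the
        # stationary probability density, which can be compared with the
        # small-delay analytical result.
        pdf, edges = np.histogram(x[n // 2:], bins=60, density=True)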

  6. Non-destructive isotopic uranium assay by multiple delayed neutron measurements

    International Nuclear Information System (INIS)

    Papadopoulos, N.N.; Tsagas, N.F.

    1991-01-01

    The high accuracy and precision required in nuclear safeguards measurements can be achieved by an improved neutron activation technique based on multiple delayed fission neutron counting under various experimental conditions. To achieve the required ultrahigh counting statistics, cyclic activation of multiple subsamples has been applied. The home-made automated, flexible analytical system, with neutron flux and spectrum differentiation by irradiation position adjustment and cadmium screening, permits the non-destructive determination of the U235 abundance and the total U element concentration needed in nuclear safeguards sample analysis, with a high throughput and a low operational cost. Careful experimental optimization led to considerable improvement of the results.

  7. Pre-analytical and analytical factors influencing Alzheimer's disease cerebrospinal fluid biomarker variability.

    Science.gov (United States)

    Fourier, Anthony; Portelius, Erik; Zetterberg, Henrik; Blennow, Kaj; Quadrio, Isabelle; Perret-Liaudet, Armand

    2015-09-20

    A panel of cerebrospinal fluid (CSF) biomarkers including total Tau (t-Tau), phosphorylated Tau protein at residue 181 (p-Tau) and β-amyloid peptides (Aβ42 and Aβ40), is frequently used as an aid in Alzheimer's disease (AD) diagnosis for young patients with cognitive impairment, for predicting prodromal AD in mild cognitive impairment (MCI) subjects, for AD discrimination in atypical clinical phenotypes and for inclusion/exclusion and stratification of patients in clinical trials. Due to variability in absolute levels between laboratories, there is no consensus on medical cut-off value for the CSF AD signature. Thus, for full implementation of this core AD biomarker panel in clinical routine, this issue has to be solved. Variability can be explained both by pre-analytical and analytical factors. For example, the plastic tubes used for CSF collection and storage, the lack of reference material and the variability of the analytical protocols were identified as important sources of variability. The aim of this review is to highlight these pre-analytical and analytical factors and describe efforts done to counteract them in order to establish cut-off values for core CSF AD biomarkers. This review will give the current state of recommendations. Copyright © 2015. Published by Elsevier B.V.

  8. Division of Analytical Chemistry, 1998

    DEFF Research Database (Denmark)

    Hansen, Elo Harald

    1999-01-01

    The article recounts the 1998 activities of the Division of Analytical Chemistry (DAC, formerly the Working Party on Analytical Chemistry, WPAC), which is a division of the Federation of European Chemical Societies (FECS). Elo Harald Hansen is the Danish delegate, representing the Danish Chemical Society/The Society for Analytical Chemistry.

  9. Cascades of bioreactors

    NARCIS (Netherlands)

    Gooijer, de C.D.

    1995-01-01

    In this thesis a common phenomenon in bioprocess engineering is described: the execution of a certain bioprocess in more than one bioreactor. Chapter 1, a review, classifies bioprocesses by means of a number of characteristics:
    i) processes with a variable

  10. The Space-Time Cube as part of a GeoVisual Analytics Environment to support the understanding of movement data

    DEFF Research Database (Denmark)

    Kveladze, Irma; Kraak, M. J.; van Elzakker, C. P. J. M.

    2015-01-01

    This paper reports the results of an empirical usability experiment on the performance of the space-time cube in a GeoVisual analytics environment. It was developed to explore movement data based on the requirements of human geographers. The interactive environment consists of multiple coordinated...

  11. Analytics for managers with Excel

    CERN Document Server

    Bell, Peter C

    2013-01-01

    Analytics is one of a number of terms which are used to describe a data-driven, more scientific approach to management. Ability in analytics is an essential management skill: knowledge of data and analytics helps the manager to analyze decision situations, prevent problem situations from arising, identify new opportunities, and often enables many millions of dollars to be added to the bottom line for the organization. The objective of this book is to introduce analytics from the perspective of the general manager of a corporation. Rather than examine the details or attempt an encyclopaedic review…

  12. Analytical Thinking, Analytical Action: Using Prelab Video Demonstrations and e-Quizzes to Improve Undergraduate Preparedness for Analytical Chemistry Practical Classes

    Science.gov (United States)

    Jolley, Dianne F.; Wilson, Stephen R.; Kelso, Celine; O'Brien, Glennys; Mason, Claire E.

    2016-01-01

    This project utilizes visual and critical thinking approaches to develop a higher-education synergistic prelab training program for a large second-year undergraduate analytical chemistry class, directing more of the cognitive learning to the prelab phase. This enabled students to engage in more analytical thinking prior to engaging in the…

  13. Future analytical provision - Relocation of Sellafield Ltd Analytical Services Laboratory

    International Nuclear Information System (INIS)

    Newell, B.

    2015-01-01

    Sellafield Ltd Analytical Services provides an essential view on the environmental, safety, process and high hazard risk reduction performances by analysis of samples. It is the largest and most complex analytical services laboratory in Europe, with 150 laboratories (55 operational) and 350 staff (including 180 analysts). Sellafield Ltd Analytical Services Main Laboratory is in need of replacement. This is due to the age of the facility and changes to work streams. This relocation is an opportunity to design and commission bespoke MA (Medium-Active) cells, to modify the HA (High-Active) cell design to facilitate an in-cell laboratory, to develop non-destructive techniques, and to provide an open, light building for better worker morale. The option chosen was to move the activities to the NNL Central Laboratory (NNLCL), which is based at Sellafield and is the UK's flagship nuclear research and development facility. This poster gives a time schedule.

  14. Transcutaneous Measurement of Blood Analyte Concentration Using Raman Spectroscopy

    Science.gov (United States)

    Barman, Ishan; Singh, Gajendra P.; Dasari, Ramachandra R.; Feld, Michael S.

    2008-11-01

    Diabetes mellitus is a chronic disorder, affecting nearly 200 million people worldwide. Acute complications, such as hypoglycemia, cardiovascular disease and retinal damage, may occur if the disease is not adequately controlled. As diabetes has no known cure, tight control of glucose levels is critical for the prevention of such complications. Given the necessity for regular monitoring of blood glucose, development of non-invasive glucose detection devices is essential to improve the quality of life in diabetic patients. The commercially available glucose sensors measure the interstitial fluid glucose by electrochemical detection. However, these sensors have severe limitations, primarily related to their invasive nature and lack of stability. This necessitates the development of a truly non-invasive glucose detection technique. NIR Raman Spectroscopy, which combines the substantial penetration depth of NIR light with the excellent chemical specificity of Raman spectroscopy, provides an excellent tool to meet the challenges involved. Additionally, it enables simultaneous determination of multiple blood analytes. Our laboratory has pioneered the use of Raman spectroscopy for blood analytes' detection in biological media. The preliminary success of our non-invasive glucose measurements both in vitro (such as in serum and blood) and in vivo has provided the foundation for the development of feasible clinical systems. However, successful application of this technology still faces a few hurdles, highlighted by the problems of tissue luminescence and selection of appropriate reference concentration. In this article we explore possible avenues to overcome these challenges so that prospective prediction accuracy of blood analytes can be brought to clinically acceptable levels.
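
    Quantitative prediction of analyte concentration from Raman spectra is typically done with a multivariate calibration such as partial least squares (PLS) regression. The sketch below shows the general idea on synthetic spectra; it is a hedged illustration of the calibration step, not the authors' actual processing pipeline, and all band positions and concentrations are invented.

        # PLS calibration of a glucose-like band in synthetic Raman spectra.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        shift = np.linspace(400, 1800, 700)                 # Raman shift axis (cm^-1)
        conc = rng.uniform(2, 20, 80)                       # hypothetical concentrations (mM)
        band = np.exp(-0.5 * ((shift - 1125) / 15) ** 2)    # glucose-like band
        baseline = np.outer(rng.uniform(0.5, 1.5, 80), 1e-6 * shift ** 2 - 1e-3 * shift + 1.0)
        X = np.outer(conc, band) + baseline + rng.normal(0, 0.05, (80, 700))

        X_tr, X_te, y_tr, y_te = train_test_split(X, conc, random_state=0)
        pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
        rmsep = np.sqrt(np.mean((pls.predict(X_te).ravel() - y_te) ** 2))
        print(f"RMSEP: {rmsep:.2f} mM")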

  15. Association of autoimmune hepatitis and multiple sclerosis: a coincidence?

    Directory of Open Access Journals (Sweden)

    Marta Sofia Mendes Oliveira

    2015-09-01

    Autoimmune hepatitis is a chronic liver inflammation resulting from deregulation of immune tolerance mechanisms. Multiple sclerosis is also an inflammatory disease, in which the insulating covers of nerve cells in the brain and spinal cord are damaged. Here we present the case of an 18-year-old female with multiple sclerosis who was treated with glatiramer acetate and with interferon beta 1a at our hospital. Seven months after initiating treatment, liver dysfunction occurred. Clinical and laboratory findings were suggestive of drug-induced hepatitis, which led to discontinuation of treatment with interferon. Facing a new episode of acute hepatitis one year later, she underwent a liver biopsy, and the analysis of autoantibodies was positive for smooth muscle antibodies. Given the diagnosis of autoimmune hepatitis, she started therapy with prednisolone and azathioprine, with a good clinical and analytical response. In addition, the demyelinating lesions of multiple sclerosis regressed. In conclusion, only a few cases describe the association of autoimmune hepatitis with multiple sclerosis, and it is possible that both diseases share the same autoimmune inflammatory origin.

  16. A fluorescence anisotropy method for measuring protein concentration in complex cell culture media.

    Science.gov (United States)

    Groza, Radu Constantin; Calvet, Amandine; Ryder, Alan G

    2014-04-22

    The rapid, quantitative analysis of the complex cell culture media used in biopharmaceutical manufacturing is of critical importance. Requirements for cell culture media composition profiling, or changes in specific analyte concentrations (e.g. amino acids in the media or product protein in the bioprocess broth) often necessitate the use of complicated analytical methods and extensive sample handling. Rapid spectroscopic methods like multi-dimensional fluorescence (MDF) spectroscopy have been successfully applied for the routine determination of compositional changes in cell culture media and bioprocess broths. Quantifying macromolecules in cell culture media is a specific challenge as there is a need to implement measurements rapidly on the prepared media. However, the use of standard fluorescence spectroscopy is complicated by the emission overlap from many media components. Here, we demonstrate how combining anisotropy measurements with standard total synchronous fluorescence spectroscopy (TSFS) provides a rapid, accurate quantitation method for cell culture media. Anisotropy provides emission resolution between large and small fluorophores while TSFS provides a robust measurement space. Model cell culture media was prepared using yeastolate (2.5 mg mL⁻¹) spiked with bovine serum albumin (0 to 5 mg mL⁻¹). Using this method, protein emission is clearly discriminated from background yeastolate emission, allowing for accurate bovine serum albumin (BSA) quantification over a 0.1 to 4.0 mg mL⁻¹ range with a limit of detection (LOD) of 13.8 μg mL⁻¹. Copyright © 2014. Published by Elsevier B.V.
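
    A typical way such a quantification range and limit of detection are established is through a linear calibration of the resolved protein signal against spiked concentration, with the LOD taken as roughly 3.3 times the blank-level noise divided by the slope. The numbers below are invented placeholders used only to show the arithmetic; they are not the study's data.

        # Linear calibration and limit-of-detection estimate (hypothetical values).
        import numpy as np

        conc = np.array([0.0, 0.1, 0.5, 1.0, 2.0, 4.0])          # BSA, mg/mL
        signal = np.array([0.02, 0.09, 0.41, 0.83, 1.64, 3.30])  # resolved protein emission (a.u.)

        slope, intercept = np.polyfit(conc, signal, 1)
        residual_sd = (signal - (slope * conc + intercept)).std(ddof=2)

        lod = 3.3 * residual_sd / slope       # common 3.3*sigma/slope criterion
        print(f"slope = {slope:.3f} a.u. per mg/mL, LOD ≈ {lod * 1000:.0f} µg/mL")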

  17. Quality Indicators for Learning Analytics

    Science.gov (United States)

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a means to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  18. Protection of toroidal field coils using multiple circuits

    International Nuclear Information System (INIS)

    Thome, R.J.; Langton, W.G.; Mann, W.R.; Pillsbury, R.D.; Tarrh, J.M.

    1983-01-01

    The protection of toroidal field (TF) coils using multiple circuits is described. The discharge of a single-circuit TF system is given for purposes of definition. Two-circuit TF systems are analyzed and the results presented analytically and graphically. Induced currents, maximum discharge voltages, and discharge time constants are compared to the single-circuit system. Three-circuit TF systems are analyzed. In addition to induced currents, maximum discharge voltages, and time constants, several different discharge scenarios are included. The impacts of having discharge rates versus final maximum coil temperatures as requirements are examined. The out-of-plane forces which occur in the three-circuit system are analyzed using an approximate model. The analysis of multiple-circuit TF systems is briefly described and results for a Toroidal Fusion Core Experiment (TFCX)-scale device are given based on computer analysis. The advantages and disadvantages of using multiple-circuit systems are summarized and discussed. The primary disadvantages of multiple circuits are the increased circuit complexity and the potential for out-of-plane forces. These are offset by the substantial reduction in maximum discharge voltages, as well as other design options which become available when using multiple circuits.

  19. Synthesis for robust synchronization of chaotic systems under output feedback control with multiple random delays

    International Nuclear Information System (INIS)

    Wen Guilin; Wang Qingguo; Lin Chong; Han Xu; Li Guangyao

    2006-01-01

    Synchronization under output feedback control with multiple random time delays is studied using a paradigm of nonlinear physics, Chua's circuit. Compared with other synchronization control methods, output feedback control with multiple random delays is better suited to realistic synchronization applications such as secure communications. A sufficient condition for global stability of delay-dependent synchronization is established based on the LMI technique. Numerical simulations fully support the analytical approach, in spite of the random delays.

  20. Analytics: What We're Hearing

    Science.gov (United States)

    Oblinger, Diana

    2012-01-01

    Over the last few months, EDUCAUSE has been focusing on analytics. As people hear from experts, meet with association members, and watch the marketplace evolve, a number of common themes are emerging. Conversations have shifted from "What is analytics?" to "How do we get started, and how do we use analytics well?" What people are hearing from…

  1. Analytical performance specifications for external quality assessment - definitions and descriptions.

    Science.gov (United States)

    Jones, Graham R D; Albarede, Stephanie; Kesseler, Dagmar; MacKenzie, Finlay; Mammen, Joy; Pedersen, Morten; Stavelin, Anne; Thelen, Marc; Thomas, Annette; Twomey, Patrick J; Ventura, Emma; Panteghini, Mauro

    2017-06-27

    External Quality Assurance (EQA) is vital to ensure acceptable analytical quality in medical laboratories. A key component of an EQA scheme is an analytical performance specification (APS) for each measurand that a laboratory can use to assess the extent of deviation of the obtained results from the target value. A consensus conference held in Milan in 2014 has proposed three models to set APS and these can be applied to setting APS for EQA. A goal arising from this conference is the harmonisation of EQA APS between different schemes to deliver consistent quality messages to laboratories irrespective of location and the choice of EQA provider. At this time there are wide differences in the APS used in different EQA schemes for the same measurands. Contributing factors to this variation are that the APS in different schemes are established using different criteria, applied to different types of data (e.g. single data points, multiple data points), used for different goals (e.g. improvement of analytical quality; licensing), and with the aim of eliciting different responses from participants. This paper provides recommendations from the European Federation of Laboratory Medicine (EFLM) Task and Finish Group on Performance Specifications for External Quality Assurance Schemes (TFG-APSEQA) and on clear terminology for EQA APS. The recommended terminology covers six elements required to understand APS: 1) a statement on the EQA material matrix and its commutability; 2) the method used to assign the target value; 3) the data set to which APS are applied; 4) the applicable analytical property being assessed (i.e. total error, bias, imprecision, uncertainty); 5) the rationale for the selection of the APS; and 6) the type of the Milan model(s) used to set the APS. The terminology is required for EQA participants and other interested parties to understand the meaning of meeting or not meeting APS.

  2. Vibration of a string against multiple spring-mass-damper stoppers

    Science.gov (United States)

    Shin, Ji-Hwan; Talib, Ezdiani; Kwak, Moon K.

    2018-02-01

    When a building sways due to strong wind or an earthquake, the elevator rope can undergo resonance, resulting in collision with the hoist-way wall. In this study, a hard stopper and a soft stopper comprised of a spring-mass-damper system installed along the hoist-way wall were considered to prevent the string from undergoing excessive vibrations. The collision of the string with multiple hard stoppers and multiple spring-mass-damper stoppers was investigated using an analytical method. The result revealed new formulas and computational algorithms that are suitable for simulating the vibration of the string against multiple stoppers. The numerical results show that the spring-mass-damper stopper is more effective in suppressing the vibrations of the string and reducing structural failure. The proposed algorithms were shown to be efficient to simulate the motion of the string against a vibration stopper.

  3. Group Analytic Psychotherapy in Brazil.

    Science.gov (United States)

    Penna, Carla; Castanho, Pablo

    2015-10-01

    Group analytic practice in Brazil began quite early. Highly influenced by the Argentinean Pichon-Rivière, it enjoyed a major development from the 1950s to the early 1980s. Beginning in the 1970s, different factors undermined its development and eventually led to its steep decline. From the mid 1980s on, the number of people looking for either group analytic psychotherapy or group analytic training decreased considerably. Group analytic psychotherapy societies struggled to survive and most of them had to close their doors in the 1990s and the following decade. Psychiatric reform and the new public health system have stimulated a new demand for groups in Brazil. Developments in the public and not-for-profit sectors, combined with theoretical and practical research in universities, present promising new perspectives for group analytic psychotherapy in Brazil nowadays.

  4. A Case Study of Resources Management Planning with Multiple Objectives and Projects

    Science.gov (United States)

    David L. Peterson; David G. Silsbee; Daniel L. Schmoldt

    1995-01-01

    Each National Park Service unit in the United States produces a resources management plan (RMP) every four years or less. The plans commit budgets and personnel to specific projects for four years, but they are prepared with little quantitative and analytical rigor and without formal decisionmaking tools. We have previously described a multiple objective planning...

  5. A multiple-choice knapsack based algorithm for CDMA downlink rate differentiation under uplink coverage restrictions

    NARCIS (Netherlands)

    Endrayanto, A.I.; Bumb, A.F.; Boucherie, Richardus J.

    2004-01-01

    This paper presents an analytical model for downlink rate allocation in Code Division Multiple Access (CDMA) mobile networks. By discretizing the coverage area into small segments, the transmit power requirements are characterized via a matrix representation that separates user and system

  6. The Analytical Hierarchy Process

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn

    2007-01-01

    The technical note gathers the theory behind the Analytical Hierarchy Process (AHP) and presents its advantages and disadvantages in practical use.

  7. Analytical Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Analytical Lab specializes in Oil and Hydraulic Fluid Analysis, Identification of Unknown Materials, Engineering Investigations, Qualification Testing (to support...

  8. Learning analytics dashboard applications

    NARCIS (Netherlands)

    Verbert, K.; Duval, E.; Klerkx, J.; Govaerts, S.; Santos, J.L.

    2013-01-01

    This article introduces learning analytics dashboards that visualize learning traces for learners and teachers. We present a conceptual framework that helps to analyze learning analytics applications for these kinds of users. We then present our own work in this area and compare with 15 related

  9. Hanford Analytical Services Management: One of the keys to effectively managing the Hanford Site in an environment of competing resources and priorities

    International Nuclear Information System (INIS)

    Wanek, D.M.; Mooers, G.C.; Schubert, S.A.

    1994-02-01

    The Quality Improvement Team recognized that a true partnership between RL and the Hanford Site contractors had to be established to (1) identify what the analytical needs were for the site, both short and long term, (2) determine how to meet those needs, whether by using onsite capability or contracting offsite services, and (3) ensure that all analytical services meet the high level of quality demanded by the end users of the data. The Hanford Analytical Services Management (HASM) organization was established from this concept. What makes HASM unique and virtually guarantees success is that all the participants within HASM, site contractors and RL, have parity. This ensures that the best interests of the Hanford Site are implemented and minimizes the normal parochialism when multiple contractors are competing for the same work. The HASM concept provides for consistent management to balance the analytical needs with the limited resources identified for analytical services at the Hanford Site. By contracting for analytical services, HASM provides a mechanism to meet site goals of increased commercialization

  10. Information theory in analytical chemistry

    National Research Council Canada - National Science Library

    Eckschlager, Karel; Danzer, Klaus

    1994-01-01

    Contents: The aim of analytical chemistry - Basic concepts of information theory - Identification of components - Qualitative analysis - Quantitative analysis - Multicomponent analysis - Optimum analytical...

  11. An approach to estimate spatial distribution of analyte within cells using spectrally-resolved fluorescence microscopy

    Science.gov (United States)

    Sharma, Dharmendar Kumar; Irfanullah, Mir; Basu, Santanu Kumar; Madhu, Sheri; De, Suman; Jadhav, Sameer; Ravikanth, Mangalampalli; Chowdhury, Arindam

    2017-03-01

    While fluorescence microscopy has become an essential tool amongst chemists and biologists for the detection of various analyte within cellular environments, non-uniform spatial distribution of sensors within cells often restricts extraction of reliable information on relative abundance of analytes in different subcellular regions. As an alternative to existing sensing methodologies such as ratiometric or FRET imaging, where relative proportion of analyte with respect to the sensor can be obtained within cells, we propose a methodology using spectrally-resolved fluorescence microscopy, via which both the relative abundance of sensor as well as their relative proportion with respect to the analyte can be simultaneously extracted for local subcellular regions. This method is exemplified using a BODIPY sensor, capable of detecting mercury ions within cellular environments, characterized by spectral blue-shift and concurrent enhancement of emission intensity. Spectral emission envelopes collected from sub-microscopic regions allowed us to compare the shift in transition energies as well as integrated emission intensities within various intracellular regions. Construction of a 2D scatter plot using spectral shifts and emission intensities, which depend on the relative amount of analyte with respect to sensor and the approximate local amounts of the probe, respectively, enabled qualitative extraction of relative abundance of analyte in various local regions within a single cell as well as amongst different cells. Although the comparisons remain semi-quantitative, this approach involving analysis of multiple spectral parameters opens up an alternative way to extract spatial distribution of analyte in heterogeneous systems. The proposed method would be especially relevant for fluorescent probes that undergo relatively nominal shift in transition energies compared to their emission bandwidths, which often restricts their usage for quantitative ratiometric imaging in
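
    The two-parameter analysis described above can be pictured with a short script: for each subcellular region, integrate the emission intensity and compute a spectral centroid, then place the regions on a 2D scatter plot. The synthetic spectra and the assumed blue-shift with analyte level below are illustrative placeholders, not the BODIPY data from the study.

        # 2D scatter of spectral centroid vs. integrated intensity per region.
        import numpy as np
        import matplotlib.pyplot as plt

        wavelength = np.linspace(500, 620, 200)            # nm
        rng = np.random.default_rng(2)

        def region_spectrum(analyte_level, probe_amount):
            center = 560 - 10 * analyte_level              # assumed blue-shift with analyte
            peak = probe_amount * np.exp(-0.5 * ((wavelength - center) / 12.0) ** 2)
            return peak + rng.normal(0, 0.01, wavelength.size)

        spectra = [region_spectrum(f, a) for f, a in
                   zip(rng.uniform(0, 1, 50), rng.uniform(0.2, 1.0, 50))]

        intensity = [s.sum() for s in spectra]
        centroid = [np.sum(wavelength * np.clip(s, 0, None)) / np.clip(s, 0, None).sum()
                    for s in spectra]

        plt.scatter(centroid, intensity)
        plt.xlabel("spectral centroid (nm)")
        plt.ylabel("integrated intensity (a.u.)")
        plt.show()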

  12. Analytic Hypoellipticity and the Treves Conjecture

    Directory of Open Access Journals (Sweden)

    Marco Mughetti

    2016-12-01

    We are concerned with the problem of analytic hypoellipticity; precisely, we focus on the real analytic regularity of the solutions of sums of squares with real analytic coefficients. The Treves conjecture states that an operator of this type is analytic hypoelliptic if and only if all the strata in the Poisson-Treves stratification are symplectic. We discuss a model operator, P (first introduced and studied in [3]), having a single symplectic stratum, and prove that it is not analytic hypoelliptic. This yields a counterexample to the sufficient part of the Treves conjecture; the necessary part is still an open problem.

  13. Signals: Applying Academic Analytics

    Science.gov (United States)

    Arnold, Kimberly E.

    2010-01-01

    Academic analytics helps address the public's desire for institutional accountability with regard to student success, given the widespread concern over the cost of higher education and the difficult economic and budgetary conditions prevailing worldwide. Purdue University's Signals project applies the principles of analytics widely used in…

  14. Analytical Chemistry and Measurement Science: (What Has DOE Done for Analytical Chemistry?)

    Science.gov (United States)

    Shults, W. D.

    1989-04-01

    Over the past forty years, analytical scientists within the DOE complex have had a tremendous impact on the field of analytical chemistry. This paper suggests six "high impact" research/development areas that either originated within or were brought to maturity within the DOE laboratories. "High impact" means they lead to new subdisciplines or to new ways of doing business.

  15. Double-multiple streamtube model for studying vertical-axis wind turbines

    Science.gov (United States)

    Paraschivoiu, Ion

    1988-08-01

    This work describes the present state of the art of the double-multiple streamtube method for modeling the Darrieus-type vertical-axis wind turbine (VAWT). Comparisons of the analytical results with other predictions and available experimental data show good agreement. This method, which incorporates dynamic-stall and secondary effects, can be used for generating a suitable aerodynamic-load model for structural design analysis of the Darrieus rotor.

  16. Perioperative and ICU Healthcare Analytics within a Veterans Integrated System Network: a Qualitative Gap Analysis.

    Science.gov (United States)

    Mudumbai, Seshadri; Ayer, Ferenc; Stefanko, Jerry

    2017-08-01

    Health care facilities are implementing analytics platforms as a way to document quality of care. However, few gap analyses exist on platforms specifically designed for patients treated in the Operating Room, Post-Anesthesia Care Unit, and Intensive Care Unit (ICU). As part of a quality improvement effort, we undertook a gap analysis of an existing analytics platform within the Veterans Healthcare Administration. The objectives were to identify themes associated with 1) current clinical use cases and stakeholder needs; 2) information flow and pain points; and 3) recommendations for future analytics development. Methods consisted of semi-structured interviews in 2 phases with a diverse set (n = 9) of support personnel and end users from five facilities across a Veterans Integrated Service Network. Phase 1 identified underlying needs and previous experiences with the analytics platform across various roles and operational responsibilities. Phase 2 validated preliminary feedback, lessons learned, and recommendations for improvement. Emerging themes suggested that the existing system met a small pool of national reporting requirements. However, pain points were identified with accessing data in several information system silos and performing multiple manual validation steps of data content. Notable recommendations included enhancing systems integration to create "one-stop shopping" for data, and developing a capability to perform trends analysis. Our gap analysis suggests that analytics platforms designed for surgical and ICU patients should employ approaches similar to those being used for primary care patients.

  17. Multivariate curve resolution applied to infrared reflection measurements of soil contaminated with an organophosphorus analyte.

    Science.gov (United States)

    Gallagher, Neal B; Blake, Thomas A; Gassman, Paul L; Shaver, Jeremy M; Windig, Willem

    2006-07-01

    Multivariate curve resolution (MCR) is a powerful technique for extracting chemical information from measured spectra of complex mixtures. A modified MCR technique that utilized both measured and second-derivative spectra to account for observed sample-to-sample variability attributable to changes in soil reflectivity was used to estimate the spectrum of dibutyl phosphate (DBP) adsorbed on two different soil types. This algorithm was applied directly to measurements of reflection spectra of soils coated with analyte without resorting to soil preparations such as grinding or dilution in potassium bromide. The results provided interpretable spectra that can be used to guide strategies for detection and classification of organic analytes adsorbed on soil. Comparisons to the neat DBP liquid spectrum showed that the recovered analyte spectra from both soils showed spectral features from methyl, methylene, hydroxyl, and P=O functional groups, but most conspicuous was the absence of the strong PO–(CH₂)₃CH₃ stretch absorption at 1033 cm⁻¹.
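
    The generic form of the technique named above, multivariate curve resolution by alternating least squares (MCR-ALS) with non-negativity constraints, is sketched below on synthetic data. The second-derivative modification used in the study is not reproduced; this is only the baseline algorithm.

        # Minimal MCR-ALS sketch: factor D (samples x wavenumbers) into C @ S.T
        # with non-negative concentration profiles C and spectra S.
        import numpy as np

        def mcr_als(D, C0, n_iter=200):
            C = C0.copy()
            for _ in range(n_iter):
                S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)    # spectra
                C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)  # concentrations
            return C, S

        # Synthetic two-component mixture data and a rough initial guess for C.
        rng = np.random.default_rng(3)
        true_S = np.abs(rng.normal(size=(300, 2)))
        true_C = np.abs(rng.normal(size=(20, 2)))
        D = true_C @ true_S.T + rng.normal(0, 0.01, (20, 300))
        C, S = mcr_als(D, np.abs(rng.normal(size=(20, 2))))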

  18. Quasiperiodic one-dimensional photonic crystals with adjustable multiple photonic bandgaps.

    Science.gov (United States)

    Vyunishev, Andrey M; Pankin, Pavel S; Svyakhovskiy, Sergey E; Timofeev, Ivan V; Vetrov, Stepan Ya

    2017-09-15

    We propose an elegant approach to produce photonic bandgap (PBG) structures with multiple photonic bandgaps by constructing quasiperiodic photonic crystals (QPPCs) composed of a superposition of photonic lattices with different periods. Generally, QPPC structures exhibit both aperiodicity and multiple PBGs due to their long-range order. They are described by a simple analytical expression, instead of quasiperiodic tiling approaches based on substitution rules. Here we describe the optical properties of QPPCs exhibiting two PBGs that can be tuned independently. PBG interband spacing and its depth can be varied by choosing appropriate reciprocal lattice vectors and their amplitudes. These effects are confirmed by the proof-of-concept measurements made for the porous silicon-based QPPC of the appropriate design.
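
    The superposition idea can be written compactly. A generic two-period form of the refractive-index modulation, assumed here purely for illustration (the study's exact design expression may differ), is, in LaTeX notation:

        n(z) = \bar{n} + \Delta n_1 \cos\!\left(\frac{2\pi z}{d_1}\right)
                       + \Delta n_2 \cos\!\left(\frac{2\pi z}{d_2}\right),
        \qquad \lambda_i \approx 2\,\bar{n}\, d_i

    Each harmonic with period d_i opens a photonic bandgap near its Bragg wavelength λ_i, so the spectral positions of the two gaps are set independently by d_1 and d_2, while their depths scale with Δn_1 and Δn_2, consistent with the independent tuning described above.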

  19. Bessel Fourier Orientation Reconstruction (BFOR): An Analytical Diffusion Propagator Reconstruction for Hybrid Diffusion Imaging and Computation of q-Space Indices

    Science.gov (United States)

    Hosseinbor, A. Pasha; Chung, Moo K.; Wu, Yu-Chien; Alexander, Andrew L.

    2012-01-01

    The ensemble average propagator (EAP) describes the 3D average diffusion process of water molecules, capturing both its radial and angular contents. The EAP can thus provide richer information about complex tissue microstructure properties than the orientation distribution function (ODF), an angular feature of the EAP. Recently, several analytical EAP reconstruction schemes for multiple q-shell acquisitions have been proposed, such as diffusion propagator imaging (DPI) and spherical polar Fourier imaging (SPFI). In this study, a new analytical EAP reconstruction method is proposed, called Bessel Fourier orientation reconstruction (BFOR), whose solution is based on heat equation estimation of the diffusion signal for each shell acquisition, and is validated on both synthetic and real datasets. A significant portion of the paper is dedicated to comparing BFOR, SPFI, and DPI using hybrid, non-Cartesian sampling for multiple b-value acquisitions. Ways to mitigate the effects of Gibbs ringing on EAP reconstruction are also explored. In addition to analytical EAP reconstruction, the aforementioned modeling bases can be used to obtain rotationally invariant q-space indices of potential clinical value, an avenue which has not yet been thoroughly explored. Three such measures are computed: zero-displacement probability (Po), mean squared displacement (MSD), and generalized fractional anisotropy (GFA). PMID:22963853

  20. Analysis of a production/inventory system with multiple retailers

    OpenAIRE

    Noblesse, Ann; Boute, Robert; Lambrecht, Marc; Van Houdt, B.

    2014-01-01

    We study a production/inventory system with one manufacturing plant and multiple retailers. Production lead times at the plant are stochastic and endogenously determined by the orders placed by the different retailers. Assuming stochastic (phase-type distributed) production and setup times, we make use of matrix analytic techniques to develop a queuing model that is capable to compute the distribution of the time orders spend in the production facility, depending on the retailer’s lot s...

  1. Organizational Models for Big Data and Analytics

    Directory of Open Access Journals (Sweden)

    Robert L. Grossman

    2014-04-01

    In this article, we introduce a framework for determining how analytics capability should be distributed within an organization. Our framework stresses the importance of building a critical mass of analytics staff, centralizing or decentralizing the analytics staff to support business processes, and establishing an analytics governance structure to ensure that analytics processes are supported by the organization as a whole.

  2. Reconfiguring Urban Sustainability Transitions, Analysing Multiplicity

    Directory of Open Access Journals (Sweden)

    Mike Hodson

    2017-02-01

    Cities, and the networked infrastructures that sustain urban life, are seen as crucial sites for creating more sustainable futures. Yet, although there are many plans, the realisation of sustainable urban infrastructures on the ground is uneven. To develop better ways of understanding why this is the case, the paper makes a conceptual contribution by engaging with current understanding of urban sustainability transitions, using urban sustainable mobility as a reference point. It extends these insights to argue that urban transitions are not about technological or social innovation per se, but about how multiple innovations are experimented with, combined and reconfigured in existing urban contexts and how such processes are governed. There are potentially many ways in which urban sustainable mobility can be reconfigured contextually. Innovation is in the particular form of reconfiguration rather than individual technologies. To make analytical sense of this multiplicity, a preliminary framework is developed that offers the potential to think about urban transitions as contextual and reconfigurational. We argue that there is a need to embrace multiplicity and to understand its relationships to forms of reconfiguration, through empirical exploration and further theoretical and conceptual development. The preliminary framework is a contribution to doing so and we set out future directions for research.

  3. Double-multiple streamtube model for Darrieus wind turbines

    Science.gov (United States)

    Paraschivoiu, I.

    1981-01-01

    An analytical model is proposed for calculating the rotor performance and aerodynamic blade forces for Darrieus wind turbines with curved blades. The method of analysis uses a multiple-streamtube model, divided into two parts: one modeling the upstream half-cycle of the rotor and the other, the downstream half-cycle. The upwind and downwind components of the induced velocities at each level of the rotor were obtained using the principle of two actuator disks in tandem. Variation of the induced velocities in the two parts of the rotor produces larger forces in the upstream zone and smaller forces in the downstream zone. Comparisons of the overall rotor performance with previous methods and field test data show the important improvement obtained with the present model. The calculations were made using the computer code CARDAA developed at IREQ. The double-multiple streamtube model presented has two major advantages: it requires a much shorter computer time than the three-dimensional vortex model and is more accurate than the multiple-streamtube model in predicting the aerodynamic blade loads.
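
    The bookkeeping behind the "two actuator disks in tandem" idea is shown schematically below: the downstream half of the rotor sees not the free stream but the equilibrium wake velocity of the upstream half, which is why the model predicts larger loads upwind. The interference factors here are assumed inputs; a full double-multiple streamtube code would solve for them iteratively from a blade-element/momentum balance in every streamtube.

        # Tandem actuator-disk velocities underlying the DMS model (schematic only).
        def tandem_disk_velocities(v_inf, a_up, a_dw):
            v_up = v_inf * (1.0 - a_up)        # flow through the upstream half-disk
            v_e = v_inf * (1.0 - 2.0 * a_up)   # equilibrium velocity between the disks
            v_dw = v_e * (1.0 - a_dw)          # flow through the downstream half-disk
            return v_up, v_e, v_dw

        # Even with equal interference factors the downstream half sees slower flow,
        # hence smaller aerodynamic forces than the upstream half.
        print(tandem_disk_velocities(v_inf=10.0, a_up=0.15, a_dw=0.15))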

  4. Analytical spectroscopy. Analytical Chemistry Symposia Series, Volume 19

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1984-01-01

    This book contains papers covering several fields in analytical chemistry including lasers, mass spectrometry, inductively coupled plasma, activation analysis and emission spectroscopy. Separate abstracting and indexing was done for 64 papers in this book

  5. Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery

    Science.gov (United States)

    Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.

    2017-12-01

    Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we've developed a suite of analytical tools to support an integrated data driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.

  6. Application of holographic sub-wavelength diffraction gratings for monitoring of kinetics of bioprocesses

    Science.gov (United States)

    Tamulevičius, Tomas; Šeperys, Rimas; Andrulevičius, Mindaugas; Kopustinskas, Vitoldas; Meškinis, Šarūnas; Tamulevičius, Sigitas; Mikalayeva, Valeryia; Daugelavičius, Rimantas

    2012-09-01

    In this work we present a refractive index (RI) sensor based on a sub-wavelength holographic diffraction grating. The sensor chip was fabricated by dry etching of the finely spaced (d = 428 nm) diffraction grating in a SiOx-doped diamond-like carbon (DLC) film. It is shown that, employing the fabricated sensor chip and the proposed method of data analysis, one can inspect the kinetics of processes in liquids occurring in the vicinity of the grating surface. The method is based on the spectral composition analysis of polarized polychromatic light reflected from the sub-wavelength diffraction grating. The RI measurement system was tested with different model liquid analytes, including 25 wt.% and 50 wt.% sugar-water solutions, distilled water at 10 °C and 50 °C, and the interaction of the Gram-positive bacterium Bacillus subtilis with the ion-permeable-channel-forming antibiotic gramicidin D and the murolytic enzyme lysozyme. Analysis of the data set of specular reflection spectra enabled us to follow the kinetics of the RI changes in the analyte with millisecond resolution. Detectable changes in the effective RI were not worse than Δn = 10⁻⁴.

  7. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data.

    Science.gov (United States)

    Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin

    2011-03-16

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.

  8. Quine's "Strictly Vegetarian" Analyticity

    NARCIS (Netherlands)

    Decock, L.B.

    2017-01-01

    I analyze Quine’s later writings on analyticity from a linguistic point of view. In Word and Object Quine made room for a “strictly vegetarian” notion of analyticity. In later years, he developed this notion into two more precise notions, which I have coined “stimulus analyticity” and “behaviorist

  9. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas; surveys varied subject areas and reports on individual results of research in the field; and shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field.

  10. Analytical methods for the evaluation of melamine contamination.

    Science.gov (United States)

    Cantor, Stuart L; Gupta, Abhay; Khan, Mansoor A

    2014-02-01

    There is an urgent need for the analysis of melamine in the global pharmaceutical supply chain to detect economically motivated adulteration or unintentional contamination using a simple, nondestructive analytical technique that confirms the extent of adulteration in a shorter time period. In this work, different analytical techniques (thermal analysis, X-ray diffraction, Fourier transform infrared (FT-IR), FT-Raman, and near-infrared (NIR) spectroscopy) were evaluated for their ability to detect a range of melamine levels in gelatin. While FT-IR and FT-Raman provided qualitative assessment of melamine contamination or adulteration, powder X-ray diffraction and NIR were able to detect and quantify the presence of melamine at levels as low as 1.0% w/w. Multivariate analysis of the NIR data yielded the most accurate model when three principal components were used. Data were pretreated using standard normal variate transformation to remove multiplicative interferences of scatter and particle size. The model had a root-mean-square error of calibration of 2.4 (R² = 0.99) and a root-mean-square error of prediction of 2.5 (R² = 0.96). The value of the paired t test for actual and predicted samples (1%-50% w/w) was 0.448 (p 5), further indicating the robustness of the model. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.
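
    The two chemometric steps named above, standard normal variate (SNV) pretreatment followed by a PLS calibration, are sketched below on synthetic NIR-like spectra. Band positions, scatter effects, and melamine levels are invented for illustration; the sketch is not the study's model.

        # SNV pretreatment followed by PLS calibration (synthetic NIR-like spectra).
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        def snv(spectra):
            """Row-wise standard normal variate transform."""
            return (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

        rng = np.random.default_rng(4)
        levels = rng.uniform(0, 50, 60)                        # % w/w melamine (synthetic)
        wl = np.linspace(1100, 2500, 350)                      # nm
        band = np.exp(-0.5 * ((wl - 1500) / 30) ** 2)          # melamine-like absorption band
        scatter = rng.uniform(0.8, 1.2, (60, 1))               # multiplicative scatter effect
        X = scatter * (1.0 + np.outer(levels / 50, band)) + rng.normal(0, 0.005, (60, 350))

        model = PLSRegression(n_components=3).fit(snv(X), levels)
        rmsec = np.sqrt(np.mean((model.predict(snv(X)).ravel() - levels) ** 2))
        print(f"RMSEC: {rmsec:.2f} % w/w")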

  11. Exploring the Managerial Dilemmas Encountered by Advanced Analytical Equipment Providers in Developing Service-led Growth Strategies

    DEFF Research Database (Denmark)

    Raja, Jawwad; Frandsen, Thomas; Mouritsen, Jan

    2017-01-01

    This paper examines the dilemmas encountered by manufacturers of advanced analytical equipment in developing service-led growth strategies to expand their business in pursuit of more attractive revenue models. It does so by adopting a case-based research approach. The findings detail the challenges faced in providing advanced services to customers’ R & D functions, while simultaneously attempting to scale up these services for a production context. The emergent complexities of operating in multiple arenas in order to explore and exploit technologies in different contexts—along the three trajectories of serviceability, scalability and solutions—with a view to expanding markets and developing solution-based business models, are discussed. It is argued that manufacturers of analytical equipment encounter certain dilemmas, as managing the different trajectories involves different needs...

  12. Analytical X-ray line profile analysis based upon correlated dislocations

    International Nuclear Information System (INIS)

    Rao, S.; Houska, C.R.

    1988-01-01

    Recent advances describing X-ray line profiles analytically, in terms of a minimum number of parameters, are related to a theory based upon correlated dislocations. It is shown that a multiple convolution approach, based upon the Warren-Averbach (W-A) analysis, leads to a form that closely approximates the strain coefficient obtained by Krivoglaz, Martynenko and Ryaboshopka. This connection enables one to determine the dislocation density and the ratio of the correlation range parameter to the mean particle size. These two results are obtained most accurately from previous analytical approaches which make use of a statistical least-squares analysis. The W-A Fourier-series approach provides redundant information and does not focus on the critical parameters that relate to dislocation theory. Results so far are limited to b.c.c. materials. Results for cold-worked W, Mo, Nb, Cr and V are compared with highly imperfect sputtered films of Mo. A major difference is relatable to higher correlation of dislocations in cold-worked metals than is found in sputtered films deposited at low temperatures. However, in each case, the dislocation density is high. (orig.)

  13. Large-scale retrieval for medical image analytics: A comprehensive review.

    Science.gov (United States)

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, where huge amounts of medical images were produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at a large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Heavy element stable isotope ratios. Analytical approaches and applications

    International Nuclear Information System (INIS)

    Tanimizu, Masaharu; Sohrin, Yoshiki; Hirata, Takafumi

    2013-01-01

    Continuous developments in inorganic mass spectrometry techniques, including a combination of an inductively coupled plasma ion source and a magnetic sector-based mass spectrometer equipped with a multiple-collector array, have revolutionized the precision of isotope ratio measurements, and applications of inorganic mass spectrometry for biochemistry, geochemistry, and marine chemistry are beginning to appear on the horizon. A series of pioneering studies has revealed that natural stable isotope fractionations of many elements heavier than S (e.g., Fe, Cu, Zn, Sr, Ce, Nd, Mo, Cd, W, Tl, and U) are common on Earth, and it is now widely recognized that most physicochemical reactions or biochemical processes induce mass-dependent isotope fractionation. The variations in isotope ratios of the heavy elements can provide new insights into past and present biochemical and geochemical processes. To achieve this, the analytical community is actively solving problems such as spectral interference, mass discrimination drift, chemical separation and purification, and reduction of the contamination of analytes. This article describes data calibration and standardization protocols to allow interlaboratory comparisons or to maintain traceability of data, and basic principles of isotope fractionation in nature, together with high-selectivity and high-yield chemical separation and purification techniques for stable isotope studies.

  15. Development of an asymmetric multiple-position neutron source (AMPNS) method to monitor the criticality of a degraded reactor core

    International Nuclear Information System (INIS)

    Kim, S.S.; Levine, S.H.

    1985-01-01

    An analytical/experimental method has been developed to monitor the subcritical reactivity and unfold the k∞ distribution of a degraded reactor core. The method uses several fixed neutron detectors and a Cf-252 neutron source placed sequentially in multiple positions in the core. Therefore, it is called the Asymmetric Multiple Position Neutron Source (AMPNS) method. The AMPNS method employs nucleonic codes to analyze the neutron multiplication of a Cf-252 neutron source. An optimization program, GPM, is utilized to unfold the k∞ distribution of the degraded core, in which the desired performance measure minimizes the error between the calculated and the measured count rates of the degraded reactor core. The analytical/experimental approach is validated by performing experiments using the Penn State Breazeale TRIGA Reactor (PSBR). A significant result of this study is that it provides a method to monitor the criticality of a damaged core during the recovery period.

  16. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    Assessment of total uncertainty of analytical methods for the measurements of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty will contribute to the total uncertainty. Particularly, in segmental hair analysis pre-analytical variations associated with the sampling and segmentation may be significant factors in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) with focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV≤20%) across a wide linear concentration range from 0.025 to 25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from the genuine duplicate measurements of two bundles of hair collected from each subject after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was 3-7-fold larger than the analytical variation (7-13%) and hence the dominant component in the total variation (29-70%). The present study demonstrated the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95%-uncertainty interval (±2CVT). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
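
    Assuming the pre-analytical and analytical components are independent, a standard way to combine them (not necessarily the exact computation used by the authors) is in quadrature, in LaTeX notation:

        CV_T = \sqrt{CV_{\mathrm{pre}}^{2} + CV_{a}^{2}}, \qquad
        \text{95\% interval} \approx \pm 2\, CV_T

    With, for example, CV_pre = 40% and CV_a = 10%, CV_T ≈ 41%, which illustrates why the pre-analytical term dominates the total variation reported above.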

  17. Paper-Based Analytical Device for Zinc Ion Quantification in Water Samples with Power-Free Analyte Concentration

    Directory of Open Access Journals (Sweden)

    Hiroko Kudo

    2017-04-01

    Full Text Available Insufficient sensitivity is a general issue of colorimetric paper-based analytical devices (PADs) for the detection of trace analytes, such as metal ions, in environmental water. This paper demonstrates the colorimetric detection of zinc ions (Zn2+) on a paper-based analytical device with an integrated analyte concentration system. Concentration of Zn2+ ions from an enlarged sample volume (1 mL) has been achieved with the aid of a colorimetric Zn2+ indicator (Zincon) electrostatically immobilized onto a filter paper substrate, in combination with highly water-absorbent materials. Analyte concentration as well as sample pretreatment, including pH adjustment and interferent masking, has been elaborated. The resulting device enables colorimetric quantification of Zn2+ in environmental water samples (tap water, river water) from a single sample application. The achieved detection limit of 0.53 μM is a significant improvement over that of a commercial colorimetric Zn2+ test paper (9.7 μM), demonstrating the efficiency of the developed analyte concentration system, which does not require any equipment.

  18. Analytic structure of the wave function for a hydrogen atom in an analytic potential

    International Nuclear Information System (INIS)

    Hill, R.N.

    1984-01-01

    The rate of convergence of an approximate method for solving Schroedinger's equation depends on the ability of the approximating sequence to mimic the analytic structure of the unknown exact wave function. Thus a knowledge of the analytic structure of the wave function can be of great value when approximation schemes are designed. Consider the Schroedinger equation [-(1/2)∇² - 1/r + V(r)]Ψ(r) = EΨ(r) for a hydrogen atom in a potential V(r). The general theory of elliptic partial differential equations implies that Ψ is analytic at regular points, but no general theory is available at singular points. The present paper investigates the Coulomb singular point at r = 0 and shows that, if V(r) = V₁(x, y, z) + rV₂(x, y, z), where V₁ and V₂ are analytic functions of x, y, z at x = y = z = 0, then the wave function has the form Ψ(r) = Ψ₁(x, y, z) + rΨ₂(x, y, z), where Ψ₁ and Ψ₂ are analytic functions of x, y, z at x = y = z = 0.

  19. Seamless Digital Environment - Plan for Data Analytics Use Case Study

    International Nuclear Information System (INIS)

    Oxstrand, Johanna Helene; Bly, Aaron Douglas

    2016-01-01

    The U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program initiated research into what is needed to provide a roadmap or model for nuclear power plants to reference when building an architecture that can support the growing data supply and demand flowing through their networks. The Digital Architecture project report Digital Architecture Planning Model (Oxstrand et al., 2016) discusses what to consider when building an architecture to support the increasing data needs and demands throughout the plant. Once the plant is able to support the data demands, it still needs to be able to provide the data in an easy, quick, and reliable manner. A common method is to create a "one-stop shop" application that users can go to for all the data they need. This leads to the need for a Seamless Digital Environment (SDE) to integrate all the "siloed" data. An SDE is the desired perception that should be presented to users by gathering the data from any data source (e.g., legacy applications and work management systems) without effort by the user. The goal for FY16 was to complete a feasibility study on data mining and analytics for employing information from computer-based-procedure-enabled technologies in developing improved business analytics. The research team collaborated with multiple organizations to identify use cases or scenarios that could be beneficial to investigate in a feasibility study. Many interesting potential use cases were identified throughout the FY16 activity. Unfortunately, due to factors outside the research team's control, none of the studies were initiated this year. However, the insights gained and the relationships built with both PVNGS and NextAxiom will be valuable when moving forward with future research. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting, it was identified that it would be very beneficial to the industry to

  20. Characterization of simultaneous heat and mass transfer phenomena for water vapour condensation on a solid surface in an abiotic environment--application to bioprocesses.

    Science.gov (United States)

    Tiwari, Akhilesh; Kondjoyan, Alain; Fontaine, Jean-Pierre

    2012-07-01

    The phenomenon of heat and mass transfer by condensation of water vapour from humid air involves several key concepts in aerobic bioreactors. The high performance of bioreactors results from optimised interactions between biological processes and multiphase heat and mass transfer. Indeed, in various processes such as submerged fermenters and solid-state fermenters, gas/liquid transfer needs to be well controlled, as it is involved at the microorganism interface and in the control of the global process. For the theoretical prediction of such phenomena, mathematical models require heat and mass transfer coefficients. To date, very few data have been validated concerning mass transfer coefficients from humid air inflows relevant to these bioprocesses. Our study focused on the condensation process of water vapour; we developed an experimental set-up and protocol to study the velocity profiles and the mass flux on a small horizontal flat plate under controlled environmental conditions. A closed-circuit wind tunnel facility was used to control the temperature, hygrometry, and hydrodynamics of the flow. The temperature of the active surface was controlled and kept isothermal below the dew point, by means of thermoelectricity, to induce condensation. The experiments were performed at ambient temperature, for relative humidities between 35% and 65%, and for a flow velocity of 1.0 m s⁻¹. The obtained data are analysed and compared with available theoretical calculations of the condensation mass flux.
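
    As a back-of-the-envelope companion to such measurements, a film-model estimate of the condensation mass flux can be written as the product of a mass transfer coefficient and the vapour-density difference between the bulk humid air and the saturated layer at the cold surface. The sketch below uses this relation with an assumed transfer coefficient and a Magnus-type saturation formula; none of the numbers come from the study.

```python
# Film-model estimate of the condensation mass flux, m'' = h_m * (rho_v_bulk - rho_v_surface).
# The transfer coefficient and conditions below are assumptions, not data from the study.
import math

def saturation_vapour_density(T_celsius: float) -> float:
    """Approximate saturation vapour density (kg/m^3) from a Magnus-type formula and the ideal gas law."""
    p_sat = 610.94 * math.exp(17.625 * T_celsius / (T_celsius + 243.04))  # Pa
    return p_sat * 0.018015 / (8.314 * (T_celsius + 273.15))

h_m = 0.01                                               # assumed mass transfer coefficient, m/s
rho_bulk = 0.55 * saturation_vapour_density(22.0)        # humid air at 22 degC, 55% RH
rho_surface = saturation_vapour_density(8.0)             # plate held below the dew point (~12.5 degC)
flux = h_m * (rho_bulk - rho_surface)                    # kg m^-2 s^-1
print(f"estimated condensation mass flux: {flux:.2e} kg/m^2/s")
```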

  1. Green analytical chemistry introduction to chloropropanols determination at no economic and analytical performance costs?

    Science.gov (United States)

    Jędrkiewicz, Renata; Orłowski, Aleksander; Namieśnik, Jacek; Tobiszewski, Marek

    2016-01-15

    In this study we rank analytical procedures for 3-monochloropropane-1,2-diol determination in soy sauces using the PROMETHEE method. Multicriteria decision analysis was performed for three different scenarios - metrological, economic, and environmental - by applying different weights to the decision-making criteria. All three scenarios indicate the capillary electrophoresis-based procedure as the most preferable, although the details of the ranking results differ between the scenarios. A second run of rankings was done for scenarios that include metrological, economic, or environmental criteria only, neglecting the others. These results show that the green analytical chemistry-based selection correlates with the economic one, while there is no correlation with the metrological one. This implies that green analytical chemistry can be brought into laboratories at no cost to analytical performance, and that it is even supported by economic reasons. Copyright © 2015 Elsevier B.V. All rights reserved.
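
    For readers unfamiliar with PROMETHEE, the ranking boils down to weighted pairwise preference comparisons on each criterion, aggregated into positive, negative, and net outranking flows. The sketch below implements a PROMETHEE II-style net flow with the simple "usual" (step) preference function; the alternatives, criterion scores, and weights are invented placeholders, not the procedures or weighting scenarios evaluated in the study.

```python
# Hedged sketch of a PROMETHEE II-style net outranking flow with the "usual"
# (step) preference function; alternatives, criterion scores and weights below
# are placeholders, not the values used in the study.
import numpy as np

alternatives = ["GC-MS", "LC-MS/MS", "CE"]
# rows: alternatives, columns: criteria (LOD, cost, solvent use); all minimised here
scores = np.array([[0.010, 150.0, 30.0],
                   [0.005, 300.0, 20.0],
                   [0.008, 100.0,  5.0]])
weights = np.array([0.4, 0.3, 0.3])          # one weighting scenario

n = len(alternatives)
pref = np.zeros((n, n))
for a in range(n):
    for b in range(n):
        if a == b:
            continue
        # usual preference function: 1 if a is strictly better (smaller) than b on a criterion
        better = (scores[a] < scores[b]).astype(float)
        pref[a, b] = np.dot(weights, better)

phi_plus = pref.sum(axis=1) / (n - 1)        # positive outranking flow
phi_minus = pref.sum(axis=0) / (n - 1)       # negative outranking flow
net_flow = phi_plus - phi_minus

for name, phi in sorted(zip(alternatives, net_flow), key=lambda t: -t[1]):
    print(f"{name}: net flow = {phi:+.2f}")
```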

  2. Learning Analytics Considered Harmful

    Science.gov (United States)

    Dringus, Laurie P.

    2012-01-01

    This essay is written to present a prospective stance on how learning analytics, as a core evaluative approach, must help instructors uncover the important trends and evidence of quality learner data in the online course. A critique is presented of strategic and tactical issues of learning analytics. The approach to the critique is taken through…

  3. Multiple myeloma presenting with a maxillary lesion as the first sign

    Energy Technology Data Exchange (ETDEWEB)

    Ramaiah, Kiran Kumar Kotagudda; Joshi, Vajendra; Thayi, Shilpa Ravishankar; Sathyanarayana, Pathalapate; Patil, Prashant [Dept. of Oral Medicine and Radiology, Navodaya Dental College and Hospital, Raichur (India); Ahmed, Zaheer [Dept. of Public Health Dentistry, Navodaya Dental College and Hospital, Raichur (India)

    2015-03-15

    Multiple myeloma is a clonal neoplastic proliferation of terminally differentiated B-lymphocytes involving the skeletal system in a multifocal fashion. Its oral manifestations are less common in the maxilla than in the mandible due to the lower amount of hemopoietic bone marrow in the maxilla. We report the case of a 50-year-old man who presented with a mass in the left maxillary alveolar region with tooth mobility. The mass had become enlarged after the teeth were extracted 15 days previously. Radiographs demonstrated multiple punched-out radiolucent lesions in the skull and pelvic region. Computed tomography images showed a soft tissue density mass in the left maxilla, eroding the floor and walls of the maxillary sinus. Although several analytical techniques were used to characterize the lesion, it was finally confirmed as multiple myeloma through immunohistochemistry.

  4. Rorty, Pragmatism, and Analytic Philosophy

    Directory of Open Access Journals (Sweden)

    Cheryl Misak

    2013-07-01

    Full Text Available One of Richard Rorty's legacies is to have put a Jamesian version of pragmatism on the contemporary philosophical map. Part of his argument has been that pragmatism and analytic philosophy are set against each other, with pragmatism almost having been killed off by the reigning analytic philosophy. The argument of this paper is that there is a better and more interesting reading of both the history of pragmatism and the history of analytic philosophy.

  5. Capillary gel electrophoresis-coupled aptamer enzymatic cleavage protection strategy for the simultaneous detection of multiple small analytes.

    Science.gov (United States)

    Perrier, Sandrine; Zhu, Zhenyu; Fiore, Emmanuelle; Ravelet, Corinne; Guieu, Valérie; Peyrin, Eric

    2014-05-06

    This novel sensing strategy for multiple small analytes results from combining the target-induced aptamer enzymatic protection approach with the CGE-LIF (capillary gel electrophoresis with laser-induced fluorescence) technique. The assay principle is based on analysis of the phosphodiesterase I (PDE I)-mediated size variation of a fluorescein-labeled aptamer (FApt), PDE I being the enzyme that catalyzes the removal of nucleotides from DNA in the 3' to 5' direction. In the absence of the target, the unfolded aptamer was enzymatically cleaved into short DNA fragments. Upon target binding, the DNA substrate was partially protected against enzymatic hydrolysis, and the amount of bound aptamer remaining after the exonuclease reaction was proportional to the concentration of the target. The CGE technique, used to separate the FApt species from the digested DNA products, permitted the quantification of adenosine (A), ochratoxin A (O), and tyrosinamide (T) under the same optimized enzymatic conditions. This assay strategy was subsequently applied to the simultaneous detection of A, O, and T in a single capillary under buffered conditions, using corresponding FApt probes of different lengths (23, 36, and 49 nucleotides, respectively). Additionally, the detection of these three small molecules was successfully achieved in a complex medium (diluted, heat-treated human serum) with good recovery. It is worth noting that the multiplexed analysis was accomplished for targets with different charge states by using aptamers possessing various structural features. This sensing platform constitutes a rationalized and reliable approach with expanded potential for the high-throughput determination of small analytes in a single capillary.

  6. World, Time And Anxiety. Heidegger’s Existential Analytic And Psychiatry

    Directory of Open Access Journals (Sweden)

    Brencio Francesca

    2014-12-01

    Full Text Available Martin Heidegger was one of the most influential, but also most criticized, philosophers of the XX century. With Being and Time (1927) he sets his existential analytic apart from psychology as well as from anthropology and the other human sciences that deny the ontological foundation, overcoming Cartesian dualism in search of the ontological unity of an articulated multiplicity, which is what the human being is. Heidegger's Dasein analytic defines the fundamental structures of the human being, such as being-in-the-world, a unitary structure that discloses the worldhood of the world; the modes of being (Seinsweisen), such as fear (Furcht) and anxiety (Angst); and the relationship between existence and time. In his existential analytic, anxiety is one of the fundamental moods (Grundbefindlichkeit) and it plays a pivotal role in the relationship of Dasein with time and the world. The paper first focuses on the modes of being, underlining the importance of anxiety for the constitution of the human being; second, it shows the relationship between anxiety and the world, and between anxiety and time: rejecting both the Aristotelian description of time as a sequence of moments, which informs our common understanding of time, and Augustine's mental account of inner time, Heidegger considers temporality from a transcendental point of view. Temporality is ek-static; it is a process through which the human being comes toward and back to itself, letting itself encounter the world and its entities. The transcendental interpretation of time provided by Heidegger may make an important contribution to psychopathology.

  7. Analytical calculation of dE/dx cluster-charge loss due to threshold effects

    International Nuclear Information System (INIS)

    Brady, F.P.; Dunn, J.

    1997-01-01

    This letter presents a simple analytical approximation which allows one to estimate the effect of ADC threshold on the measured cluster-charge size as used for dE/dx determinations. The idea is to gain some intuitive understanding of the cluster-charge loss and not to replace more accurate simulations. The method is applied to the multiple sampling measurements of energy loss in the main time projection chambers (TPCs) of the NA49 experiment at CERN SPS. The calculations are in reasonable agreement with data. (orig.)
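
    To make the idea concrete, consider cluster samples whose individual charges follow an exponential spectrum: discarding every sample that falls below the ADC threshold removes a calculable fraction of the mean charge. The sketch below evaluates that fraction analytically; the exponential assumption and the threshold values are illustrative and are not the approximation used for the NA49 TPCs.

```python
# Hedged sketch (not the NA49 formula): estimate the fraction of cluster charge
# lost when individual samples below an ADC threshold are discarded, assuming
# the single-sample charge follows an exponential distribution with mean mu.
import math

def retained_charge_fraction(threshold: float, mu: float) -> float:
    """Fraction of the mean charge kept above the threshold for an exponential spectrum."""
    a = threshold / mu
    return math.exp(-a) * (1.0 + a)

for thr in (0.0, 0.2, 0.5, 1.0):             # thresholds in units of the mean sample charge
    kept = retained_charge_fraction(thr, mu=1.0)
    print(f"threshold = {thr:.1f} * mean: charge kept = {kept:.2%}, lost = {1 - kept:.2%}")
```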

  8. Recent analytical applications of magnetic nanoparticles

    Directory of Open Access Journals (Sweden)

    Mohammad Faraji

    2016-07-01

    Full Text Available Analytical chemistry, like other areas of science, has experienced a major change due to the needs and opportunities provided by analytical nanoscience and nanotechnology. Nanotechnology is increasingly proving to be a powerful ally of analytical chemistry in achieving its objectives and in simplifying analytical processes. Moreover, the information needs arising from the growing nanotechnological activity are opening an exciting new field of action for analytical chemists. Magnetic nanoparticles have been used in various fields owing to their unique properties, including a large specific surface area and simple separation with magnetic fields. For analytical applications, they have been used mainly in sample preparation techniques (magnetic solid-phase extraction with different advanced functional groups such as layered double hydroxides, β-cyclodextrin, carbon nanotubes, graphene, polymers, and octadecylsilane, and its automation), in microextraction techniques, in enantioseparation, and in chemosensors. This review summarizes the basic principles and achievements of magnetic nanoparticles in sample preparation techniques, enantioseparation, and chemosensors. Some selected articles published recently (2010-2016) are also reviewed and discussed.

  9. Meaning Making through Multiple Modalities in a Biology Classroom: A Multimodal Semiotics Discourse Analysis

    Science.gov (United States)

    Jaipal, Kamini

    2010-01-01

    The teaching of science is a complex process, involving the use of multiple modalities. This paper illustrates the potential of a multimodal semiotics discourse analysis framework to illuminate meaning-making possibilities during the teaching of a science concept. A multimodal semiotics analytical framework is developed and used to (1) analyze the…

  10. Conclusions about children's reporting accuracy for energy and macronutrients over multiple interviews depend on the analytic approach for comparing reported information to reference information.

    Science.gov (United States)

    Baxter, Suzanne Domel; Smith, Albert F; Hardin, James W; Nichols, Michele D

    2007-04-01

    Validation-study data are used to illustrate that conclusions about children's reporting accuracy for energy and macronutrients over multiple interviews (i.e., over time) depend on the analytic approach used to compare reported and reference information: conventional, which disregards the accuracy of reported items and amounts, or reporting-error-sensitive, which classifies reported items as matches (eaten) or intrusions (not eaten) and amounts as corresponding or overreported. Children were observed eating school meals on 1 day (n=12), or on 2 (n=13) or 3 (n=79) nonconsecutive days separated by ≥25 days, and were interviewed in the morning after each observation day about intake the previous day. Reference (observed) and reported information were transformed to energy and macronutrients (i.e., protein, carbohydrate, and fat) and compared. Outcome measures, calculated for energy and each macronutrient, were report rates (reported/reference), correspondence rates (genuine accuracy measures), and inflation ratios (error measures); mixed-model analyses were used. With the conventional approach, report rates for energy and macronutrients did not vary systematically over interviews (all four P values >0.61). With the reporting-error-sensitive approach, correspondence rates for energy and macronutrients increased over interviews, indicating that reporting accuracy improved over time, but the conventional approach masked these improvements and overestimated accuracy. The reporting-error-sensitive approach is recommended when analyzing data from validation studies of dietary reporting accuracy for energy and macronutrients.
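
    The distinction between the two approaches can be illustrated with a toy example: the conventional report rate credits everything the child reported, regardless of whether it was eaten, while the reporting-error-sensitive correspondence rate credits only amounts from matched (actually eaten) items, with anything beyond the observed amount treated as overreported. The items and kilocalorie values below are invented for illustration only.

```python
# Toy illustration (invented numbers) of the two analytic approaches described above:
# the conventional report rate credits all reported energy, whereas a
# reporting-error-sensitive correspondence rate only credits energy from items
# that were actually eaten (matches), capped at the observed amount.
observed = {"milk": 120, "pizza": 450, "apple": 80}           # reference kcal per item
reported = {"milk": 130, "pizza": 300, "cookie": 200}          # child-reported kcal per item

reference_total = sum(observed.values())
reported_total = sum(reported.values())
report_rate = reported_total / reference_total                 # conventional approach

corresponding = sum(min(reported[item], observed[item])
                    for item in reported if item in observed)  # matched items only
correspondence_rate = corresponding / reference_total          # error-sensitive approach

intrusions = {item: kcal for item, kcal in reported.items() if item not in observed}
print(f"report rate:         {report_rate:.2f}")
print(f"correspondence rate: {correspondence_rate:.2f}")
print(f"intrusion energy:    {sum(intrusions.values())} kcal")
```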

  11. Analytical mass spectrometry. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    1990-12-31

    This 43rd Annual Summer Symposium on Analytical Chemistry was held July 24-27, 1990, at Oak Ridge, TN, and contained sessions on the following topics: Fundamentals of Analytical Mass Spectrometry (MS), MS in the National Laboratories, Lasers and Fourier Transform Methods, Future of MS, New Ionization and LC/MS Methods, and an extra session. (WET)

  12. Multiple electromechanically-induced-transparency windows and Fano resonances in hybrid nano-electro-optomechanics

    Science.gov (United States)

    Ullah, Kamran; Jing, Hui; Saif, Farhan

    2018-03-01

    We show multiple electromechanically-induced-transparency (EMIT) windows in a hybrid nano-electro-optomechanical system in the presence of two-level atoms coupled to a single-mode cavity field. The multiple-EMIT-window profile can be observed by controlling the atom-field coupling as well as the Coulomb coupling between the two charged mechanical resonators. We derive an analytical expression for the multiple-EMIT-window profile and describe the splitting of the multiple EMIT windows as a function of the optomechanical coupling, the atom-field coupling, and the Coulomb coupling. In particular, we discuss the robustness of the system against the cavity decay rate and compare the results for identical mechanical resonators with those for different mechanical resonators. We further show how the hybrid, coupled nano-electro-optomechanical system can lead to the splitting of multiple Fano resonances (MFR). The Fano resonances are very sensitive to the decay terms of such systems, i.e., of the atoms, the cavity, and the mechanical resonators.

  13. Advanced qualification of pharmaceutical excipient suppliers by multiple analytics and multivariate analysis combined.

    Science.gov (United States)

    Hertrampf, A; Müller, H; Menezes, J C; Herdling, T

    2015-11-10

    Pharmaceutical excipients have different functions within a drug formulation; consequently, they can influence the manufacturability and/or performance of medicinal products. Critical-to-quality attributes should therefore be kept constant. Sometimes it may be necessary to qualify a second supplier, but its product will not be completely equal to that of the first supplier. To minimize the risk of not detecting small non-similarities between suppliers, and to detect lot-to-lot variability for each supplier, multivariate data analysis (MVA) can be used as a more powerful alternative to classical quality control, which monitors one parameter at a time. Such an approach can support the requirements of a new guideline by the European Parliament and Council (2015/C-95/02) demanding appropriate quality control strategies for excipients based on their criticality and on supplier risks in ensuring quality, safety, and function. This study compares calcium hydrogen phosphate from two suppliers. It can be assumed that the two suppliers use different manufacturing processes; therefore, possible chemical and physical differences were investigated using Raman spectroscopy, laser diffraction, and X-ray powder diffraction. MVA was then used to extract the relevant information from each analytical technique. The two CaHPO4 products could be discriminated by supplier. The knowledge gained allowed an enhanced strategy for second-supplier qualification to be specified. Copyright © 2015 Elsevier B.V. All rights reserved.
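
    As a rough illustration of how MVA can separate suppliers from spectral fingerprints, the sketch below runs a principal component analysis on simulated lot spectra with a small systematic offset between two suppliers. The data, the preprocessing, and the choice of PCA are assumptions made for illustration; the study's actual treatment of the Raman, laser diffraction, and XRPD data is not reproduced here.

```python
# Hedged sketch of supplier discrimination by multivariate analysis (PCA),
# using simulated "spectra"; not the study's Raman / laser diffraction / XRPD data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
wavenumbers = 200
base = np.sin(np.linspace(0, 6 * np.pi, wavenumbers))

# Simulate 20 lots per supplier, with a small systematic spectral offset for supplier B.
supplier_a = base + 0.05 * rng.standard_normal((20, wavenumbers))
supplier_b = base + 0.08 + 0.05 * rng.standard_normal((20, wavenumbers))
X = np.vstack([supplier_a, supplier_b])
labels = ["A"] * 20 + ["B"] * 20

scores = PCA(n_components=2).fit_transform(X - X.mean(axis=0))
for label in ("A", "B"):
    idx = [i for i, l in enumerate(labels) if l == label]
    print(f"supplier {label}: mean PC1 score = {scores[idx, 0].mean():+.2f}")
```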

  14. Langevin equations with multiplicative noise: application to domain growth

    International Nuclear Information System (INIS)

    Sancho, J.M.; Hernandez-Machado, A.; Ramirez-Piscina, L.; Lacasta, A.M.

    1993-01-01

    Langevin equations of Ginzburg-Landau form with multiplicative noise are proposed to study the effects of fluctuations on domain growth. These equations are derived using a coarse-graining methodology. A Cahn-Hilliard-Cook linear stability analysis predicts some effects in the transitory regime. We also derive numerical algorithms for the computer simulation of these equations. The numerical results corroborate the analytical predictions of the linear analysis. We also present simulation results for spinodal decomposition at large times. (author). 28 refs, 2 figs
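
    A minimal numerical sketch of this kind of dynamics is given below: a one-dimensional, non-conserved (Model A-like) Ginzburg-Landau field integrated with an explicit Euler-Maruyama step and a state-dependent noise amplitude, in the Ito interpretation. The choice of noise function, parameters, and discretization are assumptions for illustration and do not reproduce the algorithms derived in the paper.

```python
# Minimal sketch (assumptions: 1D Model-A-like dynamics, Ito interpretation,
# explicit Euler-Maruyama step) of a Ginzburg-Landau Langevin equation with
# multiplicative noise; illustrative only, not the paper's derived algorithm.
import numpy as np

rng = np.random.default_rng(1)
N, dx, dt, eps = 128, 1.0, 0.01, 0.1
phi = 0.01 * rng.standard_normal(N)            # small random initial field

def laplacian(f):
    return (np.roll(f, 1) + np.roll(f, -1) - 2 * f) / dx**2   # periodic boundaries

for step in range(20000):
    drift = phi - phi**3 + laplacian(phi)                      # -dF/dphi for the GL free energy
    g = np.sqrt(1.0 - np.clip(phi, -1, 1)**2 + 1e-12)          # multiplicative noise amplitude g(phi)
    noise = rng.standard_normal(N)
    phi += dt * drift + np.sqrt(2 * eps * dt) * g * noise      # Euler-Maruyama update

print(f"mean |phi| after relaxation: {np.abs(phi).mean():.2f}")  # domains settle near +/-1
```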

  15. Is a Nuclear Deal with Iran Possible? An Analytical Framework for the Iran Nuclear Negotiations

    OpenAIRE

    Sebenius, James Kimble; Singh, Michael K.

    2012-01-01

    Varied diplomatic approaches by multiple negotiators over several years have failed to conclude a nuclear deal with Iran. Mutual hostility, misperception, and flawed diplomacy may be responsible. Yet, more fundamentally, no mutually acceptable deal may exist. To assess this possibility, a "negotiation analytic" framework conceptually disentangles two issues: 1) whether a feasible deal exists and 2) how to design the most promising process to achieve one. Focusing on whether a "zone of possibl...

  16. Analytic American Option Pricing and Applications

    NARCIS (Netherlands)

    Sbuelz, A.

    2003-01-01

    I use a convenient value breakdown in order to obtain analytic solutions for finite-maturity American option prices. Such a barrier-option-based breakdown yields an analytic lower bound for the American option price, which is as price-tight as the Barone-Adesi and Whaley (1987) analytic value proxy
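
    The barrier-option-based bound itself is specific to the paper, but a simpler, standard analytic lower bound is easy to state: an American put is worth at least the larger of its European Black-Scholes value and its immediate exercise value. The sketch below computes that weaker bound for illustrative parameters; it is not the bound derived in the paper.

```python
# Not the barrier-option-based bound from the paper: a simpler, standard analytic
# lower bound for an American put is max(European Black-Scholes price, immediate
# exercise value). The parameters below are illustrative.
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def european_put(S, K, r, sigma, T):
    """Black-Scholes price of a European put (no dividends)."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return K * exp(-r * T) * norm_cdf(-d2) - S * norm_cdf(-d1)

S, K, r, sigma, T = 90.0, 100.0, 0.05, 0.25, 1.0
lower_bound = max(european_put(S, K, r, sigma, T), K - S)
print(f"analytic lower bound for the American put: {lower_bound:.2f}")
```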

  17. 7 CFR 94.103 - Analytical methods.

    Science.gov (United States)

    2010-01-01

    Title 7 (Agriculture), Section 94.103 - POULTRY AND EGG PRODUCTS, Voluntary Analyses of Egg Products, § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg...

  18. 7 CFR 94.303 - Analytical methods.

    Science.gov (United States)

    2010-01-01

    Title 7 (Agriculture), Section 94.303 - POULTRY AND EGG PRODUCTS, Processed Poultry Products, § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...

  19. Visual Analytics of Complex Genomics Data to Guide Effective Treatment Decisions

    Directory of Open Access Journals (Sweden)

    Quang Vinh Nguyen

    2016-09-01

    Full Text Available In cancer biology, genomics represents a big-data problem that needs accurate visual data processing and analytics. The human genome is very complex, with thousands of genes that contain information about individual patients and the biological mechanisms of their disease. Therefore, when building a framework for personalised treatment, the complexity of the genome must be captured in meaningful and actionable ways. This paper presents a novel visual analytics framework that enables effective analysis of large and complex genomics data. By providing interactive visualisations, from an overview of the entire patient cohort down to a detailed view of individual genes, our work can potentially guide effective treatment decisions for childhood cancer patients. The framework consists of multiple components enabling complete analytics in support of personalised medicine, including similarity space construction, automated analysis, visualisation, gene-to-gene comparison, and user-centric interaction and exploration based on feature selection. In addition to the traditional way of visualising data, we utilise the Unity3D platform to develop a smooth and interactive visual presentation of the information. This aims to provide better rendering, image quality, ergonomics, and user experience to non-specialists or young users who are familiar with 3D gaming environments and interfaces. We illustrate the effectiveness of our approach through case studies with datasets from childhood cancers, from B-cell Acute Lymphoblastic Leukaemia (ALL) and Rhabdomyosarcoma (RMS) patients, showing how the framework can guide effective treatment decisions in these cohorts.
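
    One of the named components, similarity space construction, can be pictured as embedding each patient's high-dimensional genomic profile into a low-dimensional space in which similar patients sit close together, ready for interactive visualisation. The sketch below does this with a correlation-distance matrix and classical multidimensional scaling on random stand-in data; the library choice (scikit-learn) and all numbers are assumptions, not the framework's actual pipeline.

```python
# Hedged sketch of a "similarity space construction" step: embed simulated
# patient profiles into 2D from a pairwise-distance matrix so that similar
# patients land close together; random stand-ins replace real genomic profiles.
import numpy as np
from sklearn.manifold import MDS
from sklearn.metrics import pairwise_distances

rng = np.random.default_rng(42)
profiles = rng.standard_normal((30, 500))      # 30 patients x 500 gene-level features
distances = pairwise_distances(profiles, metric="correlation")

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(distances)
print("2-D similarity-space coordinates for the first 3 patients:")
print(np.round(coords[:3], 2))
```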

  20. Deriving Earth Science Data Analytics Requirements

    Science.gov (United States)

    Kempler, Steven J.

    2015-01-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end-goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide an early analysis of the data analytics tool and technique requirements that would support specific ESDA goal types. Representative existing data analytics tools and techniques relevant to ESDA will also be addressed.