WorldWideScience

Sample records for analytical tool development

  1. Cryogenic Propellant Feed System Analytical Tool Development

    Science.gov (United States)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

    The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its numerical performance and its ability to directly access the real-fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front end was implemented because it offers convenient portability of PFSAT among a wide variety of potential users and supports a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control and orbital maneuvering systems, but it may also be used to predict heat leak into ground-based transfer lines. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, it will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.
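The kind of parametric heat-leak relation such a tool evaluates can be illustrated with a toy model; everything below (line diameter, idealised parallel-shield MLI model, coefficients) is an invented placeholder, not PFSAT's actual physics or data:

```python
import math

def heat_leak_per_metre(t_cold, t_warm, mli_layers, emissivity=0.03):
    """Rough radiative heat leak through an MLI blanket per metre of line (W/m).

    Hypothetical illustration only: an idealised N-shield model with
    placeholder geometry, not NASA's validated correlations.
    """
    sigma = 5.670e-8               # Stefan-Boltzmann constant, W/m^2/K^4
    area_per_m = math.pi * 0.05    # assumed 50 mm line diameter
    e_eff = emissivity / (mli_layers + 1)  # effective emissivity of N parallel shields
    return e_eff * sigma * area_per_m * (t_warm**4 - t_cold**4)

# Liquid-oxygen-like line (90 K) in a 300 K environment, 20-layer blanket
q = heat_leak_per_metre(t_cold=90.0, t_warm=300.0, mli_layers=20)
```

A real feed-line model would sum such terms with conduction through supports, penetrations, and instrumentation leads segment by segment.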

  2. Social Data Analytics Tool (SODATO)

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    This paper presents the Social Data Analytics Tool (SODATO) that is designed, developed and evaluated to collect, store, analyze, and report big social data emanating from the social media engagement of and social media conversations about organizations.

  3. Development of a machine tool selection system using analytic hierarchy process

    OpenAIRE

    Çimren, Emrah; Çatay, Bülent; Budak, Erhan

    2004-01-01

    The selection of appropriate machines is one of the most critical decisions in the design and development of an efficient production environment. In this study, we propose a decision support system for machine tool selection using an effective algorithm, the analytic hierarchy process. In the selection process, we first consider qualitative decision criteria that are related to the machine properties. Reliability and precision analyses may be included in the detailed evaluation procedure. Fur...
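The analytic hierarchy process the study applies can be sketched in a few lines: build a pairwise comparison matrix of the criteria, take its principal eigenvector as the priority weights, and check the consistency ratio. The criteria count and the pairwise judgments below are invented for illustration:

```python
import numpy as np

# Pairwise comparisons of 3 assumed machine-selection criteria on Saaty's
# 1-9 scale (judgments invented for this sketch).
A = np.array([
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 1/2.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                            # priority weights, sum to 1

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)    # consistency index
ri = 0.58                               # Saaty's random index for n = 3
cr = ci / ri                            # CR < 0.1 => judgments acceptably consistent
```

In a full selection system the same procedure is repeated for the alternatives under each criterion, and the weight vectors are combined into an overall ranking.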

  4. Development of computer-based analytical tool for assessing physical protection system

    International Nuclear Information System (INIS)

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to examine likely threat scenarios. Several tools, such as EASI and SAPE, are available for immediate use; however, for our research purposes it is more suitable to have a tool that can be customized and further enhanced. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool can identify the most critical path and quantify the probability of system effectiveness as a performance measure.
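A minimal sketch of such a network approach, under assumed data: the facility graph and per-layer detection probabilities below are invented, and the "most critical path" is taken to be the route maximising the adversary's non-detection probability, found as a shortest path under weights -log(1 - p_detect):

```python
import heapq
from math import exp, log

# Toy facility graph: each edge is a protection layer with an assumed
# detection probability (all values invented for illustration).
edges = {
    "offsite":  [("fence", 0.5), ("gate", 0.9)],
    "fence":    [("building", 0.7)],
    "gate":     [("building", 0.3)],
    "building": [("target", 0.8)],
    "target":   [],
}

def most_critical_path(start, goal):
    """Return (path, probability the adversary evades detection along it)."""
    pq = [(0.0, start, [start])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return path, exp(-cost)         # product of (1 - p_detect) terms
        if node in seen:
            continue
        seen.add(node)
        for nxt, p_detect in edges[node]:
            if nxt not in seen:
                heapq.heappush(pq, (cost - log(1.0 - p_detect), nxt, path + [nxt]))
    return None, 0.0

path, p_evade = most_critical_path("offsite", "target")
# System effectiveness along the weakest path is then 1 - p_evade.
```

A fuller model would also fold in delay times and response-force timing rather than detection alone.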

  5. Development of computer-based analytical tool for assessing physical protection system

    Energy Technology Data Exchange (ETDEWEB)

    Mardhi, Alim, E-mail: alim-m@batan.go.id [National Nuclear Energy Agency Indonesia, (BATAN), PUSPIPTEK area, Building 80, Serpong, Tangerang Selatan, Banten (Indonesia); Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand); Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com [Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand)

    2016-01-22

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to examine likely threat scenarios. Several tools, such as EASI and SAPE, are available for immediate use; however, for our research purposes it is more suitable to have a tool that can be customized and further enhanced. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool can identify the most critical path and quantify the probability of system effectiveness as a performance measure.

  6. Development of computer-based analytical tool for assessing physical protection system

    Science.gov (United States)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to examine likely threat scenarios. Several tools, such as EASI and SAPE, are available for immediate use; however, for our research purposes it is more suitable to have a tool that can be customized and further enhanced. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool can identify the most critical path and quantify the probability of system effectiveness as a performance measure.

  7. On the Design, Development and Use of the Social Data Analytics Tool (SODATO)

    DEFF Research Database (Denmark)

    Hussain, Abid

    This PhD is about the design, development and evaluation of the Social Data Analytics Tool (SODATO) to collect, store, analyze, and report big social data emanating from the social media engagement of and social media conversations about organizations. Situated within the academic domains of Data Science, Computational Social Science and Information Systems, the PhD project addressed two general research questions about the technological architectures and design principles for big social data analytics in an organisational context. The PhD project is grounded in the theory of socio... ...consists of the communicative and linguistic aspects of the social media interaction, such as the topics discussed, keywords mentioned, pronouns used and sentiments expressed. The conceptual model of social data is then used to specify the formal model of social data using the mathematics of set theory.

  8. Analytical Hierarchy Process for Developing a Building Performance-Risk Rating Tool

    Directory of Open Access Journals (Sweden)

    Khalil Natasha

    2016-01-01

    Full Text Available The need to optimize the performance of buildings has increased as a consequence of the expanding supply of facilities in higher education buildings (HEB). Proper performance assessment, as a proactive measure, may help university buildings achieve performance optimization. However, current maintenance programs and performance evaluation in HEB follow a systemic, cyclic process in which maintenance is treated as an operational rather than a strategic issue. Hence, this paper proposes a Building Performance Risk Rating Tool (BPRT) as an improved measure for building performance evaluation that addresses users' health and safety risks. The BPRT is developed from the result of a rating index using the Analytical Hierarchy Process (AHP) method. Twelve facilities management (FM) experts and practitioners were involved in the rating process. The subjective weightings were analysed using the AHP computer software Expert Choice 11. The BPRT was introduced to improve the current performance assessment of HEB by merging the concepts of building performance and risk into a numerical strategic approach.

  9. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    Science.gov (United States)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure that the uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for application to a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining a connection to, and communicating the uncertainty in, the underlying spatial datasets. The VGM outputs a simultaneous visualization of the spatial data analyses and a quantification of the underlying uncertainties, which can be calculated from sample density, sample variance, interpolation error, or uncertainty derived from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate the team's probabilistic VGM approach with ESRI ArcMap. The VGM-Hadoop tool has been built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach to implementing data reduction and topology generation
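The variable-grid idea can be sketched as a quadtree-style refinement driven by sample density: cells stay coarse where samples are sparse, and each cell carries an uncertainty proxy. The thresholds and the 1/sqrt(n) proxy below are illustrative assumptions, not the VGM's published formulation:

```python
import numpy as np

rng = np.random.default_rng(0)
points = rng.random((200, 2))  # sample locations in a unit square (synthetic)

def grid_cells(points, max_depth=3, min_samples=8):
    """Recursively split a cell while it holds enough samples (quadtree-style).

    Returns (x0, y0, size, n_samples, uncertainty_proxy) tuples, where the
    proxy shrinks as 1/sqrt(n), a stand-in for a real variance/error estimate.
    """
    cells = []
    def split(x0, y0, size, depth):
        mask = ((points[:, 0] >= x0) & (points[:, 0] < x0 + size) &
                (points[:, 1] >= y0) & (points[:, 1] < y0 + size))
        n = int(mask.sum())
        if depth < max_depth and n >= min_samples:
            half = size / 2
            for dx in (0.0, half):
                for dy in (0.0, half):
                    split(x0 + dx, y0 + dy, half, depth + 1)
        else:
            cells.append((x0, y0, size, n, 1.0 / np.sqrt(n) if n else np.inf))
    split(0.0, 0.0, 1.0, 0)
    return cells

cells = grid_cells(points)
```

Rendering cell size and the uncertainty proxy together is what lets one map communicate both the interpolated values and their reliability.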

  10. Use of analytic hierarchy process (AHP) as an instrument to develop a solid waste management assessment tool

    OpenAIRE

    Batagarawa, Rabia; Williams, John Barry; Potts, Jonathan Stephenson; Brown, Julia Catherine

    2015-01-01

    The aim of this paper is to evaluate the feasibility of the Analytic Hierarchy Process (AHP) as a data collection instrument in developing a solid waste management assessment tool. AHP is a quantifying tool that provides an effective and precise means of choosing among options in many disciplines, such as waste management, where priority scales measure elements in relative terms. The procedure is performed using Expert Choice software. A structured questionnaire survey was employed to obtain data...

  11. Development of integrated analytical tools for level-2 PSA of LMFBR

    International Nuclear Information System (INIS)

    As with light water reactors, JNES (Japan Nuclear Energy Safety Organization) has worked to prepare PSA analysis tools for the liquid-metal cooled fast breeder reactor (LMFBR) to support safety evaluation from the regulatory side. The developed tools consist of a group of safety analysis computer codes and an analysis method called PRD (Phenomenological Relationship Diagram) for logically quantifying the probability distribution at the branching points in event trees. So far the tools have been used to evaluate the effectiveness of the accident management measures of Monju proposed by the owner, and they are under further development to describe event progression more realistically. One objective of this improvement is to construct databases for the Emergency Response Support System (ERSS) for Monju by conducting many application analyses of conceivable scenarios after initiating events. The present paper introduces the function of each tool in the synthetic analysis system, coupled with the accident scenario, and presents points for future improvement. The phase transitions of LMFBR severe accidents and the role of each analysis tool are shown. In (i) the plant response phase, the temperature of sodium in the primary cooling system begins to rise due to the power-to-flow mismatch. In cases of gradual temperature increase, such as PLOHS (protected loss-of-heat sink), the sodium boundary will fail by high-temperature creep. If boundary failure does not occur, the sodium will eventually boil. The temperature and pressure changes during the plant response phase are analyzed by the NALAP-II code. NALAP-II also calculates the SCDF (structural cumulative defect factor), an index of high-temperature creep, at key locations in the plant; however, its application is limited to parts whose geometry can be modeled as a cylindrical wall. 
    Hence, for the analysis of components with complicated shapes that require consideration of buckling, structure
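A cumulative creep index in the spirit of the SCDF described above can be illustrated with the classical time-fraction rule, where damage accumulates as the ratio of time spent at a condition to the rupture time at that condition. The rupture-time correlation below is a made-up placeholder, not NALAP-II's actual creep fit:

```python
from math import exp

def rupture_time_h(temp_k):
    """Hypothetical creep-rupture time vs temperature (hours).

    Placeholder exponential fit for illustration only; a real code would use
    a material-specific correlation that also depends on stress.
    """
    return 1.0e12 * exp(-temp_k / 60.0)

def cumulative_damage(history):
    """history: list of (hold_time_h, temp_k) segments; damage = sum t_i / t_r."""
    return sum(t / rupture_time_h(temp) for t, temp in history)

# Example transient: failure is predicted when the factor reaches 1.0.
damage = cumulative_damage([(2.0, 800.0), (1.0, 900.0), (0.5, 950.0)])
```

The index is useful precisely because short excursions to high temperature can contribute as much damage as long holds at moderate temperature.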

  12. Researching ICT-Based Enterprise in Developing Countries: Analytical Tools and Models

    OpenAIRE

    Heeks, R.

    2008-01-01

    This paper provides a guide for those researching ICT-based enterprises in developing countries. Examples of such enterprises include telecentres, cybercafés, mobile phone shops, Internet service providers, software companies, IT training firms, IT consultancies, hardware assemblers, data entry operators, and so forth. This may also be called the IT or ICT sector, or the digital or knowledge economy. The paper offers a series of "lenses" – i.e. analytical frameworks – through which to in...

  13. Cereals for developing gluten-free products and analytical tools for gluten detection

    OpenAIRE

    Cristina M. Rosell; Barro Losada, Francisco; C. Sousa; Mena, M.C.

    2014-01-01

    Recently, gluten-free foods have attracted much research interest, motivated by the growing market. Despite this motivation, it is necessary to have a scientific basis for developing gluten-free foods, along with tools for detecting the peptide sequences that can be immunotoxic to some individuals. This review focuses primarily on the cereal-based commodities available for developing gluten-free blends, considering naturally gluten-free cereals in addition t...

  14. Development of Integrated Analytical Tools for Level-2 PSA of LMFBR

    International Nuclear Information System (INIS)

    JNES has developed its own safety analysis methods for the LMFBR in order to make safety analyses independently of the applicant and support the regulatory body. These computer codes cover the plant response phase, the core disruption phase and the containment vessel response phase of severe accidents. In addition to the codes, the PRD (Phenomenological Relationship Diagram) method was devised as a logical method to identify the probability distributions at branching points in event trees for level-2 PSA. After validation of these codes using various experimental data and many trial calculations for an actual reactor system, the prepared tools were applied to the level-2 PSA of Monju to evaluate the effectiveness of its accident management measures. (author)

  15. A Qualitative Evaluation of Evolution of a Learning Analytics Tool

    Science.gov (United States)

    Ali, Liaqat; Hatala, Marek; Gasevic, Dragan; Jovanovic, Jelena

    2012-01-01

    LOCO-Analyst is a learning analytics tool we developed to provide educators with feedback on students' learning activities and performance. Evaluation of the first version of the tool led to enhancements in its data visualization, user interface, and supported feedback types. The second evaluation of the improved tool allowed us to see…

  16. Development of reliable analytical tools for evaluating the influence of reductive winemaking on the quality of Lugana wines.

    Science.gov (United States)

    Mattivi, Fulvio; Fedrizzi, Bruno; Zenato, Alberto; Tiefenthaler, Paolo; Tempesta, Silvano; Perenzoni, Daniele; Cantarella, Paolo; Simeoni, Federico; Vrhovsek, Urska

    2012-06-30

    This paper presents the development of important analytical tools, namely sensitive and rapid methods for analysing reduced and oxidised glutathione (GSH and GSSG), hydroxycinnamic acids (HCA), bound thiols (GSH-3MH and Cys-3MH) and free thiols (3MH and 3MHA), and their first application in evaluating the effect of reductive winemaking on the composition of Lugana juices and wines. Lugana is a traditional white wine from the Lake Garda region (Italy), produced using a local grape variety, Trebbiano di Lugana. An innovative winemaking procedure based on preliminary cooling of grape berries followed by crushing in an inert environment was implemented and explored on a winery scale. The effects of these procedures on hydroxycinnamic acids, GSH, GSSG, free and bound thiols and flavanol content were investigated. The juices and wines produced using different protocols were examined. Moreover, wines aged in tanks for 1, 2 and 3 months were analysed. The high level of GSH found in Lugana grapes, which can act as a natural antioxidant and be preserved in must and young wines, thus reducing the need for exogenous antioxidants, was particularly interesting. Moreover, it was clear that polyphenol concentrations (hydroxycinnamic acids and catechins) were strongly influenced by winemaking and pressing conditions, which required fine-tuning of pressing. Above-threshold levels of 3-mercaptohexan-1-ol (3MH) and 3-mercaptohexyl acetate (3MHA) were found in the wines and changed according to the winemaking procedure applied. Interestingly, their evolution during the first three months also varied depending on the procedure adopted. Organic synthesis of cysteine and glutathione conjugates was carried out, and juices and wines were subjected to LC-MS/MS analysis. These two molecules appeared to be strongly affected by the winemaking procedure but did not show any significant change during the first 3 months of post-bottling ageing. 
This supports the theory

  17. Analytic tools for information warfare

    Energy Technology Data Exchange (ETDEWEB)

    Vandewart, R.L.; Craft, R.L.

    1996-05-01

    Information warfare and system surety (tradeoffs between system functionality, security, safety, reliability, cost, usability) have many mechanisms in common. Sandia's experience has shown that an information system must be assessed from a system perspective in order to adequately identify and mitigate the risks present in the system. While some tools are available to help in this work, the process is largely manual. An integrated, extensible set of assessment tools would help the surety analyst. This paper describes one approach to surety assessment used at Sandia, identifies the difficulties in this process, and proposes a set of features desirable in an automated environment to support this process.

  18. Internet promotion tools and techniques: analytical review

    Directory of Open Access Journals (Sweden)

    S.M. Illiashenko

    2015-09-01

    Full Text Available The aim of the article. The aim of the article is to analyse and systematize modern communication tools of Internet marketing and to develop recommendations for their use in promoting products in a virtual environment and maintaining a high level of communication with economic partners and contact groups. The results of the analysis. A systematic analysis and systematization of known Internet marketing tools was carried out. The authors divide them into 8 categories by functionality: Search Engine Marketing, Internet advertising, Social Relationship Marketing, Viral Marketing, Video Marketing, E-mail Marketing, Innovative Marketing and Analytical Marketing. Recommendations for the use of these tools by companies of various sizes are proposed, and the most popular Internet instruments for product promotion are noted. Based on the results of the analysis, the communication instruments of Internet marketing are divided into 4 closely interrelated groups. Their combined use leads to a synergistic effect that appears as profit growth, consumer interest and the creation of a positive company image. Today, e-mail marketing, a once-forgotten method of communication, as well as interactive infographics, communication in the form of stories, marketing in social networks and Analytical Marketing, have seen unexpected development. These instruments satisfy the needs of companies (the possibility of a solid presentation, an active communication link and its precise measurement) and of consumers (interesting content, supported by visual images and information on request). Conclusions and directions for future research. The results can be used as methodological assistance in choosing rational sets of Internet marketing instruments that take into account the specifics of a producing company (seller), its products, market and target audience. Future research should be directed to the detection of inexpensive but effective Internet communication tools, detection

  19. Aptamers: molecular tools for analytical applications.

    Science.gov (United States)

    Mairal, Teresa; Ozalp, Veli Cengiz; Lozano Sánchez, Pablo; Mir, Mònica; Katakis, Ioanis; O'Sullivan, Ciara K

    2008-02-01

    Aptamers are artificial nucleic acid ligands, specifically generated against certain targets, such as amino acids, drugs, proteins or other molecules. In nature they exist as a nucleic acid based genetic regulatory element called a riboswitch. For generation of artificial ligands, they are isolated from combinatorial libraries of synthetic nucleic acid by exponential enrichment, via an in vitro iterative process of adsorption, recovery and reamplification known as systematic evolution of ligands by exponential enrichment (SELEX). Thanks to their unique characteristics and chemical structure, aptamers offer themselves as ideal candidates for use in analytical devices and techniques. Recent progress in the aptamer selection and incorporation of aptamers into molecular beacon structures will ensure the application of aptamers for functional and quantitative proteomics and high-throughput screening for drug discovery, as well as in various analytical applications. The properties of aptamers as well as recent developments in improved, time-efficient methods for their selection and stabilization are outlined. The use of these powerful molecular tools for analysis and the advantages they offer over existing affinity biocomponents are discussed. Finally the evolving use of aptamers in specific analytical applications such as chromatography, ELISA-type assays, biosensors and affinity PCR as well as current avenues of research and future perspectives conclude this review. PMID:17581746

  20. Tool Capability in Visual EAM Analytics

    Directory of Open Access Journals (Sweden)

    Dierk Jugel

    2015-04-01

    Full Text Available Enterprise Architectures (EA) consist of a multitude of architecture elements, which relate to each other in manifold ways. As a change to a single element thus impacts various other elements, mechanisms for architecture analysis are important to stakeholders. The high number of relationships complicates architecture analysis, making it a complex yet important task. In practice, EAs are often analyzed using visualizations. This article contributes to the field of visual analytics in enterprise architecture management (EAM) by reviewing how state-of-the-art EAM software platforms support stakeholders in providing and visualizing the "right" information for decision-making tasks. We investigate the collaborative decision-making process in a research study with master's students using professional EAM tools. We evaluate the students' findings by comparing them with the experience of an enterprise architect.

  1. Analytical tools in accelerator physics

    Energy Technology Data Exchange (ETDEWEB)

    Litvinenko, V.N.

    2010-09-01

    This paper is a subset of my lectures presented in the Accelerator Physics course (USPAS, Santa Rosa, California, January 14-25, 2008). It is based on notes I wrote during the period from 1976 to 1979 in Novosibirsk. Only a few copies (in Russian) were distributed to my colleagues at the Novosibirsk Institute of Nuclear Physics. The goal of these notes is a complete description, starting from an arbitrary reference orbit, of explicit expressions for the 4-potential and the accelerator Hamiltonian, finishing with parameterization in action and angle variables. To a large degree they follow the logic developed in Theory of Cyclic Particle Accelerators by A.A. Kolomensky and A.N. Lebedev [Kolomensky], but go beyond the book in a number of directions. One unusual feature of these notes is the use of matrix functions and the Sylvester formula for calculating the matrices of arbitrary elements. Teaching the USPAS course motivated me to translate a significant part of my notes into English. I also included some introductory materials following Classical Theory of Fields by L.D. Landau and E.M. Lifshitz [Landau]. A large number of short notes covering various techniques are placed in the Appendices.
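The Sylvester formula mentioned in the abstract can be demonstrated directly. For a matrix A with distinct eigenvalues, f(A) = sum_i f(lam_i) * prod_{j != i} (A - lam_j I) / (lam_i - lam_j). This is a generic textbook sketch, not code from the notes:

```python
import numpy as np

def matrix_function(A, f):
    """Evaluate f(A) via Sylvester's formula (distinct eigenvalues assumed)."""
    lam = np.linalg.eigvals(A)
    n = len(lam)
    result = np.zeros_like(A, dtype=complex)
    for i in range(n):
        frob = np.eye(n, dtype=complex)   # Frobenius covariant for lam_i
        for j in range(n):
            if j != i:
                frob = frob @ (A - lam[j] * np.eye(n)) / (lam[i] - lam[j])
        result += f(lam[i]) * frob
    return result.real  # real output for a real matrix with real spectrum

A = np.array([[2.0, 1.0], [1.0, 2.0]])   # eigenvalues 1 and 3
expA = matrix_function(A, np.exp)
```

For this symmetric example, exp(A) has the closed form [[(e^3+e)/2, (e^3-e)/2], [(e^3-e)/2, (e^3+e)/2]], which the Sylvester evaluation reproduces.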

  2. Leningrad NPP full scope and analytical simulators as tools for MMI improvement and operator support systems development and testing

    International Nuclear Information System (INIS)

    The Training Support Center (TSC) created at the Leningrad NPP (LNPP), Sosnovy Bor, Russia, incorporates full-scope and analytical simulators working in parallel with prototypes of expert and interactive systems, providing a new scope of R and D work on MMI improvement for the developer as well as the user. The paper describes the possibilities for developing, adjusting and testing any new or upgraded operator support system before its installation in the reference unit's control room. These simulators model a wide range of accidents and transients and provide special software and ETHERNET data-process communications with the operator support system prototypes. As an example, the paper describes the development and adjustment, using the simulators, of two state-of-the-art operator support systems. These systems have been developed jointly by an RRC KI and LNPP team. (author)

  3. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy processes: cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for users to benchmark each dairy product included in the tool, with the options differentiated by level of detail: 1) plant level; 2) process-group level; and 3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases established by reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011 and has been made available for free download from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as companion documentation for use with the BEST-Dairy tool. In addition, we carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to the U.S. have downloaded BEST-Dairy from the LBNL website. It is expected that the use of the BEST-Dairy tool will advance understanding of energy and
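The benchmarking arithmetic such a tool performs can be sketched as follows: compare a plant's measured energy intensity per unit of each product against a best-practice reference at the same production mix. The products, intensities and reference values below are invented placeholders, not BEST-Dairy or LBNL data:

```python
# plant: product -> (annual production in kg, measured energy intensity in MJ/kg)
plant = {"fluid_milk": (5.0e6, 0.55), "cheese": (1.2e6, 2.10)}

# Hypothetical best-practice reference intensities, MJ/kg
reference = {"fluid_milk": 0.40, "cheese": 1.60}

def savings_estimate(plant, reference):
    """Return (total energy use in MJ, potential savings in MJ vs reference)."""
    use = sum(kg * intensity for kg, intensity in plant.values())
    best = sum(kg * reference[product] for product, (kg, _) in plant.items())
    return use, max(use - best, 0.0)

use, savings = savings_estimate(plant, reference)
```

A process-group or process-step benchmark works the same way, only with the sums taken over finer-grained stages instead of whole products.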

  4. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

  5. Guidance for the Design and Adoption of Analytic Tools.

    Energy Technology Data Exchange (ETDEWEB)

    Bandlow, Alisa [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper summarizes guidelines, lessons learned and existing research to explain what is currently known about what analysts want, and how to better understand which tools they do and do not need.

  6. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase

  7. ‘Slag_Fun’ – A New Tool for Archaeometallurgy: Development of an Analytical (PED-XRF) Method for Iron-Rich Materials

    Directory of Open Access Journals (Sweden)

    Harald Alexander Veldhuijzen

    2003-11-01

    This paper describes the development of a new analytical tool for bulk chemical analysis of iron-rich archaeometallurgical remains by Polarising Energy Dispersive X-ray Fluorescence (PED-XRF). Prompted by the ongoing archaeological and archaeometric analyses of early first millennium BC iron smelting and smithing finds from Tell Hammeh (az-Zarqa, Jordan), the creation of this tool has already benefited several studies on iron-rich slag, of widely varying provenance as well as age (Anguilano 2002; Chirikure 2002; Ige and Rehren 2003; Stanway 2003). Following an explanation of the archaeological background and importance of the Hammeh finds, the paper describes the technical foundations of XRF analysis and the design, development and application of the "slag_fun" calibration method.

  8. DSAT: Data Storage and Analytics Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The aim of this project is the development of a large data warehousing and analysis tool for air traffic management (ATM) research that can be accessed by users...

  9. Analytical and Decision Support Tools for Genomics-Assisted Breeding

    OpenAIRE

    Varshney, Rajeev K; Singh, Vikas K; Hickey, John M; Xun, Xu; Marshall, David F.; Wang, Jun; Edwards, David; Ribaut, Jean-Marcel

    2016-01-01

    To successfully implement genomics-assisted breeding (GAB) in crop improvement programs, efficient and effective analytical and decision support tools (ADSTs) are 'must haves' to evaluate and select plants for developing next-generation crops. Here we review the applications and deployment of appropriate ADSTs for GAB, in the context of next-generation sequencing (NGS), an emerging source of massive genomic information. We discuss suitable software tools and pipelines for marker-based approac...

  10. Internet promotion tools and techniques: analytical review

    OpenAIRE

    S.M. Illiashenko; T.Ye. Ivanova

    2015-01-01

    The aim of the article. The aim of the article is to analyse and systematize modern Internet marketing communication tools and to develop recommendations for managing them to promote products in a virtual environment while maintaining the highest level of communication with economic partners and contact groups. The results of the analysis. A systematic analysis and systematization of the known Internet marketing tools was carried out. The authors divide them into 8 categories of th...

  11. Analytical Tools for Space Suit Design

    Science.gov (United States)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state of the art software and tools are discussed.

  12. Analytical tool development for coarse break-up of a molten jet in a deep water pool

    Energy Technology Data Exchange (ETDEWEB)

    Moriyama, Kiyofumi [Thermohydraulic Safety Research Group, Japan Atomic Energy Agency (JAEA), 2-4 Shirakata-shirane, Tokai-mura, Naka-gun, Ibaraki-ken 319-1195 (Japan)]. E-mail: moriyama.kiyofumi@jaea.go.jp; Nakamura, Hideo [Thermohydraulic Safety Research Group, Japan Atomic Energy Agency (JAEA), 2-4 Shirakata-shirane, Tokai-mura, Naka-gun, Ibaraki-ken 319-1195 (Japan); Maruyama, Yu [Thermohydraulic Safety Research Group, Japan Atomic Energy Agency (JAEA), 2-4 Shirakata-shirane, Tokai-mura, Naka-gun, Ibaraki-ken 319-1195 (Japan)

    2006-10-15

    A computer code, JASMINE-pre, was developed for the prediction of premixing conditions of fuel-coolant interactions and of debris bed formation behavior relevant to severe accidents of light water reactors. In the JASMINE-pre code, a melt model consisting of three sub-models, for the melt jet, melt particles and melt pool, is coupled with a two-phase flow model derived from the ACE-3D code developed at JAERI. The melt jet and melt pool models are one-dimensional representations of a molten core stream falling into a water pool and of a continuous melt body agglomerated on the bottom, respectively. The melt particles generated by the melt jet break-up are modeled based on a Lagrangian grouped particle concept. Additionally, a simplified model, pmjet, was developed which considers only steady state break-up of the melt jet, and the cooling and settlement of particles in a stationary water pool. The FARO corium quenching experiments with a saturation temperature water pool and a subcooled water pool were simulated with JASMINE-pre and pmjet. JASMINE-pre reproduced the pressurization and fragmentation behavior observed in the experiments with reasonable accuracy. The influences of model parameters on the pressurization and fragmentation were also examined. The calculation results showed a quasi-steady state phase of melt jet break-up during which the amount of molten mass contained in the premixture was kept almost constant, and the steady state molten premixed masses evaluated by JASMINE-pre and pmjet agreed well.

  13. Analytical framework and tool kit for SEA follow-up

    International Nuclear Information System (INIS)

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature, although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up (scoping, analysis and learning), identifies the key functions, and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate

  14. Landscape History and Theory: from Subject Matter to Analytic Tool

    OpenAIRE

    Birksted, Jan Kenneth

    2004-01-01

    This essay explores how landscape history can engage methodologically with the adjacent disciplines of art history and visual/cultural studies. Central to the methodological problem is the mapping of the beholder: spatially, temporally and phenomenologically. In this mapping process, landscape history is transformed from subject matter to analytical tool. As a result, landscape history no longer simply imports and applies ideas from other disciplines but develops its own metho...

  15. Evaluation of Linked Data tools for Learning Analytics

    NARCIS (Netherlands)

    Hendrik, Drachsler; Herder, Eelco; d'Aquin, Mathieu; Dietze, Stefan

    2013-01-01

    Drachsler, H., Herder, E., d'Aquin, M., & Dietze, S. (2013, 8-12 April). Evaluation of Linked Data tools for Learning Analytics. Presentation given in the tutorial on 'Using Linked Data for Learning Analytics' at LAK2013, the Third Conference on Learning Analytics and Knowledge, Leuven, Belgium.

  16. Using Business Intelligence Tools for Predictive Analytics in Healthcare System

    OpenAIRE

    Mihaela-Laura IVAN; Mircea Raducu TRIFU; Manole VELICANU; Cristian CIUREA

    2016-01-01

    The scope of this article is to highlight how healthcare analytics can be improved using Business Intelligence tools. The healthcare system has learned from previous lessons the necessity of using healthcare analytics for improving patient care, hospital administration, population growth and many other aspects. Business Intelligence solutions applied for the current analysis demonstrate the benefits brought by the new tools, such as SAP HANA, SAP Lumira, and SAP Predictive Analytics. In deta...

  17. Game development tool essentials

    CERN Document Server

    Berinstein, Paula; Ardolino, Alessandro; Franco, Simon; Herubel, Adrien; McCutchan, John; Nedelcu, Nicusor; Nitschke, Benjamin; Olmstead, Don; Robinet, Fabrice; Ronchi, Christian; Turkowski, Rita; Walter, Robert; Samour, Gustavo

    2014-01-01

    Offers game developers new techniques for streamlining the critical game tools pipeline. Inspires game developers to share their secrets and improve the productivity of the entire industry. Helps game industry practitioners compete in a hyper-competitive environment.

  18. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    Science.gov (United States)

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  19. Analytical Aerodynamic Simulation Tools for Vertical Axis Wind Turbines

    International Nuclear Information System (INIS)

    Wind power is a renewable energy source that is today the fastest growing solution to reduce CO2 emissions in the electric energy mix. The upwind horizontal axis wind turbine with three blades has been the preferred technical choice for more than two decades, and this horizontal axis concept today leads the market by a wide margin. The current PhD thesis covers an alternative type of wind turbine, with straight blades rotating about the vertical axis. A brief overview of the main differences between the horizontal and vertical axis concepts has been made. However, the main focus of this thesis is the aerodynamics of the wind turbine blades. Making aerodynamically efficient turbines starts with efficient blades, and making efficient blades requires a good understanding of the physical phenomena and effective simulation tools to model them. The specific aerodynamics of straight-bladed vertical axis turbine flow are reviewed together with the standard aerodynamic simulation tools that blade and rotor designers have used in the past. A reasonably fast (in terms of computer power) and accurate (in terms of comparison with experimental results) simulation method was still lacking in the field prior to the current work; this thesis aims at designing such a method. Analytical methods can be used to model complex flow if the geometry is simple. Therefore, a conformal mapping method is derived to transform any set of sections into a set of standard circles. Analytical procedures are then generalized to simulate moving multi-body sections in the complex vertical-axis flows, and the forces experienced by the blades are computed. Finally, the fast semi-analytical aerodynamic algorithm, boosted by fast multipole methods to handle a high number of vortices, is coupled with a simple structural model of the rotor to investigate potential aeroelastic instabilities. Together with these advanced simulation tools, a standard double multiple streamtube model has been developed and used to design several straight bladed
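The conformal mapping idea can be illustrated with the classical Joukowski transform, which maps a circle in one complex plane to an airfoil-like section in another. The thesis derives a more general mapping between arbitrary sections and standard circles; this is only the textbook special case, with made-up circle parameters.

```python
# Joukowski transform: a circle in the zeta-plane maps to an airfoil-like
# section in the z-plane. Circle centre/radius below are illustrative only.
import cmath

def joukowski(zeta, c=1.0):
    """Classical conformal map z = zeta + c^2 / zeta."""
    return zeta + c ** 2 / zeta

centre, radius = complex(-0.1, 0.1), 1.1   # slight offset gives camber
n = 8                                       # coarse sampling of the circle
circle = [centre + radius * cmath.exp(2j * cmath.pi * k / n) for k in range(n)]
airfoil = [joukowski(z) for z in circle]    # image points trace the section
```

Because the map is conformal away from its critical points, flow solutions computed around the standard circle can be transported back to the physical blade section, which is the core trick the thesis generalizes to multi-body configurations.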

  20. Web analytics tools and web metrics tools: An overview and comparative analysis

    OpenAIRE

    Ivan Bekavac; Daniela Garbin Praničević

    2015-01-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided in two sections. First, a qualitative focus is placed on reviewing web analytic...

  1. Using Business Intelligence Tools for Predictive Analytics in Healthcare System

    Directory of Open Access Journals (Sweden)

    Mihaela-Laura IVAN

    2016-05-01

    The scope of this article is to highlight how healthcare analytics can be improved using Business Intelligence tools. The healthcare system has learned from previous lessons the necessity of using healthcare analytics for improving patient care, hospital administration, population growth and many other aspects. Business Intelligence solutions applied for the current analysis demonstrate the benefits brought by the new tools, such as SAP HANA, SAP Lumira, and SAP Predictive Analytics. The birth rate is analyzed in detail, together with the contribution of different factors across the world.

  2. Electronic tongue: An analytical gustatory tool

    Directory of Open Access Journals (Sweden)

    Rewanthwar Swathi Latha

    2012-01-01

    Taste is an important organoleptic property governing the acceptance of products administered through the mouth. However, the majority of available drugs are bitter in taste. For patient acceptability and compliance, bitter-tasting drugs are masked by adding several flavoring agents. Taste assessment is thus one important quality control parameter for evaluating taste-masked formulations. The primary method for measuring the taste of drug substances and formulations is the human taste panel. The use of sensory panelists in industry is difficult and problematic owing to the potential toxicity of drugs and the subjectivity of taste panelists; recruiting panelists is hard, and motivation and panel maintenance become significantly more difficult when working with unpleasant products. Furthermore, molecules not approved by the Food and Drug Administration (FDA) cannot be tested. Therefore, an analytical taste-sensing multichannel sensory system, called the electronic tongue (e-tongue or artificial tongue), which can assess taste, has been replacing sensory panelists. The e-tongue thus offers benefits such as reduced reliance on human panels. The present review focuses on the electrochemical concepts in instrumentation, performance qualification of the e-tongue, and applications in various fields.

  3. Analytical tools for speciation in the field of toxicology

    International Nuclear Information System (INIS)

    The knowledge of the speciation of elements at trace and ultra-trace levels in biological and environmental media is essential to acquire a better understanding of the mechanisms of toxicity, transport and accumulation in which they are involved. Determining the speciation of an element in a given medium is challenging and requires knowledge of different methodological approaches: the calculation approach and the experimental approach through the use of dedicated analytical and spectroscopic tools. In this framework, this mini-review reports the approaches to investigate the speciation of elements in biological and environmental media as well as the experimental techniques of speciation analysis, illustrated by recent examples. The main analytical and spectroscopic techniques to obtain structural, molecular, elemental and isotopic information are described. A brief overview of separation techniques coupled with spectrometric techniques is given. Imaging and micro-localisation techniques, which aim at determining the in situ spatial distribution of elements and molecules in various solid samples, are also presented. The last part deals with the development of micro-analytical systems, since they open crucial perspectives for speciation analysis with low sample amounts and for analysis in the field. (orig.)

  4. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    Energy Technology Data Exchange (ETDEWEB)

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both integration of multiple data sources and development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and incorporated data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome as well as integrated security and role-based access-control for communicating
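The first algorithm above, the search for space-time data clusters, can be sketched as a naive scan that flags any case whose space-time neighbourhood is unusually crowded. The case records, radius, window and threshold below are hypothetical; real surveillance systems use statistically calibrated scan statistics rather than a fixed count.

```python
# Naive space-time cluster scan (toy data, fixed threshold).
from math import hypot

def find_clusters(cases, radius=1.0, window=3, threshold=4):
    """cases: list of (x, y, day). Return indices of cases whose space-time
    neighbourhood contains >= threshold cases (the case itself included)."""
    flagged = []
    for i, (xi, yi, ti) in enumerate(cases):
        n = sum(1 for (xj, yj, tj) in cases
                if hypot(xi - xj, yi - yj) <= radius and abs(ti - tj) <= window)
        if n >= threshold:
            flagged.append(i)
    return flagged

# Hypothetical case reports: sparse background plus one tight cluster.
background = [(10.0, 10.0, 1), (0.0, 9.0, 20), (9.0, 0.0, 40)]
cluster = [(0.0, 0.0, 5), (0.2, 0.1, 5), (0.1, 0.3, 6), (0.3, 0.2, 7)]
hits = find_clusters(background + cluster)
```

The scattered background cases never reach the threshold, while the four cases packed into a small area over three days all get flagged, which is the qualitative behaviour an outbreak detector needs.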

  5. FUMAC-84. A hybrid PCI analytical tool

    International Nuclear Information System (INIS)

    ''FUMAC-84'', a new computer code currently under development at Babcock and Wilcox, will be used to analyze PCMI in light water reactor fuel rods. This is a hybrid code in the sense that the pellet behaviour is predicted from deterministic models which incorporate the large data base being generated by the international fuel performance programs (OVERRAMP, SUPER-RAMP, NFIR, etc.), while the cladding is modelled using finite elements. The fuel cracking and relocation model developed for FUMAC is semi-empirical and includes data up to 35 GWd/mtU and linear heat rates ranging from 100 to 700 W/cm. With this model the onset of cladding ridging has been accurately predicted for steady-state operation. Transient behaviour of the pellet is still under investigation and the model is being enhanced to include these effects. The cladding model integrates the mechanical damage over a power history by solving the finite element assumed displacement problem in a quasistatic manner. Early work on FUMAC-84 has been directed at the development and benchmarking of the interim code. The purpose of the interim code is to provide a vehicle to prove out the deterministic pellet models which have been developed. To date the cracking model and the relocation model have been benchmarked. The thermal model for the pellet was developed by fitting data from several Halden experiments. The ability to accurately predict cladding ridging behaviour has been used to test how well the pellet swelling, densification and compliance models work in conjunction with fuel cladding material models. Reasonable results have been achieved for the steady-state cases, while difficulty has been encountered in trying to reproduce transient results. Current work includes an effort to improve the ability of the models to handle transients well. (author)

  6. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided in two sections. First, a qualitative focus is placed on reviewing web analytics tools to explore their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst’s efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and subsequently presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees of 200 Croatian firms working in either an IT or a marketing branch. The paper contributes by highlighting the support for management that the web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  7. Experimental and analytical tools for evaluation of Stirling engine rod seal behavior

    Science.gov (United States)

    Krauter, A. I.; Cheng, H. S.

    1979-01-01

    The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.

  8. Ultrafast 2D NMR: An Emerging Tool in Analytical Spectroscopy

    Science.gov (United States)

    Giraudeau, Patrick; Frydman, Lucio

    2014-06-01

    Two-dimensional nuclear magnetic resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan nature. A decade ago, a so-called ultrafast (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or heteronuclear correlation, in a single scan. During the intervening years, the performance of this subsecond 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool experiencing an expanded scope of applications. This review summarizes the principles and main developments that have contributed to the success of this approach and focuses on applications that have been recently demonstrated in various areas of analytical chemistry—from the real-time monitoring of chemical and biochemical processes, to extensions in hyphenated techniques and in quantitative applications.

  9. Performance Budgeting and Accrual Budgeting: Decision rules or Analytic Tools?

    OpenAIRE

    Allen Schick

    2007-01-01

    Performance budgeting and accrual budgeting are analytic tools that provide information and insights which are not available through conventional approaches. But neither innovation is ready for widespread application as a decision rule in the budget process. This article urges fuller understanding of these innovations and their implications, and more systematic use of performance and accrual information for policy makers

  10. Network analytical tool for monitoring global food safety highlights China.

    Directory of Open Access Journals (Sweden)

    Tamás Nepusz

    BACKGROUND: The Beijing Declaration on food safety and security was signed by over fifty countries with the aim of developing comprehensive programs for monitoring food safety and security on behalf of their citizens. Currently, comprehensive systems for food safety and security are absent in many countries, and the systems that are in place have been developed on different principles, allowing poor opportunities for integration. METHODOLOGY/PRINCIPAL FINDINGS: We have developed a user-friendly analytical tool based on network approaches for instant customized analysis of food alert patterns in the European dataset from the Rapid Alert System for Food and Feed. Data taken from alert logs between January 2003 and August 2008 were processed using network analysis to (i) capture complexity, (ii) analyze trends, and (iii) predict possible effects of interventions by identifying patterns of reporting activities between countries. The detector and transgressor relationships between countries are readily identifiable, and countries are ranked using (i) Google's PageRank algorithm and (ii) the HITS algorithm of Kleinberg. The program identifies Iran, China and Turkey as the transgressors with the largest number of alerts. However, when characterized by impact, taking into account the transgressor index and the number of countries involved, China predominates as a transgressor country. CONCLUSIONS/SIGNIFICANCE: This study reports the first development of a network analysis approach to inform countries on their transgressor and detector profiles as a user-friendly aid for the adoption of the Beijing Declaration. The ability to instantly access the country-specific components of the several thousand annual reports will enable each country to identify the major transgressors and detectors within its trading network. Moreover, the tool can be used to monitor trading countries for improved detector/transgressor ratios.
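The PageRank-style ranking of transgressor countries can be sketched in a few lines: build a directed graph of alert reports (detecting country pointing at reported country) and iterate the PageRank recurrence until it settles. The alert edges and country codes below are invented for illustration, not RASFF data, and a production analysis would use a graph library rather than this hand-rolled power iteration.

```python
# Toy PageRank over a directed "detector -> transgressor" alert graph.
def pagerank(edges, damping=0.85, iters=100):
    """Rank nodes of a directed graph given (source, target) edge pairs."""
    nodes = sorted({n for e in edges for n in e})
    out = {n: [] for n in nodes}
    for src, dst in edges:
        out[src].append(dst)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for src in nodes:
            targets = out[src] or nodes          # dangling nodes spread evenly
            share = damping * rank[src] / len(targets)
            for dst in targets:
                new[dst] += share
        rank = new
    return rank

# Hypothetical alerts: detecting country reports a transgressor country.
alerts = [("DE", "CN"), ("UK", "CN"), ("FR", "CN"), ("DE", "TR"), ("IT", "IR")]
ranks = pagerank(alerts)
top = max(ranks, key=ranks.get)   # country most often "pointed at"
```

A country accumulating many in-links (alerts against it) rises in rank, mirroring how the tool identifies dominant transgressors; HITS would instead produce separate "detector" (hub) and "transgressor" (authority) scores from the same graph.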

  11. An analytical framework and tool ('InteRa') for integrating the informal recycling sector in waste and resource management systems in developing countries.

    Science.gov (United States)

    Velis, Costas A; Wilson, David C; Rocca, Ondina; Smith, Stephen R; Mavropoulos, Antonis; Cheeseman, Chris R

    2012-09-01

    In low- and middle-income developing countries, the informal (collection and) recycling sector (here abbreviated IRS) is an important, but often unrecognised, part of a city's solid waste and resources management system. Recent evidence shows recycling rates of 20-30% achieved by IRS systems, reducing collection and disposal costs. They play a vital role in the value chain by reprocessing waste into secondary raw materials, providing a livelihood to around 0.5% of urban populations. However, persisting factual and perceived problems are associated with IRS (waste-picking): occupational and public health and safety (H&S), child labour, uncontrolled pollution, untaxed activities, crime and political collusion. Increasingly, incorporating IRS as a legitimate stakeholder and functional part of solid waste management (SWM) is attempted, further building recycling rates in an affordable way while also addressing the negatives. Based on a literature review and a practitioner's workshop, here we develop a systematic framework, or typology, for classifying and analysing possible interventions to promote the integration of IRS in a city's SWM system. Three primary interfaces are identified: between the IRS and the SWM system, the materials and value chain, and society as a whole; underlain by a fourth, which is focused on organisation and empowerment. To maximise the potential for success, IRS integration/inclusion/formalisation initiatives should consider all four categories in a balanced way and pay increased attention to their interdependencies, which are central to success, including specific actions, such as the IRS having access to source separated waste. A novel rapid evaluation and visualisation tool, the integration radar (diagram) or InteRa, is presented, aimed at illustrating the degree to which a planned or existing intervention considers each of the four categories. The tool is further demonstrated by application to 10 cases around the world, including a step

  12. Learning analytics: drivers, developments and challenges

    OpenAIRE

    Ferguson, Rebecca

    2012-01-01

    Learning analytics is a significant area of technology-enhanced learning that has emerged during the last decade. This review of the field begins with an examination of the technological, educational and political factors that have driven the development of analytics in educational settings. It goes on to chart the emergence of learning analytics, including their origins in the 20th century, the development of data-driven analytics, the rise of learning-focused perspectives and the influence ...

  13. Applications of Spatial Data Using Business Analytics Tools

    Directory of Open Access Journals (Sweden)

    Anca Ioana ANDREESCU

    2011-12-01

    Full Text Available This paper addresses the possibilities of using spatial data in business analytics tools, with emphasis on SAS software. Various kinds of map data sets containing spatial data are presented and discussed. Examples of map charts illustrating macroeconomic parameters demonstrate the application of spatial data for the creation of map charts in SAS Enterprise Guide. Extended features of map charts are exemplified by producing charts via SAS programming procedures.

  14. Visual-Analytics Tools for Analyzing Polymer Conformational Dynamics

    Science.gov (United States)

    Thakur, Sidharth; Tallury, Syamal; Pasquinelli, Melissa

    2010-03-01

    The goal of this work is to supplement existing methods for analyzing spatial-temporal dynamics of polymer conformations derived from molecular dynamics simulations by adapting standard visual-analytics tools. We intend to use these tools to quantify conformational dynamics and chemical characteristics at interfacial domains, and correlate this information to the macroscopic properties of a material. Our approach employs numerical measures of similarities and provides matrix- and graph-based representations of the similarity relationships for the polymer structures. We will discuss some numerical measures that encapsulate geometric and spatial attributes of polymer molecular configurations. These methods supply information on global and local relationships between polymer conformations, which can be used to inspect important characteristics of stable and persistent polymer conformations in specific environments. Initially, we have applied these tools to investigate the interface in polymer nanocomposites between a polymer matrix and carbon nanotube reinforcements and to correlate this information to the macroscopic properties of the material. The results indicate that our visual-analytic approach can be used to compare spatial dynamics of rigid and non-rigid polymers and properties of families of related polymers.
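    The matrix-based similarity representation described above can be sketched in a few lines; the RMSD measure and the toy conformations below are illustrative assumptions, not the authors' actual measures:

```python
import math

# Hypothetical illustration: a "conformation" is a list of (x, y, z) atom
# positions, and the similarity measure is the root-mean-square deviation
# (RMSD) between matching atoms -- one plausible geometric measure of the
# kind the abstract alludes to.

def rmsd(conf_a, conf_b):
    """RMSD between two conformations with identical atom ordering."""
    n = len(conf_a)
    total = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
                for (ax, ay, az), (bx, by, bz) in zip(conf_a, conf_b))
    return math.sqrt(total / n)

def similarity_matrix(conformations):
    """Symmetric matrix of pairwise RMSD values (matrix representation)."""
    n = len(conformations)
    return [[rmsd(conformations[i], conformations[j]) for j in range(n)]
            for i in range(n)]

# Three toy two-atom conformations
confs = [
    [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
    [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0)],
    [(0.0, 0.0, 0.0), (1.0, 0.0, 1.0)],
]
m = similarity_matrix(confs)
```

    Each entry m[i][j] quantifies the dissimilarity between conformations i and j; the graph-based views mentioned in the abstract can be derived by thresholding such a matrix.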

  15. Micromechanical String Resonators: Analytical Tool for Thermal Characterization of Polymers

    DEFF Research Database (Denmark)

    Bose, Sanjukta; Schmid, Silvan; Larsen, Tom; Keller, Stephan Sylvest; Sommer-Larsen, Peter; Boisen, Anja; Almdal, Kristoffer

    2014-01-01

    Resonant microstrings show promise as a new analytical tool for thermal characterization of polymers with only few nanograms of sample. The detection of the glass transition temperature (Tg) of an amorphous poly(d,l-lactide) (PDLLA) and a semicrystalline poly(l-lactide) (PLLA) is investigated. The...... polymers are spray coated on one side of the resonating microstrings. The resonance frequency and quality factor (Q) are measured simultaneously as a function of temperature. Change in the resonance frequency reflects a change in static tensile stress, which yields information about the Young’s modulus of...... the polymer, and a change in Q reflects the change in damping of the polymer-coated string. The frequency response of the microstring is validated with an analytical model. From the frequency independent tensile stress change, static Tg values of 40.6 and 57.6 °C were measured for PDLLA and PLLA...
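    The analytical string model the abstract validates against can be illustrated with the textbook relation for a stressed string; the device dimensions, stress, and density below are order-of-magnitude placeholders, not the paper's actual parameters:

```python
import math

# Sketch of the standard string resonator model: the resonance frequency of
# mode n for a string of length L, density rho, under tensile stress sigma is
#     f_n = (n / (2 L)) * sqrt(sigma / rho)
# All numerical values here are illustrative placeholders.

def string_frequency(length, stress, density, mode=1):
    """Resonance frequency (Hz) of mode n for a string under tension."""
    return mode / (2.0 * length) * math.sqrt(stress / density)

f1 = string_frequency(length=500e-6, stress=200e6, density=3.0e3)
```

    A drop in tensile stress, such as the softening at the glass transition, lowers the resonance frequency, which is the connection between frequency and Tg that the record describes.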

  16. Identity as an Analytical Tool to Explore Students’ Mathematical Writing

    DEFF Research Database (Denmark)

    Iversen, Steffen Møllegaard

    Learning to communicate in, with and about mathematics is a key part of learning mathematics (Niss & Højgaard, 2011). Therefore, understanding how students’ mathematical writing is shaped is important to mathematics education research. In this paper the notion of ‘writer identity’ (Ivanič, 1998......; Burgess & Ivanič, 2010) is introduced and operationalised in relation to students’ mathematical writing and through a case study it is illustrated how this analytic tool can facilitate valuable insights when exploring students’ mathematical writing....

  17. TNO monitoring plan development tool

    NARCIS (Netherlands)

    Sijacic, D.; Wildenborg, T.; Steeghs, P.

    2014-01-01

    TNO has developed a software tool that supports the design of a risk-based monitoring plan for a CO2 storage site. The purpose of the tool is to aid storage site operators by facilitating a structured selection or evaluation process for monitoring technologies.

  18. 33 CFR 385.33 - Revisions to models and analytical tools.

    Science.gov (United States)

    2010-07-01

    ... analytical tools. 385.33 Section 385.33 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying... and other analytical tools for conducting analyses for the planning, design, construction,...

  19. Environmental tools in product development

    DEFF Research Database (Denmark)

    Wenzel, Henrik; Hauschild, Michael Zwicky; Jørgensen, Jørgen; Alting, Leo

    A precondition for design of environmentally friendly products is that the design team has access to methods and tools supporting the introduction of environmental criteria in product development. A large Danish program, EDIP, is being carried out by the Institute for Product Development, Technical...... University of Denmark, in cooperation with 5 major Danish companies aiming at the development and testing of such tools. These tools are presented in this paper...

  20. Development and Verification of Behavior of Tritium Analytic Code (BOTANIC)

    International Nuclear Information System (INIS)

    VHTR, one of the Generation IV reactor concepts, has a relatively high operating temperature and is often suggested as a heat source for industrial processes, including hydrogen production. It is therefore vital to trace tritium behavior in the VHTR system and the potential permeation rate into the industrial process; tritium is a crucial safety issue in fission reactor systems, so understanding its behavior, and developing a tool that enables this, is essential. In this study, the Behavior of Tritium Analytic Code (BOTANIC), an analytic tool capable of analyzing tritium behavior, was developed using a chemical process code called gPROMS. The code has several distinctive features, including a non-diluted assumption, flexible application, and adoption of a distributed permeation model. These features give BOTANIC the capability to analyze a wide range of tritium-level systems with higher accuracy, as it can solve distributed models. BOTANIC was verified against analytic solutions and benchmark codes such as the Tritium Permeation Analysis Code (TPAC) and COMSOL; the results showed very good agreement with the analytical solutions and with the calculation results of TPAC and COMSOL. Future work will focus on total system verification.
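    The distributed permeation modelling BOTANIC performs is far richer than a single algebraic relation, but the basic diffusion-limited wall flux can be sketched with Richardson's law; the Arrhenius parameters and operating conditions below are placeholders, not values from BOTANIC or TPAC:

```python
import math

# Hedged sketch: diffusion-limited hydrogen-isotope permeation through a
# metal wall follows Richardson's law,
#     J = (P / d) * (sqrt(p_high) - sqrt(p_low)),
# where P is the (Arrhenius) permeability, d the wall thickness, and p the
# upstream/downstream partial pressures. P0 and Ea are placeholder values.

R = 8.314  # gas constant, J/(mol K)

def permeability(T, P0=1.0e-7, Ea=60_000.0):
    """Arrhenius permeability, mol m^-1 s^-1 Pa^-0.5 (placeholder P0, Ea)."""
    return P0 * math.exp(-Ea / (R * T))

def permeation_flux(T, p_high, p_low, thickness):
    """Tritium flux through the wall, mol m^-2 s^-1."""
    return permeability(T) * (math.sqrt(p_high) - math.sqrt(p_low)) / thickness

flux = permeation_flux(T=1100.0, p_high=100.0, p_low=0.0, thickness=2.0e-3)
```

    The flux vanishes when the pressures balance and grows with temperature, which is why tracing permeation into the industrial process matters at VHTR operating temperatures.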

  1. Web analytics as tool for improvement of website taxonomies

    DEFF Research Database (Denmark)

    Jonasen, Tanja Svarre; Ådland, Marit Kristine; Lykke, Marianne

    The poster examines how web analytics can be used to provide information about users and inform design and redesign of taxonomies. It uses a case study of the website Cancer.dk by the Danish Cancer Society. The society is a private organization with an overall goal to prevent the development of...... cancer, improve patients’ chances of recovery, and limit the physical, psychological and social side-effects of cancer. The website is the main channel for communication and knowledge sharing with patients, their relatives and professionals. The present study consists of two independent analyses, one...... using Google analytics focusing on searching and browsing activities, another using a home-grown transaction log developed to collect data about tagging, searching and browsing by tags. The log is set up to distinguish between tags added by editors and end-users respectively. Altogether, the study...

  2. Development of Nuclear Analytical Technology

    Energy Technology Data Exchange (ETDEWEB)

    Park, Yong Joon; Kim, J. Y.; Sohn, S. C. (and others)

    2007-06-15

    The pre-treatment and handling techniques for the micro-particles in swipe samples were developed for safeguards purposes. A screening technique for swipe samples was established using the nuclear fission track method as well as the alpha track method. A laser ablation system to extract a nuclear particle present in a swipe was designed and constructed for the determination of enrichment factors for uranium or plutonium, and its performance was tested in atmosphere as well as in vacuum. The optimum conditions for the synthesis of silica-based micro-particles were obtained for mass production. The optimum ion exchange resin was selected and the optimum conditions for uranium adsorption in the resin bead technique were established for the development of enrichment factors for nuclear particles in swipes. The established technique was applied to swipes taken directly from a nuclear facility and also to archive samples of the IAEA's environmental swipes. Neutron and secondary gamma-ray dose rates for the radiation shields were evaluated to design the NIPS system, as was the thermal neutron concentration effect of various reflectors. A D-D neutron generator was introduced as the neutron source for the NIPS system because it offers advantages over a {sup 252}Cf source, such as easier control and moderation capability. Simulated samples of explosives and chemical warfare agents were prepared to construct a prompt gamma-ray database. Based on this database, a computer program for the detection of illicit chemical and nuclear materials was developed using the MATLAB software.

  3. New analytical tools combining gel electrophoresis and mass spectrometry

    OpenAIRE

    Tobolkina, Elena

    2014-01-01

    Proteomics has been one of the main projects challenging biological and analytical chemists for many years. The separation, identification and quantification of all the proteins expressed within biological systems remain the main objectives of proteomics. Due to sample complexity, the development of fractionation, separation, purification and detection techniques that possess appropriate resolution to separate a large number of proteins, as well as being sensitive and fast enough for high thr...

  4. Analytical tools for monitoring and control of fermentation processes

    OpenAIRE

    Sundström, Heléne

    2007-01-01

    The overall objective of this work has been to adopt new developments and techniques in the area of measurement, modelling and control of fermentation processes. Flow cytometry and software sensors are techniques which were considered ready for application and the focus was set on developing tools for research aiming at understanding the relationship between measured variables and process quality parameters. In this study fed-batch cultivations have been performed with two different strains o...

  5. Android development tools for Eclipse

    CERN Document Server

    Shah, Sanjay

    2013-01-01

    A standard tutorial aimed at developing Android applications in a practical manner. Android Development Tools for Eclipse is aimed at beginners and existing developers who want to learn more about Android development. It is assumed that you have experience in Java programming and that you have used an IDE for development.

  6. Distributed Engine Control Empirical/Analytical Verification Tools

    Science.gov (United States)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to assemble easily a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration to all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating DCS (distributed engine control systems) components onto existing and next-generation engines.The distributed engine control simulator blockset for MATLAB/Simulink and hardware simulator provides the capability to simulate virtual subcomponents, as well as swap actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the CMAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique

  7. Investigating Analytic Tools for e-Book Design in Early Literacy Learning

    Science.gov (United States)

    Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah

    2009-01-01

    Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…

  8. Foresight, Competitive Intelligence and Business AnalyticsTools for Making Industrial Programmes More Efficient

    OpenAIRE

    Jonathan, Calof; Gregory, Richards; Jack, Smith

    2015-01-01

    Creating industrial programmes, especially in technology, is fraught with high levels of uncertainty. These programmes target the development of products that will not be sold for several years; therefore, one of the risks is that the products will no longer be in demand due to the emergence of more advanced technologies. The paper proposes an integrated approach involving the complementary functions of foresight, intelligence and business analytics. The tools of foresight and intelligence ar...

  9. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    For the past 25 years the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private sector businesses to maximize the value of the mass of information to discover and disseminate actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts; safeguarding communities, organizations, infrastructures, and investments. The collaborative Intelligence Analysis environment delivered by i2 is specifically designed to be: · scalable: supporting business needs as well as operational and end user environments · modular: an architecture which can deliver maximum operational flexibility with ability to add complimentary analytics · interoperable: integrating with existing environments and eases information sharing across partner agencies · extendable: providing an open source developer essential toolkit, examples, and documentation for custom requirements i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry leading multidimensional analytics that can be run on-demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster informed decision making. (author)

  10. Urban Development Tools in Denmark

    DEFF Research Database (Denmark)

    Aunsborg, Christian; Enemark, Stig; Sørensen, Michael Tophøj

    2005-01-01

    The article contains the following sections: 1. Urbax and the Danish Planning system 2. Main Challenges in the Urban Development 3. Coordination and Growth (Management) Policies and Spatial Planning Policies 4. Coordination of Market Events and Spatial Planning 5. The application of Urban Development Tools...

  11. Developing ICALL Tools Using GATE

    Science.gov (United States)

    Wood, Peter

    2008-01-01

    This article discusses the use of the General Architecture for Text Engineering (GATE) as a tool for the development of ICALL and NLP applications. It outlines a paradigm shift in software development, which is mainly influenced by projects such as the Free Software Foundation. It looks at standards that have been proposed to facilitate the…

  12. Ultrasonic trap as analytical tool; Die Ultraschallfalle als analytisches Werkzeug

    Energy Technology Data Exchange (ETDEWEB)

    Leiterer, Jork

    2009-11-30

    nanoparticles. This comprises fields of research like biomineralisation, protein agglomeration, distance dependent effects of nanocrystalline quantum dots and the in situ observation of early crystallization stages. In summary, the results of this work open a broad area of application to use the ultrasonic trap as an analytical tool. [Translated from German] The ultrasonic trap offers a distinctive way of handling samples on the microlitre scale. Acoustic levitation positions the sample contact-free in a gaseous environment, removing the influence of solid surfaces. In this work, the possibilities of the ultrasonic trap for use in analytical chemistry are investigated experimentally. By coupling it with typical contactless analysis methods such as spectroscopy and X-ray scattering, the advantages of this levitation technique are demonstrated on a range of materials, from inorganic, organic and pharmaceutical substances to proteins, nanoparticles and microparticles. It is shown that acoustic levitation reliably enables contact-free sample handling for spectroscopic methods (LIF, Raman) and, for the first time, for X-ray scattering (EDXD, SAXS, WAXS) and X-ray fluorescence (XRF, XANES) methods. For all of these methods the wall-less sample mounting proved advantageous: the results are comparable to those obtained with conventional sample holders and in some cases surpass them in data quality. A particular success is the integration of the acoustic levitator into the experimental set-ups of the synchrotron beamlines. The use of the ultrasonic trap at BESSY was established within this work and currently forms the basis of intensive interdisciplinary research. In addition, the trap's potential for preconcentration was recognised and applied to the study of evaporation-controlled processes.

  13. Neutron activation analysis as analytical tool of environmental issue

    International Nuclear Information System (INIS)

    Neutron activation analysis (NAA) is applicable to samples from a wide range of research fields, such as materials science, biology, and geochemistry. Given its capability for trace analysis and non-destructive determination, NAA is particularly well suited to samples available only in small amounts or too precious to consume. In this paper, NAA of atmospheric particulate matter (PM) samples is discussed, with emphasis on the use of the resulting data as an analytical tool for environmental issues. The concentration of PM in air is usually very low, and it is not easy to collect a large amount of sample even with a high-volume air sampler, so highly sensitive NAA is well suited to determining elements in PM samples. In rural and remote areas the main components of PM are crust-derived silicates, whereas in urban areas carbonaceous materials and heavy metals are concentrated in PM because of automobile exhaust and other anthropogenic emission sources. The elemental pattern of PM thus reflects the condition of the air around the monitoring site, and trends in air pollution can be traced by periodic monitoring of PM by NAA. Elemental concentrations in air change with the seasons: for example, crustal elements increase in the dry season, and sea-salt components increase when wind from the sea is dominant. Elements emitted from anthropogenic sources are mainly contained in the fine fraction of PM and increase in concentration during winter, when emissions from heating systems are high and the air is stable. For further analysis and understanding of environmental issues, indicator elements for various emission sources, elemental concentration ratios of environmental samples, and source portion assignment techniques are useful. (author)
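    One of the source-assignment techniques alluded to above, the crustal enrichment factor, can be sketched as follows; the sample concentrations are invented and the crustal reference values are rounded illustrative numbers, not authoritative data:

```python
# Sketch of the crustal enrichment factor (EF): an element's ratio to a
# crustal reference element (here Al) in the PM sample is compared against
# the same ratio in average crust. EF >> 1 suggests a non-crustal
# (anthropogenic) source. All numbers below are illustrative placeholders.

CRUST = {"Al": 80_000.0, "Fe": 50_000.0, "Pb": 14.0}  # mg/kg, rounded

def enrichment_factor(sample, element, ref="Al"):
    """EF = (C_x / C_ref)_sample / (C_x / C_ref)_crust."""
    return (sample[element] / sample[ref]) / (CRUST[element] / CRUST[ref])

pm_sample = {"Al": 400.0, "Fe": 300.0, "Pb": 20.0}  # hypothetical NAA results

ef_fe = enrichment_factor(pm_sample, "Fe")  # near 1: crustal origin
ef_pb = enrichment_factor(pm_sample, "Pb")  # large: anthropogenic origin
```

    In this toy example Fe tracks the crustal ratio while Pb is strongly enriched, the pattern one would expect for automobile exhaust or heating emissions in an urban sample.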

  14. MODEL STRUCTURE DEVELOPMENT FOR TOOL REFERENCE AND ANLYTICAL GIS

    OpenAIRE

    Pisarev, V.

    2011-01-01

    A tool reference and analytical GIS is intended to meet public demand for systematized, target-selected information of long-term significance. This information volume should not be limited to a snapshot of the state of a specified subject field at one point in time. In general the information is rather homogeneous with respect to reliability, accuracy and adequacy, as it is professionally analyzed during the development of the system. The uniform information content of the tool reference and analytical GIS integr...

  15. Developing variations: : An analytical and historical perspective

    OpenAIRE

    Sirman, Berk

    2006-01-01

    ABSTRACT Berk Sirman: Developing Variations – An Analytical and Historical Perspective. Uppsala Universitet: Institutionen för musikvetenskap, uppsats för 60 p., 2006. Developing variations is a term by Arnold Schönberg that is coined to describe constant modification of motives and ideas in a theme, or possibly throughout the whole work. This is thought to be superior to exact repetitions. Developing variations was used by Schönberg to analyze the music of Brahms, whose compositions represen...

  16. Employability Skills Assessment Tool Development

    Science.gov (United States)

    Rasul, Mohamad Sattar; Rauf, Rose Amnah Abd; Mansor, Azlin Norhaini; Puvanasvaran, A. P.

    2012-01-01

    Research conducted nationally and internationally has found that technical graduates are lacking in employability skills. As employability skills are crucial in outcome-based education, the main goal of this research is to develop an Employability Skill Assessment Tool to help students and lecturers produce graduates competent in the employability skills needed by…

  17. Microplasmas for chemical analysis: analytical tools or research toys?

    International Nuclear Information System (INIS)

    An overview of the activities of the research groups that have been involved in fabrication, development and characterization of microplasmas for chemical analysis over the last few years is presented. Microplasmas covered include: miniature inductively coupled plasmas (ICPs); capacitively coupled plasmas (CCPs); microwave-induced plasmas (MIPs); a dielectric barrier discharge (DBD); microhollow cathode discharge (MHCD) or microstructure electrode (MSE) discharges; other microglow discharges (such as those formed between 'liquid' electrodes); microplasmas formed in micrometer-diameter capillary tubes for gas chromatography (GC) or high-performance liquid chromatography (HPLC) applications; and a stabilized capacitive plasma (SCP) for GC applications. Sample introduction into microplasmas, in particular into a microplasma device (MPD), battery operation of an MPD and of a mini-in-torch vaporization (ITV) microsample introduction system for MPDs, and questions of microplasma portability for use on site (e.g., in the field) are also briefly addressed using examples of current research. To emphasize the significance of sample introduction into microplasmas, some previously unpublished results from the author's laboratory have also been included. An overall assessment of the state-of-the-art of analytical microplasma research is provided.

  18. Factors Influencing Attitudes Towards the Use of CRM’s Analytical Tools in Organizations

    Directory of Open Access Journals (Sweden)

    Šebjan Urban

    2016-02-01

    Full Text Available Background and Purpose: Information solutions for analytical customer relationship management (aCRM IS) that include the use of analytical tools are becoming increasingly important, due to organizations’ need for knowledge of their customers and the ability to manage big data. The objective of the research is, therefore, to determine how the organizations’ orientations (process, innovation, and technology) as critical organizational factors affect the attitude towards the use of the analytical tools of aCRM IS.

  19. RainTools Software Development

    OpenAIRE

    Van Luijtelaar, Dirk Jan

    2015-01-01

    The aim of this Bachelor’s thesis was to develop the RainTools software package for the customer, Stichting RIONED, and learn about the process of software development. The main aim was to learn the capabilities and possibilities of C# in combination with WPF and XAML as opposed to regular WinForms. To achieve this, work began on developing a user interface for the customer to translate input data to XML, feed it to a third party application, read the resu...

  20. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    Science.gov (United States)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  1. DEVELOPING NEW TOOLS FOR POLICY ANALYSIS

    International Nuclear Information System (INIS)

    For the past three years, the Office of Security Policy has been aggressively pursuing substantial improvements in the U.S. Department of Energy (DOE) regulations and directives related to safeguards and security (S and S). An initial effort focused on areas where specific improvements could be made. This revision was completed during 2009 with the publication of a number of revised manuals. Developing these revisions involved more than 100 experts in the various disciplines involved, yet the changes made were only those that could be identified and agreed upon based largely on expert opinion. The next phase of changes will be more analytically based. A thorough review of the entire S and S directives set will be conducted using software tools to analyze the present directives with a view toward (1) identifying areas of positive synergism among topical areas, (2) identifying areas of unnecessary duplication within and among topical areas, and (3) identifying requirements that are less than effective in achieving the intended protection goals. This paper will describe the software tools available and in development that will be used in this effort. Some examples of the output of the tools will be included, as will a short discussion of the follow-on analysis that will be performed when these outputs are available to policy analysts.
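    One simple way to flag candidate duplication among directive sections, sketched here purely as an illustration (the abstract does not describe the actual software used), is cosine similarity over word-count vectors; the requirement texts below are invented:

```python
import math
from collections import Counter

# Illustrative duplicate-detection sketch: represent each directive section
# as a bag-of-words vector and score pairs by cosine similarity.
# Near-identical requirement texts score close to 1; unrelated ones near 0.

def cosine(a: str, b: str) -> float:
    """Cosine similarity between word-count vectors of two texts."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

s1 = "protective force personnel shall be trained annually"
s2 = "protective force personnel shall be trained annually and certified"
s3 = "waste shipments require tamper indicating devices"

dup_score = cosine(s1, s2)   # high: near-duplicate requirements
unrelated = cosine(s1, s3)   # low: different topics
```

    Pairs scoring above a chosen threshold would be queued for a policy analyst to review, matching the "duplication within and among topical areas" goal described above.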

  2. Analytic Tools for Evaluating Variability of Standard Errors in Large-Scale Establishment Surveys

    Directory of Open Access Journals (Sweden)

    Cho MoonJung

    2014-12-01

    Full Text Available Large-scale establishment surveys often exhibit substantial temporal or cross-sectional variability in their published standard errors. This article uses a framework defined by survey generalized variance functions to develop three sets of analytic tools for the evaluation of these patterns of variability. These tools are for (1) identification of predictor variables that explain some of the observed temporal and cross-sectional variability in published standard errors; (2) evaluation of the proportion of variability attributable to the abovementioned predictors, equation error and estimation error, respectively; and (3) comparison of equation error variances across groups defined by observable predictor variables. The primary ideas are motivated and illustrated by an application to the U.S. Current Employment Statistics program.
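    The generalized variance function framework underlying these tools models published relvariances as a function of the estimate; a minimal sketch, assuming the common form v = a + b/x and invented data points (not the article's actual specification):

```python
# Sketch of a generalized variance function (GVF) fit: published
# relvariances v are modeled as v = a + b / x, where x is the survey
# estimate (e.g., an employment level). Ordinary least squares on the
# transformed predictor z = 1/x recovers (a, b). Data points are synthetic.

def fit_gvf(estimates, relvariances):
    """OLS fit of v = a + b/x; returns (a, b)."""
    n = len(estimates)
    z = [1.0 / x for x in estimates]          # transformed predictor
    zbar = sum(z) / n
    vbar = sum(relvariances) / n
    szz = sum((zi - zbar) ** 2 for zi in z)
    szv = sum((zi - zbar) * (vi - vbar) for zi, vi in zip(z, relvariances))
    b = szv / szz
    a = vbar - b * zbar
    return a, b

# Synthetic points generated exactly from v = 0.0001 + 50/x
xs = [1_000.0, 5_000.0, 20_000.0, 100_000.0]
vs = [0.0001 + 50.0 / x for x in xs]
a, b = fit_gvf(xs, vs)
```

    In practice the residuals of such a fit decompose into the equation error and estimation error components the article's second and third tool sets evaluate.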

  3. Social Robustness as Analytical Tool or Normative Standard?

    OpenAIRE

    Hansen, Janus

    2010-01-01

    A recent issue of STI-Studies (vol. 5, no. 2) contained two articles, which both addressed the so-called ‘Mode 2-diagnosis’ by Nowotny et al. (2001). In particular, they both made reference to the affiliated concept of ‘social robustness’. Given this topical overlap, the editors of STI-Studies encouraged the authors of the two articles to provide comments on each other’s paper. My own paper (Hansen 2009) is concerned primarily with the theoretical consistency and analytical value of the conce...

  4. Factors Influencing Attitudes Towards the Use of CRM’s Analytical Tools in Organizations

    OpenAIRE

    Šebjan Urban; Bobek Samo; Tominc Polona

    2016-01-01

    Background and Purpose: Information solutions for analytical customer relationship management CRM (aCRM IS) that include the use of analytical tools are becoming increasingly important, due to organizations’ need for knowledge of their customers and the ability to manage big data. The objective of the research is, therefore, to determine how the organizations’ orientations (process, innovation, and technology) as critical organizational factors affect the attitude towards the use of the analytic...

  5. Application of quantum dots as analytical tools in automated chemical analysis: A review

    Energy Technology Data Exchange (ETDEWEB)

    Frigerio, Christian; Ribeiro, David S.M.; Rodrigues, S. Sofia M.; Abreu, Vera L.R.G.; Barbosa, Joao A.C.; Prior, Joao A.V.; Marques, Karine L. [REQUIMTE, Laboratory of Applied Chemistry, Department of Chemical Sciences, Faculty of Pharmacy of Porto University, Rua Jorge Viterbo Ferreira, 228, 4050-313 Porto (Portugal); Santos, Joao L.M., E-mail: joaolms@ff.up.pt [REQUIMTE, Laboratory of Applied Chemistry, Department of Chemical Sciences, Faculty of Pharmacy of Porto University, Rua Jorge Viterbo Ferreira, 228, 4050-313 Porto (Portugal)

    2012-07-20

    Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals or quantum dots (QDs) are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding new important fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis and food quality control. Despite the enormous variety of applications that have been developed, the automation of QD-based analytical methodologies by resorting to automation tools such as continuous flow analysis and related techniques is hitherto very limited. Such automation would allow taking advantage of particular features of the nanocrystals, such as their versatile surface chemistry and ligand-binding ability, their aptitude to generate reactive species, and the possibility of encapsulation in different materials while retaining native luminescence, providing the means for the implementation of renewable chemosensors or even the utilisation of more drastic, stability-impairing reaction conditions. In this review, we provide insights into the analytical potential of quantum dots, focusing on prospects of their utilisation in automated flow-based and flow-related approaches and the future outlook of QD applications in chemical analysis.

  6. Application of quantum dots as analytical tools in automated chemical analysis: A review

    International Nuclear Information System (INIS)

    Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals or quantum dots (QDs) are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding new important fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis and food quality control. Despite the enormous variety of applications that have been developed, the automation of QD-based analytical methodologies by resorting to automation tools such as continuous flow analysis and related techniques is hitherto very limited. Such automation would allow taking advantage of particular features of the nanocrystals, such as their versatile surface chemistry and ligand-binding ability, their aptitude to generate reactive species, and the possibility of encapsulation in different materials while retaining native luminescence, providing the means for the implementation of renewable chemosensors or even the utilisation of more drastic, stability-impairing reaction conditions. In this review, we provide insights into the analytical potential of quantum dots, focusing on prospects of their utilisation in automated flow-based and flow-related approaches and the future outlook of QD applications in chemical analysis.

  7. An Analytical Study of Tools and Techniques for Movie Marketing

    Directory of Open Access Journals (Sweden)

    Garima Maik

    2014-08-01

    Full Text Available Bollywood, or the Hindi movie industry, is one of the fastest growing sectors in the media and entertainment space, creating numerous business and employment opportunities. Movies in India are a major source of entertainment for all sections of society. They face competition not only from other movie industries and movies but also from other sources of entertainment such as adventure sports, amusement parks, theatre and drama, and pubs and discothèques. A lot of man power, man hours, creative brains, and money are put in to build a quality feature film. Bollywood is the industry which continuously works towards providing the 7-billion-strong population with something new. So it is important for the movie and production team to stand out and grab the due attention of the maximum audience. Movie makers employ various tools and techniques today to market their movies, leaving no stone unturned. They roll out teasers, first looks, theatrical trailer releases, music launches, city tours, producer’s and director’s interviews, movie premieres, movie releases, post-release follow-ups, etc. to pull viewers to the cineplex. The audience today, which comprises mainly youth, requires photos, videos, meet-ups, gossip, debate, collaboration and content creation. These requirements of today’s generation are best fulfilled through digital platforms. However, traditional media like newspapers, radio, and television are by no means obsolete; they reach out to a mass audience and play a major role in effective marketing. This study aims at analysing these tools for their effectiveness. The objectives are fulfilled through a consumer survey. The study brings out the effectiveness and relative importance of the various tools employed by movie marketers to generate maximum returns on investment, using data reduction techniques like factor analysis and statistical techniques like the chi-square test, with data visualization using pie charts.
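    The chi-square testing mentioned in the abstract can be illustrated with an invented contingency table of age group versus most influential marketing tool; the counts below are hypothetical, not the survey's data.

    ```python
    from scipy.stats import chi2_contingency

    # Hypothetical survey counts: rows = age group, cols = most influential tool.
    #             trailer  social  press/TV
    observed = [[52, 81, 17],   # 18-25
                [48, 42, 30],   # 26-40
                [35, 18, 47]]   # 41+

    chi2, p, dof, expected = chi2_contingency(observed)
    print(dof, p < 0.05)  # 4 degrees of freedom; association is significant here
    ```

    A significant result would indicate that tool effectiveness differs by audience segment, which is the kind of relationship the study's survey analysis probes.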

  8. Social media analytics: a survey of techniques, tools and platforms

    OpenAIRE

    Batrinca, B.; Treleaven, P. C.

    2015-01-01

    This paper is written for (social science) researchers seeking to analyze the wealth of social media now available. It presents a comprehensive review of software tools for social networking media, wikis, really simple syndication feeds, blogs, newsgroups, chat and news feeds. For completeness, it also includes introductions to social media scraping, storage, data cleaning and sentiment analysis. Although principally a review, the paper also provides a methodology and a critique of social med...

  9. Information and Analytic Maintenance of Nanoindustry Development

    Directory of Open Access Journals (Sweden)

    Glushchenko Aleksandra Vasilyevna

    2015-05-01

    Full Text Available The successful course of nanotechnological development in many respects depends on the volume and quality of information provided to external and internal users. The objective of the present research is to reveal the information requirements of various groups of users for effective management of the nanotech industry and to define the ways of satisfying them most effectively. The authors also aim at developing a system of indicators characterizing the current state and the dynamic parameters of nanotech industry development. On the basis of the conducted research, the need for an information system of nanotech industry development is proved. The information interrelations of subjects of the nanotech industry are revealed for the development of the communicative function of accounting, which becomes dominant in comparison with the control function. The information needs of users of financial and non-financial information are defined. The stages of its introduction are described in detail, from determining the character, volume, list and degree of efficiency of information to creating a system of administrative reporting, analysis and control. The information and analytical system is focused on the general assessment of efficiency and the major economic indicators, the general tendencies of development of the nanotech industry, and possible reserves for increasing the efficiency of its functioning. The authors develop a system of indicators characterizing the advancement of the nanotech industry, allowing to estimate innovative activity in the sphere of nanotech industry, to calculate the intensity of nano-innovation costs, and to define the productivity and efficiency of the nanotech industry in a branch, a region, or the national economy in general.

  10. Median of patient results as a tool for assessment of analytical stability

    DEFF Research Database (Denmark)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Söletormos, Georg

    2015-01-01

    BACKGROUND: In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. METHODS: Patient data from daily internal control schemes were used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. RESULTS: Seventy-five percent of the twenty analytes achieved on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. DISCUSSION: Patient results applied in analytical quality performance...
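    A minimal sketch of the median-monitoring idea: compare each monthly median of patient results against a long-term target and flag months whose percentage deviation exceeds the allowable-bias specification. All values (the analyte, the medians, and the 0.9 % specification) are illustrative, not the paper's data.

    ```python
    # Hypothetical monthly patient-result medians for one analyte (e.g. sodium,
    # mmol/L) and an illustrative desirable specification for allowable bias.
    baseline = 140.0            # long-term median used as the stable target
    monthly_medians = {"Jan": 139.8, "Feb": 140.3, "Mar": 138.2, "Apr": 140.1}
    allowable_bias_pct = 0.9

    # Flag months whose median deviates from the target by more than the spec.
    flags = {m: abs(v - baseline) / baseline * 100 > allowable_bias_pct
             for m, v in monthly_medians.items()}
    print(flags)  # only March exceeds the desirable bias specification
    ```

    In practice the monthly medians come from thousands of routine patient results, which is what makes them a cheap, stable supplement to internal QC materials.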

  11. Development of Desktop Computing Applications and Engineering Tools on GPUs

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik Brandenborg; Glimberg, Stefan Lemvig; Hansen, Toke Jansen;

    , improved performance profiling tools and assimilation of results to academic and industrial partners in our network. Our approach calls for multi-disciplinary skills and understanding of hardware, software development, profiling tools and tuning techniques, analytical methods for analysis and development...... (GPUs) for high-performance computing applications and software tools in science and engineering, inverse problems, visualization, imaging, dynamic optimization. The goals are to contribute to the development of new state-of-the-art mathematical models and algorithms for maximum throughput performance...

  12. Analytics to Literacies: The Development of a Learning Analytics Framework for Multiliteracies Assessment

    Directory of Open Access Journals (Sweden)

    Shane Dawson

    2014-09-01

    Full Text Available The rapid advances in information and communication technologies, coupled with increased access to information and the formation of global communities, have resulted in interest among researchers and academics to revise educational practice to move beyond traditional ‘literacy’ skills towards an enhanced set of “multiliteracies” or “new media literacies”. Measuring the literacy of a population, in the light of its linkage to individual and community wealth and wellbeing, is essential to determining the impact of compulsory education. The opportunity now is to develop tools to assess individual and societal attainment of these new literacies. Drawing on the work of Jenkins and colleagues (2006) and notions of a participatory culture, this paper proposes a conceptual framework for how learning analytics can assist in measuring individual achievement of multiliteracies and how this evaluative process can be scaled to provide an institutional perspective of the educational progress in fostering these fundamental skills.

  13. Rhodobase, a meta-analytical tool for reconstructing gene regulatory networks in a model photosynthetic bacterium.

    Science.gov (United States)

    Moskvin, Oleg V; Bolotin, Dmitry; Wang, Andrew; Ivanov, Pavel S; Gomelsky, Mark

    2011-02-01

    We present Rhodobase, a web-based meta-analytical tool for analysis of transcriptional regulation in a model anoxygenic photosynthetic bacterium, Rhodobacter sphaeroides. The gene association meta-analysis is based on the pooled data from 100 R. sphaeroides whole-genome DNA microarrays. Gene-centric regulatory networks were visualized using the StarNet approach (Jupiter, D.C., VanBuren, V., 2008. A visual data mining tool that facilitates reconstruction of transcription regulatory networks. PLoS ONE 3, e1717) with several modifications. We developed a means to identify and visualize operons and superoperons. We designed a framework for the cross-genome search for transcription factor binding sites that takes into account the high GC-content and oligonucleotide usage profile characteristic of the R. sphaeroides genome. To facilitate reconstruction of directional relationships between co-regulated genes, we screened upstream sequences (−400 to +20 bp from start codons) of all genes for putative binding sites of bacterial transcription factors using a self-optimizing search method developed here. To test performance of the meta-analysis tools and transcription factor site predictions, we reconstructed selected nodes of the R. sphaeroides transcription factor-centric regulatory matrix. The test revealed regulatory relationships that correlate well with the experimentally derived data. The database of transcriptional profile correlations, the network visualization engine and the optimized search engine for transcription factor binding sites analysis are available at http://rhodobase.org. PMID:21070832
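    A toy version of the correlation step behind such gene-association meta-analysis: pool expression profiles across experiments, compute pairwise Pearson correlations, and keep edges above a threshold. The expression matrix below is simulated; Rhodobase's actual StarNet-based pipeline is considerably more elaborate.

    ```python
    import numpy as np

    # Simulated expression matrix: rows = genes, columns = pooled microarray
    # experiments (invented stand-ins for the R. sphaeroides profiles).
    rng = np.random.default_rng(0)
    base = rng.normal(size=20)
    expr = np.vstack([base,                                   # geneA
                      base + rng.normal(scale=0.1, size=20),  # geneB, co-regulated
                      rng.normal(size=20)])                   # geneC, unrelated

    # Gene-gene association network: edges where |Pearson r| clears a threshold.
    corr = np.corrcoef(expr)
    edges = [(i, j) for i in range(3) for j in range(i + 1, 3)
             if abs(corr[i, j]) > 0.8]
    print(edges)  # the co-regulated pair (0, 1) passes the threshold
    ```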

  14. Laser: a Tool for Optimization and Enhancement of Analytical Methods

    Energy Technology Data Exchange (ETDEWEB)

    Preisler, Jan

    1997-01-01

    In this work, we use lasers to enhance possibilities of laser desorption methods and to optimize coating procedure for capillary electrophoresis (CE). We use several different instrumental arrangements to characterize matrix-assisted laser desorption (MALD) at atmospheric pressure and in vacuum. In imaging mode, 488-nm argon-ion laser beam is deflected by two acousto-optic deflectors to scan plumes desorbed at atmospheric pressure via absorption. All absorbing species, including neutral molecules, are monitored. Interesting features, e.g. differences between the initial plume and subsequent plumes desorbed from the same spot, or the formation of two plumes from one laser shot are observed. Total plume absorbance can be correlated with the acoustic signal generated by the desorption event. A model equation for the plume velocity as a function of time is proposed. Alternatively, the use of a static laser beam for observation enables reliable determination of plume velocities even when they are very high. Static scattering detection reveals negative influence of particle spallation on MS signal. Ion formation during MALD was monitored using 193-nm light to photodissociate a portion of insulin ion plume. These results define the optimal conditions for desorbing analytes from matrices, as opposed to achieving a compromise between efficient desorption and efficient ionization as is practiced in mass spectrometry. In CE experiment, we examined changes in a poly(ethylene oxide) (PEO) coating by continuously monitoring the electroosmotic flow (EOF) in a fused-silica capillary during electrophoresis. An imaging CCD camera was used to follow the motion of a fluorescent neutral marker zone along the length of the capillary excited by 488-nm Ar-ion laser. The PEO coating was shown to reduce the velocity of EOF by more than an order of magnitude compared to a bare capillary at pH 7.0. The coating protocol was important, especially at an intermediate pH of 7.7. 
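    The EOF measurement described above amounts to fitting the marker's position against time and dividing the resulting velocity by the field strength. A sketch with invented marker positions and an assumed 300 V/cm field (neither is from the thesis):

    ```python
    import numpy as np

    # Hypothetical neutral-marker positions along the capillary over time,
    # as would be read off successive CCD frames.
    t = np.array([0., 30., 60., 90., 120.])       # time, s
    x = np.array([0., 0.45, 0.92, 1.36, 1.82])    # distance travelled, cm

    v = np.polyfit(t, x, 1)[0]                    # EOF velocity, cm/s (slope)
    E = 300.0                                     # field strength, V/cm (assumed)
    mu_eof = v / E                                # electroosmotic mobility, cm^2/(V*s)
    print(f"{mu_eof:.2e}")
    ```

    Tracking how this mobility drifts during a run is exactly how a coating's stability (here, the PEO layer) would be monitored.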
The increase of p

  15. Horses for courses: analytical tools to explore planetary boundaries

    Science.gov (United States)

    van Vuuren, D. P.; Lucas, P. L.; Häyhä, T.; Cornell, S. E.; Stafford-Smith, M.

    2015-09-01

    There is a need for further integrated research on developing a set of sustainable development objectives, based on the proposed framework of planetary boundaries indicators. The relevant research questions are divided in this paper into four key categories, related to (1) the underlying processes and selection of key indicators, (2) understanding the impacts of different exposure levels and the influence of connections between different types of impacts, (3) a better understanding of different response strategies, and (4) the available options to implement changes. Clearly, different categories of scientific disciplines and associated models exist that can contribute to the necessary analysis, noting that the distinctions between them are fuzzy. In the paper, we indicate both how different models relate to the four categories of questions and how further insights can be obtained by connecting the different disciplines (without necessarily fully integrating them). Research on integration can support planetary boundary quantification in a credible way, linking human drivers and social and biophysical impacts.

  16. An analytic model for tool trajectory error in 5-axis machining

    Directory of Open Access Journals (Sweden)

    B.S. So

    2008-12-01

    Full Text Available Purpose: This paper proposes an analytical method of evaluating the maximum error by modeling the exact tool path when the tool traverses a singular region in five-axis machining. Design/methodology/approach: It is known that the numerical control (NC) data obtained from the inverse kinematic transformation can generate singular positions, which involve incoherent movements on the rotary axes. Such movements cause unexpected errors and abrupt operations, resulting in scoring on the machined surface. To resolve this problem, previous methods have calculated several tool positions during a singular operation, using inverse kinematic equations to predict the tool trajectory and approximate the maximum error. This type of numerical approach, configuring the tool trajectory, requires a lot of computational time to obtain a sufficient number of tool positions in the singular region. We have derived an analytical equation for the tool trajectory in the singular area by modeling the tool operation, considering linear and nonlinear parts that are a general form of the tool trajectory in the singular area and that are suitable for all types of five-axis machine tools. In addition, evaluation of the maximum tool-path error shows high accuracy using our analytical model. Findings: In this study, we have separated the linear components of the tool trajectory from the nonlinear ones to propose a tool trajectory model that is applicable to any kind of five-axis machine. We have also proposed a method to calculate the maximum deviation error based on the proposed tool trajectory model. Practical implications: The algorithms proposed in this work can be used for evaluating NC data and for linearization of NC data with singularity. Originality/value: Our algorithm can be used to modify NC data, making the operation smoother and reducing any errors within tolerance.
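    For context, the numerical baseline that the paper's analytical model improves upon can be sketched as follows: sample the linearly interpolated rotary-axis motion between two NC blocks and measure its angular deviation from the ideal great-circle (slerp) orientation path. The A/C-table kinematic below is an assumption for illustration, not the paper's machine model, and the endpoints are assumed non-parallel.

    ```python
    import numpy as np

    def tool_dir(a, c):
        """Tool-axis direction for an assumed A/C table kinematic (radians)."""
        return np.array([np.sin(a) * np.sin(c), -np.sin(a) * np.cos(c), np.cos(a)])

    def max_deviation(a0, c0, a1, c1, n=200):
        """Maximum angle (rad) between the linearly interpolated rotary-axis
        motion and the ideal great-circle (slerp) orientation path."""
        d0, d1 = tool_dir(a0, c0), tool_dir(a1, c1)
        omega = np.arccos(np.clip(d0 @ d1, -1.0, 1.0))
        worst = 0.0
        for t in np.linspace(0.0, 1.0, n):
            actual = tool_dir(a0 + t * (a1 - a0), c0 + t * (c1 - c0))
            ideal = (np.sin((1 - t) * omega) * d0 + np.sin(t * omega) * d1) / np.sin(omega)
            ideal /= np.linalg.norm(ideal)
            worst = max(worst, np.arccos(np.clip(actual @ ideal, -1.0, 1.0)))
        return worst

    # Away from the singularity (C fixed) the interpolated motion is exact;
    # near the pole (small A, large C swing) the deviation becomes significant.
    print(max_deviation(0.2, 0.3, 0.8, 0.3), max_deviation(0.05, 0.0, 0.05, 3.0))
    ```

    The paper's contribution is to replace this dense sampling with a closed-form expression for the trajectory and its maximum deviation.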

  17. Status of immunoassay as an analytical tool in environmental investigations

    International Nuclear Information System (INIS)

    Immunoassay methods were initially applied in clinical situations where their sensitivity and selectivity were utilized for diagnostic purposes. In the 1970s, pesticide chemists realized the potential benefits of immunoassay methods for compounds difficult to analyze by gas chromatography. This transition of the technology has extended to the analysis of soil, water, food and other matrices of environmental and human exposure significance, particularly for compounds difficult to analyze by chromatographic methods. The utility of radioimmunoassays and enzyme immunoassays for environmental investigations was recognized in the 1980s by the U.S. Environmental Protection Agency (U.S. EPA) with the initiation of an immunoassay development programme. The U.S. Department of Agriculture (USDA) and the U.S. Food and Drug Administration (FDA) have investigated immunoassays for the detection of residues in food both from an inspection and a contamination prevention perspective. Environmental immunoassays are providing rapid screening information as well as quantitative information to fulfill rigorous data quality objectives for monitoring programmes.

  18. Horses for courses: analytical tools to explore planetary boundaries

    Science.gov (United States)

    van Vuuren, Detlef P.; Lucas, Paul L.; Häyhä, Tiina; Cornell, Sarah E.; Stafford-Smith, Mark

    2016-03-01

    There is a need for more integrated research on sustainable development and global environmental change. In this paper, we focus on the planetary boundaries framework to provide a systematic categorization of key research questions in relation to avoiding severe global environmental degradation. The four categories of key questions are those that relate to (1) the underlying processes and selection of key indicators for planetary boundaries, (2) understanding the impacts of environmental pressure and connections between different types of impacts, (3) better understanding of different response strategies to avoid further degradation, and (4) the available instruments to implement such strategies. Clearly, different categories of scientific disciplines and associated model types exist that can accommodate answering these questions. We identify the strengths and weaknesses of different research areas in relation to the question categories, focusing specifically on different types of models. We discuss that more interdisciplinary research is needed to increase our understanding by better linking human drivers and social and biophysical impacts. This requires better collaboration between relevant disciplines (associated with the model types), either by exchanging information or by fully linking or integrating them. As fully integrated models can become too complex, the appropriate type of model (the racehorse) should be applied for answering the target research question (the race course).

  19. The RESET tephra database and associated analytical tools

    Science.gov (United States)

    Bronk Ramsey, Christopher; Housley, Rupert A.; Lane, Christine S.; Smith, Victoria C.; Pollard, A. Mark

    2015-06-01

    An open-access database has been set up to support the research project studying the 'Response of Humans to Abrupt Environmental Transitions' (RESET). The main methodology underlying this project was to use tephra layers to tie together and synchronise the chronologies of stratigraphic records at archaeological and environmental sites. The database has information on occurrences, and chemical compositions, of glass shards from tephra and cryptotephra deposits found across Europe. The data includes both information from the RESET project itself and from the published literature. With over 12,000 major element analyses and over 3000 trace element analyses on glass shards, relevant to 80 late Quaternary eruptions, the RESET project has generated an important archive of data. When added to the published information, the database described here has a total of more than 22,000 major element analyses and nearly 4000 trace element analyses on glass from over 240 eruptions. In addition to the database and its associated data, new methods of data analysis for assessing correlations have been developed as part of the project. In particular an approach using multi-dimensional kernel density estimates to evaluate the likelihood of tephra compositions matching is described here and tested on data generated as part of the RESET project.
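    The kernel-density matching approach can be sketched with SciPy's multi-dimensional KDE: fit a density to a reference eruption's glass-shard compositions and compare candidate shards by their likelihood under it. The compositions below are invented two-element stand-ins (e.g. SiO2 and K2O wt%) for the full major- and trace-element data in the RESET database.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(1)

    # Invented glass-shard compositions for one reference eruption,
    # shaped (n_elements, n_shards) as gaussian_kde expects.
    reference = rng.normal(loc=[72.0, 4.5], scale=[0.4, 0.2], size=(60, 2)).T

    kde = gaussian_kde(reference)   # multi-dimensional kernel density estimate

    # Score two candidate shards by their density under the reference KDE.
    candidate_match = np.array([[71.9], [4.6]])
    candidate_other = np.array([[66.0], [2.0]])
    print(kde(candidate_match)[0] > kde(candidate_other)[0])  # True
    ```

    Ranking candidate eruptions by such likelihoods is the essence of the correlation-assessment method described, though the published approach also handles the much higher dimensionality of the real data.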

  20. OpenEssayist: a supply and demand learning analytics tool for drafting academic essays

    OpenAIRE

    Whitelock, Denise; Twiner, Alison; John T. E. Richardson; Field, Debora; Pulman, Stephen

    2015-01-01

    This paper focuses on the use of a natural language analytics engine to provide feedback to students when preparing an essay for summative assessment. OpenEssayist is a real-time learning analytics tool, which operates through the combination of a linguistic analysis engine that processes the text in the essay, and a web application that uses the output of the linguistic analysis engine to generate the feedback. We outline the system itself and present analysis of observed patterns of activit...

  1. Developing a Social Media Marketing tool

    OpenAIRE

    Valova, Olga

    2015-01-01

    The objective of the thesis is to develop a better, easier-to-use social media marketing tool that could be utilised in any business. By understanding and analysing how businesses use social media, as well as the currently available social media marketing tools, the aim is to design a tool with the maximum amount of features but with a simple and intuitive user interface. An agile software development life cycle was used throughout the creation of the tool. Qualitative analysis was used to analyse existing ...

  2. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Science.gov (United States)

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What analytical and... Prevention Equivalent Level of Safety Analysis § 102-80.120 What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including...

  3. Development of analytical levels of accounting information

    OpenAIRE

    Кивачук, Василий Сазанович; Дружинина, Евгения Олеговна

    2015-01-01

    A retrospective analysis of accounting as an accounting category has been carried out. The analytical levels of accounting information are presented. The necessity of forming an accounting system and managing intangible assets, as well as recognizing the shadow economy as an accounting object, has been substantiated.

  4. Complex reconfiguration - developing common tools

    International Nuclear Information System (INIS)

    Reconfiguring DOE sites, facilities, and laboratories to meet expected and evolving missions involves a number of disciplines and approaches formerly the preserve of private industry and defense contractors. This paper considers the process of identifying common tools for the various disciplines that can be exercised, assessed, and applied by team members to arrive at integrated solutions. The basic tools include: systems, hardware, software, and procedures that can characterize a site/facility's environment to meet organizational goals, safeguards and security, ES&H, and waste requirements. Other tools such as computer-driven inventory and auditing programs can provide traceability of materials and product as they are processed and require added protection and control. This paper will also discuss the use of integrated teams in a number of high technology enterprises that could be adopted by DOE in high profile programs from environmental remediation to weapons dismantling and arms control

  5. Ecotoxicity on a stick: A novel analytical tool for predicting the ecotoxicity of petroleum contaminated samples

    International Nuclear Information System (INIS)

    Hydrocarbons generally elicit toxicity via a nonpolar narcotic mechanism. Recent research suggests that chemicals acting by this mode invoke ecotoxicity when the molar concentration in organism lipid exceeds a critical threshold. Since ecotoxicity of nonpolar narcotic mixtures appears to be additive, the ecotoxicity of hydrocarbon mixtures thus depends upon: (1) the partitioning of individual hydrocarbons comprising the mixture from the environment to lipids and (2) the total molar sum of the constituent hydrocarbons in lipids. These insights have led previous investigators to advance the concept of biomimetic extraction as a novel tool for assessing potential narcosis-type or baseline ecotoxicity in aqueous samples. Drawing from this earlier work, the authors have developed a method to quantify Bioavailable Petroleum Hydrocarbons (BPH) in hydrocarbon-contaminated aqueous and soil/sediment samples. A sample is equilibrated with a solid phase microextraction (SPME) fiber that serves as a surrogate for organism lipids. The total moles of hydrocarbons that partition to the SPME fiber is then quantified using a simple GC/FID procedure. Research conducted to support the development and initial validation of this method will be presented. Results suggest that BPH analyses provide a promising, cost-effective approach for predicting the ecotoxicity of environmental samples contaminated with hydrocarbon mixtures. Consequently, BPH analyses may provide a valuable analytical screening tool for ecotoxicity assessment in product and effluent testing, environmental monitoring and site remediation applications
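    A toy version of the additive-narcosis arithmetic underlying the BPH concept: convert aqueous concentrations to lipid-phase molar burdens via lipid-water partition coefficients and compare the molar sum with a critical threshold. Every number below (concentrations, partition coefficients, the ~50 mmol/kg threshold) is illustrative, not a measured value.

    ```python
    # Hypothetical mixture: aqueous concentration and lipid-water partition
    # coefficient per hydrocarbon. All values are invented for illustration.
    hydrocarbons = {
        #  name          C_water (mmol/L)  K_lipid_water (L/kg lipid)
        "naphthalene":   (0.0020,          2300.0),
        "phenanthrene":  (0.0004,         13000.0),
        "toluene":       (0.0150,           420.0),
    }

    # Narcosis is assumed additive: sum the molar lipid burdens and compare
    # the total against an illustrative critical body residue.
    total_mmol_per_kg = sum(cw * k for cw, k in hydrocarbons.values())
    critical = 50.0   # mmol/kg lipid, illustrative threshold
    print(round(total_mmol_per_kg, 1), total_mmol_per_kg > critical)
    ```

    The SPME fiber in the described method plays the role of the lipid phase here: its total molar uptake is the measured analogue of `total_mmol_per_kg`.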

  6. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research.

    Science.gov (United States)

    Ercius, Peter; Alaidi, Osama; Rames, Matthew J; Ren, Gang

    2015-10-14

    Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been separately developed in both physical and biological science, although they share some basic concepts of ET. This review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated. PMID:26087941
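    The core ET idea, reconstructing an object from a tilt series of projections, can be sketched in 2-D with a toy phantom and unfiltered backprojection. This is only the conceptual skeleton; real ET adds filtering, tilt alignment, and handling of the missing wedge.

    ```python
    import numpy as np
    from scipy.ndimage import rotate

    # Toy 2-D "specimen" (a slice stands in for the 3-D volume): a centred blob.
    n = 64
    y, x = np.mgrid[:n, :n]
    c = (n - 1) / 2.0
    phantom = np.exp(-((x - c) ** 2 + (y - c) ** 2) / 30.0)

    # Forward model: project the specimen at a series of tilt angles.
    angles = range(0, 180, 10)
    sinogram = [rotate(phantom, a, reshape=False, order=1).sum(axis=0)
                for a in angles]

    # Unfiltered backprojection: smear each 1-D projection back across the
    # grid and rotate it into place; summing the smears approximates the object.
    recon = np.zeros_like(phantom)
    for a, p in zip(angles, sinogram):
        recon += rotate(np.tile(p, (n, 1)), -a, reshape=False, order=1)

    peak = np.unravel_index(recon.argmax(), recon.shape)
    print(peak)  # near the centre of the grid, where the blob sits
    ```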

  7. The Evolution of Analytical Hierarchy Process (AHP) as a Decision Making Tool in Property Sectors

    OpenAIRE

    Mohd Safian, Edie Ezwan; Nawawi, Abdul Hadi

    2011-01-01

    In the 1970s, the Analytical Hierarchy Process (AHP) was accidentally introduced by Saaty [4] as a tool to allocate resources and plan needs for the military. However, due to its ability to identify the weightage of variables efficiently in research, it has become popular in many sectors. Basically, AHP is a tool in decision making that arranges the variables into a hierarchical form in order to rank the importance of each variable, leading to the weightage calculation of the variables in...
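    The weight calculation at the heart of AHP is the principal eigenvector of the pairwise comparison matrix, checked by a consistency ratio. A sketch with illustrative Saaty-scale judgements for three property criteria (the criteria and matrix entries are invented):

    ```python
    import numpy as np

    # Pairwise comparison matrix for three criteria (e.g. price, location,
    # condition); reciprocal entries on the Saaty 1-9 scale, invented here.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 3.0],
                  [1/5, 1/3, 1.0]])

    # Priority weights: normalized principal (largest-eigenvalue) eigenvector.
    eigvals, eigvecs = np.linalg.eig(A)
    k = eigvals.real.argmax()
    w = eigvecs[:, k].real
    w = w / w.sum()

    # Consistency ratio CI/RI, with Saaty's random index RI = 0.58 for n = 3;
    # CR below 0.1 is conventionally taken as acceptably consistent.
    n = 3
    ci = (eigvals.real.max() - n) / (n - 1)
    cr = ci / 0.58
    print(np.round(w, 2), cr < 0.1)
    ```

    The weights rank the first criterion highest, mirroring how AHP turns pairwise judgements into a variable ranking.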

  8. MASCOT: Multi-Criteria Analytical SCOring Tool for ArcGIS Desktop

    OpenAIRE

    Pierre Lacroix; Helder Santiago; Nicolas Ray

    2014-01-01

    Multicriteria Analytical SCOring Tool (MASCOT) is a decision-support tool based on spatial analysis that can score items (points, lines, and polygons) as a function of their Euclidean distance to other data (points, lines, polygons, rasters). MASCOT is integrated with ArcGIS 9.3.1 and makes it possible to achieve a complete workflow that may include data preparation and grouping of factors by theme, weighting, scoring, post-processing and decision-making. To achieve the weighting process, the...
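A minimal sketch of the distance-then-score idea behind MASCOT, in plain Python rather than ArcGIS: items receive a score band from their Euclidean distance to the nearest target feature. The features and score thresholds below are invented for illustration.

```python
import math

# Hypothetical target features and items to score, as 2D points.
targets = [(0.0, 0.0), (10.0, 0.0)]
items = {"A": (1.0, 1.0), "B": (6.0, 0.0), "C": (20.0, 5.0)}

def nearest_distance(pt, features):
    """Euclidean distance from pt to the closest feature."""
    return min(math.dist(pt, f) for f in features)

def score(d, bands=((2.0, 3), (8.0, 2))):
    """Reclassify a distance into score bands (thresholds are
    illustrative; the analyst defines these per factor)."""
    for limit, s in bands:
        if d <= limit:
            return s
    return 1

scores = {name: score(nearest_distance(pt, targets))
          for name, pt in items.items()}
print(scores)  # {'A': 3, 'B': 2, 'C': 1}
```

Weighting would then combine such per-factor scores into a final ranking, much as in the AHP-based records above.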

  9. Program Development Tools and Infrastructures

    Energy Technology Data Exchange (ETDEWEB)

    Schulz, M

    2012-03-12

    Exascale class machines will exhibit a new level of complexity: they will feature an unprecedented number of cores and threads, will most likely be heterogeneous and deeply hierarchical, and offer a range of new hardware techniques (such as speculative threading, transactional memory, programmable prefetching, and programmable accelerators), which all have to be utilized for an application to realize the full potential of the machine. Additionally, users will be faced with less memory per core, fixed total power budgets, and sharply reduced MTBFs. At the same time, it is expected that the complexity of applications will rise sharply for exascale systems, both to implement new science possible at exascale and to exploit the new hardware features necessary to achieve exascale performance. This is particularly true for many of the NNSA codes, which are large and often highly complex integrated simulation codes that push the limits of everything in the system including language features. To overcome these limitations and to enable users to reach exascale performance, users will expect a new generation of tools that address the bottlenecks of exascale machines, that work seamlessly with the (set of) programming models on the target machines, that scale with the machine, that provide automatic analysis capabilities, and that are flexible and modular enough to overcome the complexities and changing demands of the exascale architectures. Further, any tool must be robust enough to handle the complexity of large integrated codes while keeping the user's learning curve low. With the ASC program, in particular the CSSE (Computational Systems and Software Engineering) and CCE (Common Compute Environment) projects, we are working towards a new generation of tools that fulfill these requirements and that provide our users as well as the larger HPC community with the necessary tools, techniques, and methodologies required to make exascale performance a reality.

  10. Program Development Tools and Infrastructures

    International Nuclear Information System (INIS)

    Exascale class machines will exhibit a new level of complexity: they will feature an unprecedented number of cores and threads, will most likely be heterogeneous and deeply hierarchical, and offer a range of new hardware techniques (such as speculative threading, transactional memory, programmable prefetching, and programmable accelerators), which all have to be utilized for an application to realize the full potential of the machine. Additionally, users will be faced with less memory per core, fixed total power budgets, and sharply reduced MTBFs. At the same time, it is expected that the complexity of applications will rise sharply for exascale systems, both to implement new science possible at exascale and to exploit the new hardware features necessary to achieve exascale performance. This is particularly true for many of the NNSA codes, which are large and often highly complex integrated simulation codes that push the limits of everything in the system including language features. To overcome these limitations and to enable users to reach exascale performance, users will expect a new generation of tools that address the bottlenecks of exascale machines, that work seamlessly with the (set of) programming models on the target machines, that scale with the machine, that provide automatic analysis capabilities, and that are flexible and modular enough to overcome the complexities and changing demands of the exascale architectures. Further, any tool must be robust enough to handle the complexity of large integrated codes while keeping the user's learning curve low. With the ASC program, in particular the CSSE (Computational Systems and Software Engineering) and CCE (Common Compute Environment) projects, we are working towards a new generation of tools that fulfill these requirements and that provide our users as well as the larger HPC community with the necessary tools, techniques, and methodologies required to make exascale performance a reality.

  11. Productive re-use of CSCL data and analytic tools to provide a new perspective on group cohesion

    OpenAIRE

    Reffay, Christophe; Teplovs, Christopher; Blondel, François-Marie

    2011-01-01

    The goals of this paper are twofold: (1) to demonstrate how previously published data can be re-analyzed to gain a new perspective on CSCL dynamics and (2) to propose a new measure of social cohesion that was developed through improvements to existing analytic tools. In this study, we downloaded the Simuligne corpus from the publicly available Mulce repository. We improved the Knowledge Space Visualizer (KSV) to deepen the notion of cohesion by using a dynamic representation of sociograms. Th...

  12. The Use of Economic Analytical Tools in Quantifying and Measuring Educational Benefits and Costs.

    Science.gov (United States)

    Holleman, I. Thomas, Jr.

    The general objective of this study was to devise quantitative guidelines that school officials can accurately follow in using benefit-cost analysis, cost-effectiveness analysis, ratio analysis, and other similar economic analytical tools in their particular local situations. Specifically, the objectives were to determine guidelines for the…

  13. (Re)braiding to Tell: Using "Trenzas" as a Metaphorical-Analytical Tool in Qualitative Research

    Science.gov (United States)

    Quiñones, Sandra

    2016-01-01

    Metaphors can be used in qualitative research to illuminate the meanings of participant experiences and examine phenomena from insightful and creative perspectives. The purpose of this paper is to illustrate how I utilized "trenzas" (braids) as a metaphorical and analytical tool for understanding the experiences and perspectives of…

  14. Analytical Investigation and Comparison on Performance of SS316, SS440C and Titanium Alloy Tool Materials Used as Single Point Cutting Tool

    Directory of Open Access Journals (Sweden)

    Mr. Amaresh Kumar Dhadange

    2015-08-01

    Full Text Available Theoretical analysis of the performance of SS316, SS440C and Titanium Alloy used as cutting tools is presented in this paper. Tool temperature, tool wear and tool life are investigated analytically. These theoretical values are compared with the experimental studies conducted by the author. The values obtained from the experimental studies are comparable with the analytical values; the deviation between theoretical and experimental values is of the order of 15%.
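One common analytical basis for tool-life predictions of the kind compared above is Taylor's tool life equation, V * T**n = C. The constants and the deviation calculation below are illustrative sketches, not values from the paper.

```python
def tool_life(v, n, c):
    """Taylor's tool life equation V * T**n = C, solved for
    tool life T (min) at cutting speed v (m/min)."""
    return (c / v) ** (1.0 / n)

def percent_deviation(analytical, experimental):
    """Deviation of the analytical prediction relative to experiment."""
    return abs(analytical - experimental) / experimental * 100.0

# Illustrative constants for a hypothetical tool/workpiece pair.
t = tool_life(v=120.0, n=0.25, c=350.0)
print(round(t, 1))                          # predicted life in minutes
print(round(percent_deviation(t, 85.0), 1)) # gap vs. a measured life
```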

  15. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    Science.gov (United States)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

    Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis; develop, test, and execute models; and create visualizations. The `recordr` library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software

  16. Observation Tools for Professional Development

    Science.gov (United States)

    Malu, Kathleen F.

    2015-01-01

    Professional development of teachers, including English language teachers, empowers them to change in ways that improve teaching and learning (Gall and Acheson 2011; Murray 2010). In their seminal research on staff development--professional development in today's terms--Joyce and Showers (2002) identify key factors that promote teacher change.…

  17. Topological data analysis: A promising big data exploration tool in biology, analytical chemistry and physical chemistry.

    Science.gov (United States)

    Offroy, Marc; Duponchel, Ludovic

    2016-03-01

    An important feature of experimental science is that data of various kinds are being produced at an unprecedented rate. This is mainly due to the development of new instrumental concepts and experimental methodologies. It is also clear that the nature of acquired data is significantly different. Indeed, in every area of science, data take the form of ever-bigger tables, where all but a few of the columns (i.e. variables) turn out to be irrelevant to the questions of interest, and furthermore we do not necessarily know which coordinates are the interesting ones. Big data in our labs of biology, analytical chemistry or physical chemistry is a future that might be closer than any of us suppose. It is in this sense that new tools have to be developed in order to explore and exploit such data sets. Topological data analysis (TDA) is one of these. It was developed recently by topologists who discovered that topological concepts could be useful for data analysis. The main objective of this paper is to answer the question of why topology is well suited for the analysis of big data sets in many areas, and even more efficient than conventional data analysis methods. Raman analysis of single bacteria provides a good opportunity to demonstrate the potential of TDA for the exploration of various spectroscopic data sets under different experimental conditions (with high noise level, with/without spectral preprocessing, with wavelength shift, with different spectral resolution, with missing data). PMID:26873463

  18. Measuring the bright side of being blue: a new tool for assessing analytical rumination in depression.

    Directory of Open Access Journals (Sweden)

    Skye P Barbic

    Full Text Available BACKGROUND: Diagnosis and management of depression occurs frequently in the primary care setting. Current diagnostic and treatment management practices across clinical populations focus on eliminating signs and symptoms of depression. However, there is debate that some interventions may pathologize normal, adaptive responses to stressors. Analytical rumination (AR) is an example of an adaptive response of depression that is characterized by enhanced cognitive function to help an individual focus on, analyze, and solve problems. To date, research on AR has been hampered by the lack of theoretically derived and psychometrically sound instruments. This study developed and tested a clinically meaningful measure of AR. METHODS: Using expert panels and an extensive literature review, we developed a conceptual framework for AR and 22 candidate items. Items were field tested with 579 young adults, 140 of whom completed the items at a second time point. We used Rasch measurement methods to construct and test the item set, and traditional psychometric analyses to compare items to existing rating scales. RESULTS: Data were of high quality (0.81; evidence for divergent validity). Evidence of misfit for 2 items suggested that a 20-item scale with 4-point response categories best captured the concept of AR, fitting the Rasch model (χ2 = 95.26, df = 76, p = 0.07), with high reliability (rp = 0.86), ordered response scale structure, and no item bias (gender, age, time). CONCLUSION: Our study provides evidence for a 20-item Analytical Rumination Questionnaire (ARQ) that can be used to quantify AR in adults who experience symptoms of depression. The ARQ is psychometrically robust and a clinically useful tool for the assessment and improvement of depression in the primary care setting. Future work is needed to establish the validity of this measure in people with major depression.

  19. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data

    Science.gov (United States)

    Ben-Ari Fuchs, Shani; Lieder, Iris; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-01-01

    Abstract Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from “data-to-knowledge-to-innovation,” a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®—the human gene database; the MalaCards—the human diseases database; and the PathCards—the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®—the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling advanced matching algorithm for gene–tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell “cards” in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics

  20. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    Science.gov (United States)

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®--the human gene database; the MalaCards-the human diseases database; and the PathCards--the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®--the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics

  1. Recent Advances in Algal Genetic Tool Development

    Energy Technology Data Exchange (ETDEWEB)

    Dahlin, Lukas R.; Guarnieri, Michael T.

    2016-06-24

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  2. Interactive Tools to Track Child Development

    Science.gov (United States)

    ... Watch for ... Do if I Suspect a Problem with My Child's Development? Talk with your child's doctor. If you or ...

  3. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Monte Carlo Codes Group

    2016-06-17

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe 2 new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.

  4. 40 CFR 766.16 - Developing the analytical test method.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment, 2010-07-01: Developing the analytical test method. 766.16 Section 766.16 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC... analytical test method. Because of the matrix differences of the chemicals listed for testing, no one...

  5. Development of a machine tool selection system using AHP

    OpenAIRE

    Çimren, Emrah; Çatay, Bülent; Budak, Erhan

    2007-01-01

    The selection of appropriate machines is one of the most critical decisions in the design and development of an efficient production environment. In this study, we propose a decision support system for machine tool selection using an effective algorithm, the analytic hierarchy process. In the selection process, we first consider qualitative decision criteria that are related to the machine properties. Reliability and precision analyses may be included in the detailed evaluation procedure. Fur...

  6. 105-KE Basin isolation barrier leak rate test analytical development

    International Nuclear Information System (INIS)

    This report provides analytical developments in support of the proposed leak rate test of the 105-KE Basin. The analytical basis upon which the K-Basin leak test results will be used to determine the basin leakage rates is developed in this report. The leakage of the K-Basin isolation barriers under accident conditions will be determined from the test results. There are two fundamental flow regimes that may exist in the postulated K-Basin leakage: viscous laminar and turbulent flow. An analytical development is presented for each flow regime. The basic geometry and nomenclature of the postulated leak paths are defined.
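The laminar/turbulent distinction developed in the report is conventionally made with the Reynolds number of the leak-path flow. A minimal sketch, using illustrative water-like fluid properties and leak-path dimensions rather than the report's values:

```python
def reynolds(rho, v, d_h, mu):
    """Re = rho * v * D_h / mu for hydraulic diameter D_h (SI units)."""
    return rho * v * d_h / mu

def regime(re, laminar_limit=2300.0):
    """Classify the flow regime by a conventional transition threshold."""
    return "laminar" if re < laminar_limit else "turbulent"

# Water-like fluid through a 1 mm leak gap at two velocities.
re_slow = reynolds(rho=1000.0, v=0.05, d_h=1e-3, mu=1e-3)  # Re = 50
re_fast = reynolds(rho=1000.0, v=5.0, d_h=1e-3, mu=1e-3)   # Re = 5000
print(regime(re_slow), regime(re_fast))  # laminar turbulent
```

Each regime then gets its own pressure-drop vs. flow-rate relation, which is the analytical development the report presents.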

  7. Software Development Management: Empirical and Analytical Perspectives

    Science.gov (United States)

    Kang, Keumseok

    2011-01-01

    Managing software development is a very complex activity because it must deal with people, organizations, technologies, and business processes. My dissertation consists of three studies that examine software development management from various perspectives. The first study empirically investigates the impacts of prior experience with similar…

  8. Evaluation of angiotensin II receptor blockers for drug formulary using objective scoring analytical tool

    Directory of Open Access Journals (Sweden)

    Lim TM

    2012-09-01

    Full Text Available Drug selection methods with scores have been developed and used worldwide for formulary purposes. These tools focus on the way in which products are differentiated from each other within the same therapeutic class. The Scoring Analytical Tool (SAT) is designed on the same scoring principle and can assist formulary committee members in evaluating drugs for addition or deletion in a more structured, consistent and reproducible manner. Objective: To develop an objective SAT to facilitate evaluation of drug selection for formulary listing purposes. Methods: A cross-sectional survey was carried out. The proposed SAT was developed to evaluate the drugs according to pre-set criteria and sub-criteria that were matched to the diseases concerned, and scores were then assigned based on their relative importance. The main criteria under consideration were safety, quality, cost and efficacy. All these were converted to questionnaire format. Data and information were collected through self-administered questionnaires that were distributed to medical doctors and specialists from established public hospitals. A convenience sample of 167 doctors (specialists and non-specialists) who prescribed ARB antihypertensive drugs to patients was taken from various disciplines in the outpatient clinics, such as the Medical, Nephrology and Cardiology units. They were given a duration of 4 weeks to answer the questionnaires at their convenience. One-way ANOVA, Kruskal-Wallis and post hoc comparison tests were carried out at alpha level 0.05. Results: Statistical analysis showed that the descending order of ARB preference was Telmisartan or Irbesartan or Losartan, Valsartan or Candesartan, Olmesartan and lastly Eprosartan. The most cost-saving ARB for hypertension in public hospitals was Irbesartan. Conclusion: SAT is a tool which can be used to reduce the number of drugs and retain the most therapeutically appropriate drugs in the formulary, to determine most
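The weighted-criteria evaluation that SAT formalizes can be sketched as a weighted sum over the abstract's four main criteria. The weights and drug ratings below are invented placeholders, not the study's data.

```python
# Illustrative criterion weights (must sum to 1) and hypothetical
# 1-10 panel ratings for two anonymous drugs.
weights = {"safety": 0.3, "quality": 0.2, "cost": 0.2, "efficacy": 0.3}

ratings = {
    "drug_x": {"safety": 8, "quality": 7, "cost": 5, "efficacy": 9},
    "drug_y": {"safety": 7, "quality": 8, "cost": 8, "efficacy": 7},
}

def total_score(r):
    """Weighted sum of a drug's criterion ratings."""
    return sum(weights[c] * r[c] for c in weights)

ranked = sorted(ratings, key=lambda d: total_score(ratings[d]), reverse=True)
print({d: round(total_score(ratings[d]), 2) for d in ranked})
```

A formulary committee would keep the top-ranked drugs and consider the rest for deletion, which is exactly the add/delete decision the abstract describes.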

  9. Technology Roadmaps: Tools for Development

    OpenAIRE

    Anthony Clayton

    2008-01-01

    The paper opens a series of two publications devoted to technology roadmapping. This technique allows revealing and relating threats, risks, priorities and opportunities of development for different technologies, and thus making better decisions. Different factors influencing roadmapping are considered, such as base technologies and possible alternatives, potential gaps and risks, competitiveness, etc. Special attention is paid to emerging technologies having great potential, when expecte...

  10. A heterogeneous analytical benchmark for particle transport methods development

    International Nuclear Information System (INIS)

    A heterogeneous analytical benchmark has been designed to provide a quality control measure for large-scale neutral particle computational software. Assurance that particle transport methods are efficiently implemented and that current codes are adequately maintained for reactor and weapons applications is a major task facing today's transport code developers. An analytical benchmark, as used here, refers to a highly accurate evaluation of an analytical solution to the neutral particle transport equation. Because of the requirement of an analytical solution, however, only relatively limited transport scenarios can be treated. To some this may seem to be a major disadvantage of analytical benchmarks. However, to the code developer, simplicity by no means diminishes the usefulness of these benchmarks since comprehensive transport codes must perform adequately for simple as well as comprehensive transport scenarios
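The quality-control idea described above, checking a numerical result against an exact analytical solution, can be illustrated with a deliberately simple transport-like case (uncollided flux attenuation). This is a generic sketch, not one of the benchmark's actual configurations.

```python
import math

sigma_t = 1.0  # total macroscopic cross section (1/cm), illustrative

def analytic(x):
    """Exact uncollided flux: phi(x) = exp(-sigma_t * x)."""
    return math.exp(-sigma_t * x)

def numeric(x, steps=10000):
    """Crude first-order stepping of d(phi)/dx = -sigma_t * phi,
    standing in for a production code's discretized solution."""
    phi, dx = 1.0, x / steps
    for _ in range(steps):
        phi -= sigma_t * phi * dx
    return phi

x = 3.0
rel_err = abs(numeric(x) - analytic(x)) / analytic(x)
print(f"{rel_err:.2e}")  # a small relative error indicates agreement
```

Benchmarking proper does the same comparison, but against highly accurate evaluations of far less trivial analytical transport solutions.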

  11. Development of CAD implementing the algorithm of boundary elements’ numerical analytical method

    Directory of Open Access Journals (Sweden)

    Yulia V. Korniyenko

    2015-03-01

    Full Text Available Until recently, algorithms for the numerical-analytical boundary elements method had been implemented as programs written in the MATLAB environment. Each program had a local character, i.e., it was used to solve a particular problem: calculation of a beam, frame, arch, etc. Constructing matrices in these programs was carried out "manually" and was therefore time-consuming. The research was aimed at a reasoned choice of programming language for developing a new CAD system that implements the algorithm of the numerical-analytical boundary elements method and provides visualization tools for the initial objects and calculation results. The research conducted shows that, among a wide variety of programming languages, the most efficient one for developing a CAD system employing the numerical-analytical boundary elements method algorithm is Java. This language provides tools not only for developing the calculating part of the CAD system, but also for building the graphical interface for constructing geometrical models and interpreting calculated results.

  12. Information and Analytic Maintenance of Nanoindustry Development

    OpenAIRE

    Glushchenko Aleksandra Vasilyevna; Bukhantsev Yuriy Alekseevich; Khudyakova Anna Sergeevna

    2015-01-01

    The successful course of nanotechnological development in many respects depends on the volume and quality of information provided to external and internal users. The objective of the present research is to reveal the information requirements of various groups of users for effective management of the nanotech industry and to define the ways of satisfying them most effectively. The authors also aim at developing a system of indicators characterizing the current state and the dynamic parameters o...

  13. An integrated analytic tool and knowledge-based system approach to aerospace electric power system control

    Science.gov (United States)

    Owens, William R.; Henderson, Eric; Gandikota, Kapal

    1986-10-01

    Future aerospace electric power systems require new control methods because of increasing power system complexity, demands for power system management, greater system size and heightened reliability requirements. To meet these requirements, a combination of electric power system analytic tools and knowledge-based systems is proposed. The continual improvement in microelectronic performance has made it possible to envision the application of sophisticated electric power system analysis tools to aerospace vehicles. These tools have been successfully used in the measurement and control of large terrestrial electric power systems. Among these tools is state estimation which has three main benefits. The estimator builds a reliable database for the system structure and states. Security assessment and contingency evaluation also require a state estimator. Finally, the estimator will, combined with modern control theory, improve power system control and stability. Bad data detection as an adjunct to state estimation identifies defective sensors and communications channels. Validated data from the analytic tools is supplied to a number of knowledge-based systems. These systems will be responsible for the control, protection, and optimization of the electric power system.
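The state estimation step described above is, at its core, a weighted least-squares fit of system states to redundant measurements, with the residuals feeding the bad-data detection the abstract mentions. A minimal linear sketch with invented numbers:

```python
import numpy as np

# Linear measurement model z = H x + noise: three measurements of
# two states (values are illustrative, not from any real system).
H = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
z = np.array([1.02, 0.98, 2.05])             # noisy measurements
W = np.diag([1 / 0.01, 1 / 0.01, 1 / 0.04])  # inverse variances

# Weighted least squares: x_hat = (H^T W H)^-1 H^T W z
x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)

# Large residuals flag suspect sensors or channels
# (the "bad data detection" adjunct to state estimation).
residuals = z - H @ x_hat
print(x_hat.round(3), residuals.round(3))
```

A power-system estimator solves a nonlinear version of this iteratively, but the redundancy-plus-weighting structure is the same.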

  14. Analytics for smart energy management tools and applications for sustainable manufacturing

    CERN Document Server

    Oh, Seog-Chan

    2016-01-01

    This book introduces the issues and problems that arise when implementing smart energy management for sustainable manufacturing in the automotive manufacturing industry and the analytical tools and applications to deal with them. It uses a number of illustrative examples to explain energy management in automotive manufacturing, which involves most types of manufacturing technology and various levels of energy consumption. It demonstrates how analytical tools can help improve energy management processes, including forecasting, consumption, and performance analysis, emerging new technology identification as well as investment decisions for establishing smart energy consumption practices. It also details practical energy management systems, making it a valuable resource for professionals involved in real energy management processes, and allowing readers to implement the procedures and applications presented.

  15. Online Analytical Processing (OLAP): A Fast and Effective Data Mining Tool for Gene Expression Databases

    OpenAIRE

    Nadim W. Alkharouf; D. Curtis Jamison; Benjamin F. Matthews

    2005-01-01

    Gene expression databases contain a wealth of information, but current data mining tools are limited in their speed and effectiveness in extracting meaningful biological knowledge from them. Online analytical processing (OLAP) can be used as a supplement to cluster analysis for fast and effective data mining of gene expression databases. We used Analysis Services 2000, a product that ships with SQLServer2000, to construct an OLAP cube that was used to mine a time series experiment designed to...
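    The core OLAP idea the abstract relies on, pre-aggregating a measure over every combination of dimensions so that slices and roll-ups become simple lookups, can be sketched in a few lines. The gene/timepoint/replicate facts below are invented for illustration; the abstract's actual cube was built in Analysis Services 2000.

```python
from collections import defaultdict
from itertools import product

# Hypothetical expression facts: (gene, timepoint, replicate, value)
facts = [
    ("geneA", 0, 1, 2.0), ("geneA", 0, 2, 2.2),
    ("geneA", 6, 1, 4.0), ("geneA", 6, 2, 3.8),
    ("geneB", 0, 1, 1.0), ("geneB", 6, 1, 5.0),
]

def build_cube(facts):
    """Pre-aggregate sums for every combination of kept/rolled-up dims;
    the sentinel "ALL" marks a dimension aggregated away."""
    cube = defaultdict(float)
    for gene, tp, rep, value in facts:
        for g, t, r in product((gene, "ALL"), (tp, "ALL"), (rep, "ALL")):
            cube[(g, t, r)] += value
    return cube

cube = build_cube(facts)
# slice: total expression of geneA at timepoint 6, across replicates
slice_a6 = cube[("geneA", 6, "ALL")]
# roll-up: grand total over every dimension
grand_total = cube[("ALL", "ALL", "ALL")]
```

    Real OLAP engines add hierarchies, sparse storage, and incremental refresh on top of this pattern, but the query model is the same: aggregation work is paid once at build time so analytical queries answer by lookup.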

  16. User Research and Game Analytics as a Combined Tool of Business Intelligence in Mobile Game Industry

    OpenAIRE

    Büyükcan, Elif

    2014-01-01

    User studies are an important source of information for companies, helping them understand their customers better and reach them more efficiently. Game analytics, on the other hand, provide a good understanding of user behavior by drawing on in-game statistics. Since the two tools build strong but different knowledge bases, game companies can benefit from combining these related information sources. Benefits which can be achieved from this cover gaining bu...

  17. The 'secureplan' bomb utility: A PC-based analytic tool for bomb defense

    International Nuclear Information System (INIS)

    This paper illustrates a recently developed, PC-based software system for simulating the effects of an infinite variety of hypothetical bomb blasts on structures and personnel in the immediate vicinity of such blasts. The system incorporates two basic rectangular geometries in which blast assessments can be made - an external configuration (highly vented) and an internal configuration (vented and unvented). A variety of explosives can be used - each is translated to an equivalent TNT weight. Provisions in the program account for bomb cases (person, satchel, case and vehicle), mixes of explosives and shrapnel aggregates, and detonation altitudes. The software permits architects, engineers, security personnel and facility managers, without specific knowledge of explosives, to incorporate realistic construction hardening, screening programs, barriers and stand-off provisions in the design and/or operation of diverse facilities. System outputs - generally represented as peak incident or reflected overpressure or impulses - are both graphic and analytic and integrate damage threshold data for common construction materials including window glazing. The effects of bomb blasts on humans are estimated in terms of temporary and permanent hearing damage, lung damage (lethality) and whole body translation injury. The software system has been used in the field in providing bomb defense services to a number of commercial clients since July 1986. In addition to the design of siting, screening and hardening components of bomb defense programs, the software has proven very useful in post-incident analysis and repair scenarios and as a teaching tool for bomb defense training.
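    Two textbook relations underlie tools of this kind: translation of a charge to an equivalent TNT weight, and the Hopkinson-Cranz scaled distance against which overpressure curves are tabulated. The sketch below uses illustrative round-number equivalence factors, not the product's internal tables.

```python
import math

# Assumed TNT-equivalence factors (illustrative values, not the
# software's data): mass multiplier relative to TNT blast energy.
TNT_FACTORS = {"TNT": 1.00, "C4": 1.34, "ANFO": 0.82}

def tnt_equivalent(weight_kg, explosive):
    """Mass of TNT producing a comparable blast."""
    return weight_kg * TNT_FACTORS[explosive]

def scaled_distance(standoff_m, tnt_kg):
    """Hopkinson-Cranz scaling Z = R / W^(1/3); peak incident
    overpressure is commonly tabulated as a function of Z."""
    return standoff_m / tnt_kg ** (1.0 / 3.0)

w = tnt_equivalent(10.0, "C4")   # ~13.4 kg TNT equivalent
z = scaled_distance(20.0, w)     # scaled distance in m / kg^(1/3)
```

    A full assessment tool then maps Z onto empirical overpressure and impulse curves and applies venting and reflection corrections for the chosen geometry.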

  18. Fourier transform ion cyclotron resonance mass spectrometry: the analytical tool for heavy oil and bitumen characterization

    Energy Technology Data Exchange (ETDEWEB)

    Oldenburg, Thomas B.P; Brown, Melisa; Hsieh, Ben; Larter, Steve [Petroleum Reservoir Group (prg), Department of Geoscience, University of Calgary, Alberta (Canada)

    2011-07-01

    Fourier transform ion cyclotron resonance mass spectrometry (FTICRMS), developed in the 1970s by Marshall and Comisarow at the University of British Columbia, has become a commercially available tool capable of analyzing several hundred thousand components in a petroleum mixture at once. This analytical technology will probably usher in a dramatic revolution in geochemical capability, equal to or greater than the molecular revolution that occurred when GCMS technologies became cheaply available. The molecular resolution and information content given by FTICRMS petroleum analysis can be compared to the information in the human genome. With current GCMS-based petroleum geochemical protocols perhaps a few hundred components can be quantitatively determined, but with FTICRMS, 1000 times this number of components could possibly be resolved. However, fluid and other properties depend on the interactions of this multitude of hydrocarbon and non-hydrocarbon components, not on the components themselves, and access to the full potential of this new petroleomics will depend on the definition of this interactome.

  19. An analytic solution to LO coupled DGLAP evolution equations: a new pQCD tool

    CERN Document Server

    Block, Martin M; Ha, Phuoc; McKay, Douglas W

    2010-01-01

    We have analytically solved the LO pQCD singlet DGLAP equations using Laplace transform techniques. Newly-developed highly accurate numerical inverse Laplace transform algorithms allow us to write fully decoupled solutions for the singlet structure function F_s(x,Q^2) and the gluon distribution G(x,Q^2) as F_s(x,Q^2)={\cal F}_s(F_{s0}(x), G_0(x)) and G(x,Q^2)={\cal G}(F_{s0}(x), G_0(x)). Here {\cal F}_s and {\cal G} are known functions of the initial boundary conditions F_{s0}(x) = F_s(x,Q_0^2) and G_{0}(x) = G(x,Q_0^2), i.e., the chosen starting functions at the virtuality Q_0^2. For both G and F_s, we are able to either devolve or evolve each separately and rapidly, with very high numerical accuracy, a computational fractional precision of O(10^{-9}). Armed with this powerful new tool in the pQCD arsenal, we compare our numerical results from the above equations with the published MSTW2008 and CTEQ6L LO gluon and singlet F_s distributions, starting from their initial values at Q_0^2=1 GeV^2 and 1.69 GeV^2, respectively, using their ...
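    Schematically, the method rests on the fact that a Laplace transform in v = ln(1/x) turns the DGLAP convolutions into products, so the coupled integro-differential equations become a linear 2x2 system in Laplace space. The following is a paraphrase of that structure, not the paper's exact formulas:

```latex
% Laplace space in s (conjugate to v = \ln(1/x)); \tau is an
% \alpha_s-weighted evolution variable in Q^2.
\frac{\partial}{\partial\tau}
\begin{pmatrix} f(s,\tau) \\ g(s,\tau) \end{pmatrix}
= M(s)\,
\begin{pmatrix} f(s,\tau) \\ g(s,\tau) \end{pmatrix},
\qquad
\begin{pmatrix} f \\ g \end{pmatrix}(s,\tau)
= e^{\tau M(s)}
\begin{pmatrix} f_0(s) \\ g_0(s) \end{pmatrix}.
```

    Numerically inverting the Laplace transform of this matrix-exponential solution yields the decoupled x-space solutions F_s(x,Q^2) and G(x,Q^2) quoted in the abstract; devolution corresponds to negative tau.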

  20. Hybrid instrument development for an analytical laboratory

    International Nuclear Information System (INIS)

    The authors have been developing a hybrid densitometer for general laboratory application. This type of densitometer can be applied to concentration determinations of thorium, uranium, neptunium, plutonium, and americium. It can also be used to determine the ratios of any combination of these nuclear materials. This report describes the hardware and analysis approach, along with some laboratory tests performed with the densitometer and actual in-plant application results.

  1. Measuring Tools for Quantifying Sustainable Development

    OpenAIRE

    Annette Evans; Vladimir Strezov; Tim Evans

    2015-01-01

    This work reviews the tools and methods used for quantifying sustainable development. The paper first reviews categorization of the tools based on weak and strong sustainability. It then provides a critical review of the UN review of sustainability indicators and the methods for calculating the indicators, which include the environmental footprint, capital approach to measuring sustainable development, green national net product, genuine savings, genuine progress indicator, indicator of sustain...

  2. Developing a Parametric Urban Design Tool

    DEFF Research Database (Denmark)

    Steinø, Nicolai; Obeling, Esben

    2014-01-01

    Parametric urban design is a potentially powerful tool for collaborative urban design processes. Rather than making one-off designs which need to be redesigned from the ground up in case of changes, parametric design tools make it possible to keep the design open while at the same time allowing for a...... logic which is flexible and expandable. It then moves on to outline and discuss further development work. Finally, it offers a brief reflection on the potentials and shortcomings of the software – CityEngine – which is used for developing the parametric urban design tool....

  3. Development of health-related analytical techniques

    International Nuclear Information System (INIS)

    As part of the programme on Nuclear Methods for Health-Related Monitoring of Trace Element Pollutants in Man, initiated by the I.A.E.A. in 1978, our laboratory (L.A.R.N.) has developed and optimized sample preparation techniques for hair analysis without dissolution, preconcentration or chemical separation, and a non-vacuum PIXE technique for the measurement of biological samples such as liquids. The results obtained at L.A.R.N. have been compared with results obtained by other techniques, and it is found that PIXE can be used reliably to analyse a large number of biological samples in a short time. (author)

  4. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    Science.gov (United States)

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons in terms of using these tools with LCCs. Decision analytic tools tend to help shift focus away from research oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable in improving decision making. But perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply ranking research priorities.

  5. Recent analytical developments for powder characterization

    Science.gov (United States)

    Brackx, E.; Pages, S.; Dugne, O.; Podor, R.

    2015-07-01

    Powders and divided solid materials are widely represented as finished or intermediary products in industries as varied as foodstuffs, cosmetics, construction, pharmaceuticals, electronics, and energy. Their optimal use requires mastery of the transformation process, based on knowledge of the different phenomena concerned (sintering, chemical reactivity, purity, etc.). Modelling and understanding these phenomena require the prior acquisition of data sets and characteristics that are more or less challenging to obtain. The goal of this study is to present the use of different physico-chemical characterization techniques adapted to uranium-containing powders analyzed either in a raw state or after a specific preparation (ionic polishing). The new developments concern dimensional characterization techniques for grains and pores by image analysis, chemical surface characterization, and powder chemical reactivity characterization. The examples discussed are from fabrication process materials used in the nuclear fuel cycle.

  6. Development of remote handling tools and equipment

    International Nuclear Information System (INIS)

    The development of remote handling (RH) tools and equipment in ITER focuses mainly on welding and cutting techniques, weld inspection, and the double-seal door, which are essential for the replacement of in-vessel components such as the divertor and blanket. The conceptual design of these RH tools and equipment has been defined through the ITER engineering design activity (EDA). Similarly, elementary R and D on the RH tools and equipment has been extensively performed to accumulate a technological data base for process and performance qualification. Based on this data, fabrication of full-scale RH tools and equipment is in progress. A prototypical bore tool for pipe welding and cutting has already been fabricated and is currently undergoing integrated performance tests. This paper describes the design outline of the RH tools and equipment related to in-vessel components maintenance, and highlights the current status of RH tools and equipment development by the Japan Home Team as an ITER R and D program. This paper also includes an outline of insulation joint and quick-pipe connector development, which has also been conducted through the ITER R and D program in order to standardize RH operations and components. (author)

  7. Remote tool development for nuclear dismantling operations

    International Nuclear Information System (INIS)

    Remote tool systems to undertake nuclear dismantling operations require careful design and development, not only to perform their given duty but to perform it safely within the constraints imposed by harsh environmental conditions. Framatome ANP NUCLEAR SERVICES has long developed and qualified equipment to undertake specific maintenance operations in nuclear reactors. The tool development methodology from this activity has since been adapted to resolve some very challenging reactor dismantling operations, which are demonstrated in this paper. Each nuclear decommissioning project is a unique case, and technical characterisation data are generally incomplete. The development of the dismantling methodology and associated equipment is by and large an iterative process combining design and simulation with feasibility and validation testing. The first stage of the development process involves feasibility testing of industrial tools and examining the adaptations necessary to control and deploy the tool remotely with respect to the chosen methodology and environmental constraints. This results in a prototype tool and deployment system to validate the basic process. The second stage involves detailed design, which integrates any remaining technical and environmental constraints. At the end of this stage, tools and deployment systems, operators and operating procedures are qualified on full-scale mock-ups. (authors)

  8. Analytical tools for the analysis of fire debris. A review: 2008-2015.

    Science.gov (United States)

    Martín-Alberca, Carlos; Ortega-Ojeda, Fernando Ernesto; García-Ruiz, Carmen

    2016-07-20

    The analysis of fire debris evidence might offer crucial information to a forensic investigation when, for instance, there is suspicion of the intentional use of ignitable liquids to initiate a fire. Although the evidence analysis in the laboratory is mainly conducted by a handful of well-established methodologies, during the last eight years several authors have proposed noteworthy improvements to these methodologies, suggesting interesting new approaches. This review critically outlines the most up-to-date and suitable tools for the analysis and interpretation of fire debris evidence. The survey of analytical tools covers works published in the 2008-2015 period. It includes sources of consensus-classified reference samples, current standard procedures, new proposals for sample extraction and analysis, and the most novel statistical tools. In addition, this review provides relevant knowledge on the distortion effects on the ignitable liquid chemical fingerprints, which have to be considered during interpretation of results. PMID:27251852

  9. ICT Tools and Students' Competence Development

    Science.gov (United States)

    Fuglestad, Anne Berit

    2004-01-01

    In this paper I will present the rationale that motivates the study in an ongoing three-year project following students in school years 8 to 10. The aim is to develop the students' competence with use of ICT tools in mathematics in such a way that they will be able to choose tools for themselves, not rely just on the teacher telling them what to…

  10. Development of a transportation planning tool

    International Nuclear Information System (INIS)

    This paper describes the application of simulation modeling and logistics techniques to the development of a planning tool for the Department of Energy (DOE). The focus of the Transportation Planning Model (TPM) tool is to aid DOE and Sandia analysts in the planning of future fleet sizes, driver and support personnel sizes, base site locations, and resource balancing among the base sites. The design approach is to develop a rapid modeling environment which will allow analysts to easily set up a shipment scenario and perform multiple "what if" evaluations. The TPM is being developed on personal computers using commercial off-the-shelf (COTS) software tools under the Windows® operating environment. Prototype development of the TPM has been completed.

  11. The Blackbird Whistling or Just After? Vygotsky's Tool and Sign as an Analytic for Writing

    Science.gov (United States)

    Imbrenda, Jon-Philip

    2016-01-01

    Based on Vygotsky's theory of the interplay of the tool and sign functions of language, this study presents a textual analysis of a corpus of student-authored texts to illuminate aspects of development evidenced through the dialectical tension of tool and sign. Data were drawn from a series of reflective memos I authored during a seminar for new…

  12. Antifungal proteins from moulds: analytical tools and potential application to dry-ripened foods.

    Science.gov (United States)

    Delgado, Josué; Owens, Rebecca A; Doyle, Sean; Asensio, Miguel A; Núñez, Félix

    2016-08-01

    Moulds growing on the surface of dry-ripened foods contribute to their sensory qualities, but some of them are able to produce mycotoxins that pose a hazard to consumers. Small cysteine-rich antifungal proteins (AFPs) from moulds are highly stable to pH and proteolysis and exhibit a broad inhibition spectrum against filamentous fungi, providing new chances to control hazardous moulds in fermented foods. The analytical tools for characterizing the cellular targets and affected pathways are reviewed. Strategies currently employed to study these mechanisms of action include 'omics' approaches that have come to the forefront in recent years, developing in tandem with genome sequencing of relevant organisms. These techniques contribute to a better understanding of the response of moulds against AFPs, allowing the design of complementary strategies to maximize or overcome the limitations of using AFPs on foods. AFPs alter chitin biosynthesis, and some fungi react inducing cell wall integrity (CWI) pathway. However, moulds able to increase chitin content at the cell wall by increasing proteins in either CWI or calmodulin-calcineurin signalling pathways will resist AFPs. Similarly, AFPs increase the intracellular levels of reactive oxygen species (ROS), and moulds increasing G-protein complex β subunit CpcB and/or enzymes to efficiently produce glutathione may evade apoptosis. Unknown aspects that need to be addressed include the interaction with mycotoxin production by less sensitive toxigenic moulds. However, significant steps have been taken to encourage the use of AFPs in intermediate-moisture foods, particularly for mould-ripened cheese and meat products. PMID:27394712

  13. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    International Nuclear Information System (INIS)

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in

  14. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    Energy Technology Data Exchange (ETDEWEB)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z. [and others

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  15. The Information Needs of the Developing Countries: Analytical Case Studies.

    Science.gov (United States)

    Salman, Lamia

    1981-01-01

    Presents the generalized conclusions from analytical case studies undertaken by UNESCO and the United Nations Interim Fund for Science and Technology for Development (IFSTD) on the needs and options for access to scientific and technical information in eight developing countries. (Author/JL)

  16. Windows Developer Power Tools Turbocharge Windows development with more than 170 free and open source tools

    CERN Document Server

    Avery, James

    2007-01-01

    Software developers need to work harder and harder to bring value to their development process in order to build high-quality applications and remain competitive. Developers can accomplish this by improving their productivity, quickly solving problems, and writing better code. A wealth of open source and free software tools are available for developers who want to improve the way they create, build, deploy, and use software. Tools, components, and frameworks exist to help developers at every point in the development process. Windows Developer Power Tools offers an encyclopedic guide to

  17. Solar Data and Tools: Resources for Researchers, Industry, and Developers

    Energy Technology Data Exchange (ETDEWEB)

    2016-04-01

    In partnership with the U.S. Department of Energy SunShot Initiative, the National Renewable Energy Laboratory (NREL) has created a suite of analytical tools and data that can inform decisions about implementing solar and that are increasingly forming the basis of private-sector tools and services to solar consumers. The following solar energy data sets and analytical tools are available free to the public.

  18. Statistics of extremes as an analytical tool in continuum γ-decay studies

    International Nuclear Information System (INIS)

    We apply the tools of the statistics of extremes to the study of the nuclear continuum. We found an analytical function that correctly fits the energy-ordered spectrum of gamma rays from decay in the continuum. The results suggest a way to extract experimentally physical parameters, such as temperature and level density, which describe the nucleus at high excitation energy and high spin. We also show the possibility of distinguishing between different formalisms of level density and gamma strength. (Author)
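    As an illustration of the underlying idea (not the paper's actual fit function), maxima drawn from many exponentially distributed gamma-ray energies follow a Gumbel extreme-value law, and a moment-based fit of that law recovers the temperature-like scale of the parent distribution. The exponential parent and the parameter values are assumptions made for this sketch.

```python
import math
import random
import statistics

random.seed(1)
T = 0.8                        # assumed temperature-like scale (MeV)
block, n_blocks = 200, 3000    # energies per spectrum, number of spectra

# maximum energy in each simulated "spectrum" of exponential variates
maxima = [max(random.expovariate(1.0 / T) for _ in range(block))
          for _ in range(n_blocks)]

# Method-of-moments Gumbel estimators:
#   scale  beta = s * sqrt(6) / pi,  location mu = mean - gamma * beta
EULER_GAMMA = 0.5772156649015329
beta = statistics.stdev(maxima) * math.sqrt(6.0) / math.pi
mu = statistics.mean(maxima) - EULER_GAMMA * beta
# For an exponential parent, the fitted Gumbel scale recovers T,
# and the location grows like T * ln(block).
```

    This is the sense in which an extreme-value fit to an energy-ordered spectrum can expose physical parameters such as temperature: they appear directly as the location and scale of the limiting distribution.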

  19. Pulsatile microfluidics as an analytical tool for determining the dynamic characteristics of microfluidic systems

    DEFF Research Database (Denmark)

    Vedel, Søren; Olesen, Laurits Højgaard; Bruus, Henrik

    2010-01-01

    An understanding of all fluid dynamic time scales is needed to fully understand and hence exploit the capabilities of fluid flow in microfluidic systems. We propose the use of harmonically oscillating microfluidics as an analytical tool for the deduction of these time scales. Furthermore, we......-filled interconnected elastic microfluidic tubes containing a large, trapped air bubble and driven by a pulsatile pressure difference. We demonstrate good agreement between the system-level model and the experimental results, allowing us to determine the dynamic time scales of the system. However, the generic analysis...... can be applied to all microfluidic systems, both ac and dc....
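    The kind of time scale such an analysis extracts can be estimated with a lumped-element hydraulic RC model: the Poiseuille resistance of the liquid-filled tube times the compliance of the trapped air bubble. All geometry and fluid values below are illustrative assumptions, not the paper's system.

```python
import math

# Lumped hydraulic RC sketch: tau = R_hyd * C_hyd (illustrative values)
mu = 1.0e-3            # water viscosity (Pa s)
L = 0.05               # tube length (m)
a = 100e-6             # tube radius (m)
R_hyd = 8 * mu * L / (math.pi * a**4)   # Poiseuille flow resistance

V = 1.0e-9             # trapped air bubble volume (m^3)
p0 = 1.0e5             # ambient pressure (Pa)
C_hyd = V / p0         # isothermal ideal-gas compliance, dV/dp = V/p0

tau = R_hyd * C_hyd    # relaxation time scale of the system (s)
```

    Driving such a system with a pulsatile pressure and locating the roll-off frequency is, in effect, measuring tau; with these illustrative numbers tau is on the order of ten milliseconds.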

  20. Measuring Tools for Quantifying Sustainable Development

    Directory of Open Access Journals (Sweden)

    Annette Evans

    2015-06-01

    This work reviews the tools and methods used for quantifying sustainable development. The paper first reviews categorization of the tools based on weak and strong sustainability. It then provides a critical review of the UN review of sustainability indicators and the methods for calculating the indicators, which include the environmental footprint, the capital approach to measuring sustainable development, green national net product, genuine savings, the genuine progress indicator, the indicator of sustainable economic welfare and the human development indicator on sustainable development. The benefits of standardizing the assessment tools for sustainable development would be seen through well-directed policy leading to a balance between economy, environment and society where none is compromised to achieve greater results in the others. However, there is still no single method of assessing the sustainability of development that is widely accepted as suitable, and all methods developed have inadequacies that prevent a true measure of sustainable development from being determined. Keywords: Sustainability indicators, environmental footprint, genuine savings, sustainable economic welfare, human development indicator.

  1. Selection of design and operational parameters in spindle-holder-tool assemblies for maximum chatter stability by using a new analytical model

    OpenAIRE

    Ertürk, Alper; Budak, Erhan; Özgüven, H. Nevzat

    2007-01-01

    In this paper, using the analytical model developed by the authors, the effects of certain system design and operational parameters on the tool point FRF, and thus on chatter stability, are studied. Important conclusions are derived regarding the selection of the system parameters at the stage of machine tool design and during a practical application in order to increase chatter stability. It is demonstrated that the stability diagram for an application can be modified in a predictable manner ...

  2. Developing Adaptive Elearning: An Authoring Tool Design

    Directory of Open Access Journals (Sweden)

    Said Talhi

    2011-09-01

    Adaptive hypermedia is the answer to the lost-in-hyperspace syndrome, where the user normally has too many links to choose from and little knowledge about how to proceed and select the most appropriate ones. Adaptive hypermedia thus offers a selection of links or content most appropriate to the user. Until very recently, little attention has been given to the complex task of authoring materials for Adaptive Educational Hypermedia. An author faces a multitude of problems when creating a personalized, rich learning experience for each user. The purpose of this paper is to present an authoring tool for adaptive hypermedia based courses. Designed to satisfy the accessibility guidelines of the W3C recommendation for authors and learners with disabilities, the authoring tool allows several geographically dispersed authors to produce such courses together. It consists of a shared workspace gathering all the tools necessary for the cooperative development task.

  3. H1640 caster tool development report

    Energy Technology Data Exchange (ETDEWEB)

    Brown, L.A.

    1997-12-01

    This report describes the development and certification of the H1640 caster tool. This tool is used to rotate swivel caster wheels 90 degrees on bomb hand trucks or shipping containers. The B83 is a heavy bomb system and weighs close to 5,600 pounds for a two-high stack configuration. High castering moments (handle length times the force exerted on handle) are required to caster a wheel for a two-high stack of B83s. The H1640 is available to the DoD (Air Force) through the Special Equipment List (SEL) for the B83 as a replacement for the H631 and H1216 caster tools.

  4. Development of computerized risk management tool

    International Nuclear Information System (INIS)

    The author describes efforts toward the development of a computerized risk management tool: (1) development of a risk monitor, Risk Monster, (2) improvement of McFarm (Missing Cutsets Finding Algorithm for Risk Monitor) and finally (3) development of a reliability database management system, KwDBMan. Risk Monster helps plant operators and maintenance schedulers monitor plant risk and avoid high peak risk by rearranging the maintenance work schedule. The improved McFarm significantly increases the calculation speed of Risk Monster for cases of supporting system OOS (Out Of Service). KwDBMan manages event data, generic data and CCF (Common Cause Failure) data to support Risk Monster as well as the PSA tool KIRAP (KAERI Integrated Reliability Analysis Package)

  5. Xamarin as a tool for mobile development

    OpenAIRE

    Gridin, Oleksandr

    2015-01-01

    Xamarin as a tool for mobile development was chosen as the topic for this thesis because of its fast pace of growth. The technology was founded in May 2011 and now counts more than 1.25 million developers who have already proven its worth. With the help of this technology, the mobile development process can reach a new qualitative level at which low-quality software will no longer exist. The main goals for this project were to show Xamarin's power by creating a cross-platform mobile application and to p...

  6. An analytical tool-box for comprehensive biochemical, structural and transcriptome evaluation of oral biofilms mediated by mutans streptococci.

    Science.gov (United States)

    Klein, Marlise I; Xiao, Jin; Heydorn, Arne; Koo, Hyun

    2011-01-01

    Biofilms are highly dynamic, organized and structured communities of microbial cells enmeshed in an extracellular matrix of variable density and composition (1, 2). In general, biofilms develop from initial microbial attachment on a surface followed by formation of cell clusters (or microcolonies) and further development and stabilization of the microcolonies, which occur in a complex extracellular matrix. The majority of biofilm matrices harbor exopolysaccharides (EPS), and dental biofilms are no exception; especially those associated with caries disease, which are mostly mediated by mutans streptococci (3). The EPS are synthesized by microorganisms (S. mutans, a key contributor) by means of extracellular enzymes, such as glucosyltransferases using sucrose primarily as substrate (3). Studies of biofilms formed on tooth surfaces are particularly challenging owing to their constant exposure to environmental challenges associated with complex diet-host-microbial interactions occurring in the oral cavity. Better understanding of the dynamic changes of the structural organization and composition of the matrix, physiology and transcriptome/proteome profile of biofilm-cells in response to these complex interactions would further advance the current knowledge of how oral biofilms modulate pathogenicity. Therefore, we have developed an analytical tool-box to facilitate biofilm analysis at structural, biochemical and molecular levels by combining commonly available and novel techniques with custom-made software for data analysis. Standard analytical (colorimetric assays, RT-qPCR and microarrays) and novel fluorescence techniques (for simultaneous labeling of bacteria and EPS) were integrated with specific software for data analysis to address the complex nature of oral biofilm research. The tool-box is comprised of 4 distinct but interconnected steps (Figure 1): 1) Bioassays, 2) Raw Data Input, 3) Data Processing, and 4) Data Analysis. We used our in vitro biofilm model and

  7. Development of Simulation Tool Orienting Production Engineering

    Institute of Scientific and Technical Information of China (English)

    ZHAO Ning; NING Ru-xin; TANG Cheng-tong; LIANG Fu-jun

    2006-01-01

    A simulation tool named BITSIM, oriented to production engineering, is developed in order to improve enterprise productivity and to address the scarcity of computer applications in this area. The architecture of BITSIM is presented first. Hierarchical techniques, a control strategy based on multi-agent methods, and simulation output analysis are then described in detail. Finally, an application example shows that the system can be used to analyze different hypothetical situations and to configure the auxiliary manufacturing system before production.

  8. Pulsatile microfluidics as an analytical tool for determining the dynamic characteristics of microfluidic systems

    International Nuclear Information System (INIS)

    An understanding of all fluid dynamic time scales is needed to fully understand and hence exploit the capabilities of fluid flow in microfluidic systems. We propose the use of harmonically oscillating microfluidics as an analytical tool for the deduction of these time scales. Furthermore, we suggest the use of system-level equivalent circuit theory as an adequate theory of the behavior of the system. A novel pressure source capable of operation in the desired frequency range is presented for this generic analysis. As a proof of concept, we study the fairly complex system of water-filled interconnected elastic microfluidic tubes containing a large, trapped air bubble and driven by a pulsatile pressure difference. We demonstrate good agreement between the system-level model and the experimental results, allowing us to determine the dynamic time scales of the system. However, the generic analysis can be applied to all microfluidic systems, both ac and dc.
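    The system-level equivalent-circuit view described above maps the fluid-filled tube to a hydraulic resistance and the trapped air bubble to a compliance, whose product gives the dominant RC time scale. A minimal sketch using standard lumped-element formulas; the channel dimensions, bubble volume and pressure are illustrative assumptions, not values from the paper:

```python
import math

# Lumped-element (equivalent circuit) sketch of a water-filled tube with a
# trapped air bubble. Formulas are the standard Hagen-Poiseuille resistance
# and isothermal gas compliance; all dimensions below are assumed.

def hydraulic_resistance(eta, length, radius):
    """R_hyd of a circular channel: 8*eta*L / (pi*a^4), in Pa*s/m^3."""
    return 8.0 * eta * length / (math.pi * radius**4)

def bubble_compliance(volume, pressure):
    """Isothermal compliance of a trapped gas volume: C = V/p, in m^3/Pa."""
    return volume / pressure

eta = 1.0e-3                                   # Pa*s, water viscosity
R = hydraulic_resistance(eta, length=0.1, radius=0.5e-3)
C = bubble_compliance(volume=1.0e-9, pressure=1.0e5)  # ~1 uL bubble at 1 bar

tau = R * C                        # RC time constant of the fluidic circuit
f_c = 1.0 / (2.0 * math.pi * tau)  # corner frequency probed by pulsatile driving
assert 1e-5 < tau < 1e-4           # tens of microseconds for these dimensions
```

    Sweeping the pulsatile pressure source through f_c and fitting the measured response to this model is how the dynamic time scales of the real system can be deduced.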

  9. Analytical tools for investigating strong-field QED processes in tightly focused laser fields

    CERN Document Server

    Di Piazza, A

    2015-01-01

    The present paper is the natural continuation of the letter [Phys. Rev. Lett. \\textbf{113}, 040402 (2014)], where the electron wave functions in the presence of a background electromagnetic field of general space-time structure have been constructed analytically, assuming that the initial energy of the electron is the largest dynamical energy scale in the problem and having in mind the case of a background tightly focused laser beam. Here, we determine the scalar and the spinor propagators under the same approximations, which are useful tools for calculating, e.g., total probabilities of processes occurring in such complex electromagnetic fields. In addition, we also present a simpler and more general expression of the electron wave functions found in [Phys. Rev. Lett. \\textbf{113}, 040402 (2014)] and we indicate a substitution rule to obtain them starting from the well-known Volkov wave functions in a plane-wave field.

  10. Online Analytical Processing (OLAP): A Fast and Effective Data Mining Tool for Gene Expression Databases

    Directory of Open Access Journals (Sweden)

    Alkharouf Nadim W.

    2005-01-01

    Full Text Available Gene expression databases contain a wealth of information, but current data mining tools are limited in their speed and effectiveness in extracting meaningful biological knowledge from them. Online analytical processing (OLAP can be used as a supplement to cluster analysis for fast and effective data mining of gene expression databases. We used Analysis Services 2000, a product that ships with SQLServer2000, to construct an OLAP cube that was used to mine a time series experiment designed to identify genes associated with resistance of soybean to the soybean cyst nematode, a devastating pest of soybean. The data for these experiments is stored in the soybean genomics and microarray database (SGMD. A number of candidate resistance genes and pathways were found. Compared to traditional cluster analysis of gene expression data, OLAP was more effective and faster in finding biologically meaningful information. OLAP is available from a number of vendors and can work with any relational database management system through OLE DB.

  11. Continuous wave free precession: a practical analytical tool for low-resolution nuclear magnetic resonance measurements

    International Nuclear Information System (INIS)

    The use of continuous wave free precession (CWFP) as a practical analytical tool for quantitative determinations in low-resolution nuclear magnetic resonance (LRNMR) is examined. The requirements of this technique are shown to be no more demanding than those prevailing in free-induction decay or spin-echo measurements. It is shown that the substantial gain in signal to noise ratio for a given acquisition time permitted by CWFP, can be exploited with advantage in practically any application of LRNMR. This applies not only to homogeneous low viscosity liquid samples but also to multi-component systems where differences in relaxation times of each component permit a separation of the individual contributions. As an example, the use of CWFP for fast quantitative determination of oil and moisture in various seeds is presented

  12. Development of manufacturing and application system for cutting tools made of polycrystalline diamond and cubic boron nitride

    Science.gov (United States)

    Werner, G.

    1982-12-01

    The improvement of the technology for making tools from ultrahard polycrystalline materials and the development of the pertinent application conditions are discussed. High-performance tools were developed by applying combined empirical/analytical methods. The following results were obtained: (1) development of high-performance diamond grinding wheels and establishment of optimum machining conditions for making tools from polycrystalline material; (2) development of a system of standardized and special tools from polycrystalline cutting material; (3) investigation of wear and tool life of dressing tools made from polycrystalline diamond, and development of improved dressing tools.

  13. Developments and retrospectives in Lie theory geometric and analytic methods

    CERN Document Server

    Penkov, Ivan; Wolf, Joseph

    2014-01-01

    This volume reviews and updates a prominent series of workshops in representation/Lie theory, and reflects the widespread influence of those  workshops in such areas as harmonic analysis, representation theory, differential geometry, algebraic geometry, and mathematical physics.  Many of the contributors have had leading roles in both the classical and modern developments of Lie theory and its applications. This Work, entitled Developments and Retrospectives in Lie Theory, and comprising 26 articles, is organized in two volumes: Algebraic Methods and Geometric and Analytic Methods. This is the Geometric and Analytic Methods volume. The Lie Theory Workshop series, founded by Joe Wolf and Ivan Penkov and joined shortly thereafter by Geoff Mason, has been running for over two decades. Travel to the workshops has usually been supported by the NSF, and local universities have provided hospitality. The workshop talks have been seminal in describing new perspectives in the field covering broad areas of current re...

  14. SE Requirements Development Tool User Guide

    Energy Technology Data Exchange (ETDEWEB)

    Benson, Faith Ann [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    The LANL Systems Engineering Requirements Development Tool (SERDT) is a data collection tool created in InfoPath for use with the Los Alamos National Laboratory’s (LANL) SharePoint sites. Projects can fail if the final product requirements are not clearly defined. For projects to be successful, requirements must be defined early in the project and tracked during execution to ensure the goals of the project are met. Therefore, the focus of this tool is requirements definition. The content of the form is based on International Council on Systems Engineering (INCOSE) and Department of Defense (DoD) process standards and allows for single or collaborative input. The “Scoping” section is where project information is entered by the project team prior to requirements development; it includes definitions and examples to assist the user in completing the forms. The data entered are used to define the requirements, and once the form is filled out, a “Requirements List” is automatically generated and a Word document is created and saved to a SharePoint document library. SharePoint also supports downloading the requirements data defined in the InfoPath form into an Excel spreadsheet. This User Guide will assist you in navigating the data entry process.

  15. Constructing Subjects, Producing Subjectivities: Developing Analytic Needs in Discursive Psychology

    OpenAIRE

    McAvoy, Jean

    2007-01-01

    The publication of Potter and Wetherell’s (1987) blueprint for a discursive social psychology was a pivotal moment in the discursive turn in psychology. That transformational text went on to underpin much contemporary discursive psychology; paving the way for what has become an enriching range of analytic approaches, and epistemological and ontological arguments (Wetherell, Taylor and Yates, 2001a; 2001b). Twenty years on, and as discursive psychology continues to develop, t...

  16. Process-Based Quality (PBQ) Tools Development

    Energy Technology Data Exchange (ETDEWEB)

    Cummins, J.L.

    2001-12-03

    The objective of this effort is to benchmark the development of process-based quality tools for application in CAD (computer-aided design) model-based applications. The processes of interest are design, manufacturing, and quality process applications. A study was commissioned addressing the impact, current technologies, and known problem areas in application of 3D MCAD (3-dimensional mechanical computer-aided design) models and model integrity on downstream manufacturing and quality processes. The downstream manufacturing and product quality processes are profoundly influenced and dependent on model quality and modeling process integrity. The goal is to illustrate and expedite the modeling and downstream model-based technologies for available or conceptual methods and tools to achieve maximum economic advantage and advance process-based quality concepts.

  17. Stable isotope ratio analysis: A potential analytical tool for the authentication of South African lamb meat.

    Science.gov (United States)

    Erasmus, Sara Wilhelmina; Muller, Magdalena; van der Rijst, Marieta; Hoffman, Louwrens Christiaan

    2016-02-01

    Stable isotope ratios (¹³C/¹²C and ¹⁵N/¹⁴N) of South African Dorper lambs from farms with different vegetation types were measured by isotope ratio mass spectrometry (IRMS) to evaluate the technique as a tool for the authentication of origin and feeding regime. Homogenised and defatted meat of the Longissimus lumborum (LL) muscle of lambs from seven different farms was assessed. The δ¹³C values were affected by the origin of the meat, mainly reflecting the diet. The Rûens and Free State farms had the lowest (p ⩽ 0.05) δ¹⁵N values, followed by the Northern Cape farms, with Hantam Karoo/Calvinia having the highest δ¹⁵N values. Discriminant analysis of the δ¹³C and δ¹⁵N values showed promising results for the use of IRMS as a reliable analytical tool for lamb meat authentication. The results suggest that diet, linked to origin, is an important factor to consider in region-of-origin classification for South African lamb. PMID:26304440
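    The δ values reported in this record follow the standard delta notation, δ = (R_sample / R_standard − 1) × 1000 ‰. A minimal sketch of the conversion; the VPDB ¹³C/¹²C reference ratio is the accepted value, while the sample ratio is an illustrative assumption:

```python
# Delta notation for stable isotope ratios, in per mil (ppt):
#   delta = (R_sample / R_standard - 1) * 1000
# R_VPDB is the accepted 13C/12C ratio of the Vienna Pee Dee Belemnite
# standard; the sample ratio below is illustrative, not from the paper.

R_VPDB = 0.0112372

def delta_permil(r_sample: float, r_standard: float) -> float:
    """Convert a raw isotope ratio into delta notation (per mil)."""
    return (r_sample / r_standard - 1.0) * 1000.0

# A sample slightly depleted in 13C relative to VPDB gives a negative delta,
# in the range typical of C3-plant-based diets (around -27 per mil):
d13c = delta_permil(0.0109338, R_VPDB)
assert -28.0 < d13c < -26.0
assert delta_permil(R_VPDB, R_VPDB) == 0.0  # the standard itself is 0 per mil
```

    IRMS instruments report these deltas directly; the discriminant analysis in the record then operates on the (δ¹³C, δ¹⁵N) pairs per farm.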

  18. Analytic autoethnography: a tool to inform the lecturer’s use of self when teaching mental health nursing?

    OpenAIRE

    Struthers, John

    2012-01-01

    This research explores the value of analytic autoethnography in developing the lecturer’s use of self when teaching mental health nursing. Sharing the lecturer’s self-understanding, developed through analytic reflexivity focused on their autoethnographic narrative, offers a pedagogical approach that contributes to the nursing profession’s policy drive to increase the use of reflective practices. The research design required me to develop my own analytic autoethnography. Four themes emerged from the da...

  19. RDF Analytics: Lenses over Semantic Graphs

    OpenAIRE

    Colazzo, Dario; Goasdoué, François; Manolescu, Ioana; Roatis, Alexandra

    2014-01-01

    The development of Semantic Web (RDF) brings new requirements for data analytics tools and methods, going beyond querying to semantics-rich analytics through warehouse-style tools. In this work, we fully redesign, from the bottom up, core data analytics concepts and tools in the context of RDF data, leading to the first complete formal framework for warehouse-style RDF analytics. Notably, we define i) analytical schemas tailored to heterogeneous, semantics-rich RDF graph, ii) analytical queri...

  20. Development and validation of analytical methods for dietary supplements

    International Nuclear Information System (INIS)

    The expanding use of innovative botanical ingredients in dietary supplements and foods has resulted in a flurry of research aimed at the development and validation of analytical methods for accurate measurement of active ingredients. The pressing need for these methods is being met through an expansive collaborative initiative involving industry, government, and analytical organizations. This effort has resulted in the validation of several important assays as well as important advances in method engineering procedures, which have improved the efficiency of the process. The initiative has also allowed researchers to overcome many of the barriers that have hindered accurate analysis, such as the lack of reference standards and comparative data. As the availability of nutraceutical products continues to increase, these methods will provide consumers and regulators with the scientific information needed to assure safety and dependable labeling

  1. Optimization technique as a tool for implementing analytical quality by Design

    Directory of Open Access Journals (Sweden)

    C. MOHAN REDDY

    2013-09-01

    Full Text Available A process is well understood when all critical sources of variability are identified and explained, variability is managed by the process, and product quality attributes can be accurately and reliably predicted over the design space. Quality by Design (QbD) is a systematic approach to development of products and processes that begins with predefined objectives and emphasizes product and process understanding and process control based on sound science, statistical methods and quality risk management. In an attempt to curb rising development costs and regulatory barriers to innovation and creativity, the FDA and ICH have recently started promoting QbD in the pharmaceutical industry. QbD is partially based on the application of statistical Design of Experiments strategy to the development of both analytical methods and pharmaceutical formulations. The present work describes the development of a robust HPLC method for analysis of an Eplerenone formulation under the QbD approach using Design of Experiments.

  2. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: An anticipated analytical tool for food safety

    Energy Technology Data Exchange (ETDEWEB)

    Hervas, Miriam; Lopez, Miguel Angel [Departamento Quimica Analitica, Universidad de Alcala, Ctra. Madrid-Barcelona, Km. 33600, E-28871 Alcala de Henares, Madrid (Spain); Escarpa, Alberto, E-mail: alberto.escarpa@uah.es [Departamento Quimica Analitica, Universidad de Alcala, Ctra. Madrid-Barcelona, Km. 33600, E-28871 Alcala de Henares, Madrid (Spain)

    2009-10-27

    In this work, an electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme is based on a direct competitive immunoassay method in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as the enzymatic label. Amperometric detection was achieved through the addition of hydrogen peroxide substrate and hydroquinone as mediator. The analytical performance of the electrochemical immunoassay was evaluated by analysis of maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 µg L⁻¹ and an EC₅₀ of 0.079 µg L⁻¹ were obtained, allowing detection of the zearalenone mycotoxin. In addition, excellent accuracy, with high recovery yields ranging between 95 and 108%, was obtained. These analytical features show the proposed electrochemical immunoassay to be a very powerful and timely screening tool for the food safety scene.
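    Competitive immunoassay calibration curves of this kind are commonly fitted with a four-parameter logistic (4PL), in which the reported EC₅₀ is the concentration giving a half-maximal response. A sketch of that model; the EC₅₀ of 0.079 µg/L is taken from the record, while the remaining parameters are hypothetical:

```python
# Four-parameter logistic (4PL) model commonly used for competitive
# immunoassay calibration. Only ec50 comes from the record (0.079 ug/L);
# a, d and the slope are hypothetical illustration values.

def four_pl(conc, a, d, ec50, slope):
    """Signal at analyte concentration `conc`.
    a: response at zero analyte, d: response at saturation,
    ec50: concentration at half-maximal response, slope: Hill slope."""
    return d + (a - d) / (1.0 + (conc / ec50) ** slope)

a, d, ec50, slope = 1.0, 0.05, 0.079, 1.2  # ec50 in ug/L

# In a competitive format the signal falls as analyte concentration rises:
s_low, s_mid, s_high = (four_pl(c, a, d, ec50, slope) for c in (0.001, 0.079, 10.0))
assert s_low > s_mid > s_high
# At conc == EC50 the response is exactly midway between a and d:
assert abs(s_mid - (a + d) / 2.0) < 1e-9
```

    Inverting the fitted curve at a measured signal yields the unknown zearalenone concentration; the LOD is then set by the signal noise near the zero-analyte plateau.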

  3. Networks as Tools for Sustainable Urban Development

    DEFF Research Database (Denmark)

    Jensen, Jesper Ole; Tollin, Nicola

    Due to the increasing number of networks related to sustainable development (SUD) the paper focuses on understanding in which way networks can be considered useful tools for sustainable urban development, taking particularly into consideration the networks potential of spreading innovative policies...... are involved in. By applying the GREMI2-theories of “innovative milieux” (Aydalot, 1986; Camagni, 1991) to the case study, we will suggest some reasons for the benefits achieved by the Dogme-network, compared to other networks. This analysis will point to the existence of an “innovative milieu” on...... sustainability within the network, and on the political commitment in the network, where all progress is being measured and audited. From this, we find many parallels between the pre-conditions for an industrial innovative milieu, and the pre-conditions for an innovative municipally based network. Based on the...

  4. Addressing new analytical challenges in protein formulation development.

    Science.gov (United States)

    Mach, Henryk; Arvinte, Tudor

    2011-06-01

    As the share of therapeutic proteins in the arsenal of modern medicine continues to increase, relatively little progress has been made in the development of analytical methods that address the specific needs encountered during the development of these new drugs. Consequently, researchers resort to adapting existing instrumentation to meet the demands of rigorous bioprocess and formulation development. In this report, we present a number of such adaptations as well as new instruments that allow efficient and precise measurement of critical parameters throughout the development stage. The techniques include the use of atomic force microscopy to visualize proteinaceous sub-visible particles, the use of extrinsic fluorescent dyes to visualize protein aggregates, particle tracking analysis, determination of the concentration of monoclonal antibodies by analysis of second-derivative UV spectra, flow cytometry for the determination of subvisible particle counts, high-throughput fluorescence spectroscopy to study phase separation phenomena, an adaptation of a high-pressure liquid chromatography (HPLC) system for the measurement of solution viscosity, and a variable-speed streamlined analytical ultracentrifugation method. An ex vivo model for understanding the factors that affect bioavailability after subcutaneous injection is also described. Most of these approaches not only allow a more precise insight into the nature of the formulated proteins, but also offer increased throughput while minimizing sample requirements. PMID:21392580

  5. A Survey on Big Data Analytics: Challenges, Open Research Issues and Tools

    OpenAIRE

    D. P. Acharjya; Kauser Ahmed P

    2016-01-01

    A huge repository of terabytes of data is generated each day from modern information systems and digital technologies such as the Internet of Things and cloud computing. Analysis of these massive data requires a lot of effort at multiple levels to extract knowledge for decision making. Therefore, big data analysis is a current area of research and development. The basic objective of this paper is to explore the potential impact of big data challenges, open research issues, and various tools ass...

  6. Developing neutronics calculation tools for MYRRHA

    International Nuclear Information System (INIS)

    The design of the Accelerator Driven System MYRRHA requires adequate and specialised tools in the field of neutronics calculations. In order to fill the gaps, several PhD programmes were launched; in 2005 three such PhD projects were running, each focusing on a different stage in the computation of a MYRRHA core. The first project, 'Improvements of the spallation reaction model', a collaboration with the University of Liege, deals with the characterisation of the spallation neutron source using the INCL (Intra-Nuclear Cascade of Liege) model. Since nuclear data are sparse at high energies, calculations rely on models. Especially for spallation reactions, which occur at proton energies of several hundreds of MeV, models are the only means to evaluate the spallation source in MYRRHA. The second project, 'Neutron transport with anisotropic scattering', a collaboration with the Universite Libre de Bruxelles, works on the development of a neutronics code, CASE-BSM, for systems with highly anisotropic scattering. The presence of large amounts of both lead and bismuth atoms in the MYRRHA core results in highly anisotropic scattering of the neutrons in the bulk of the coolant. Neglecting this effect has large consequences for both global parameters, like keff, and local parameters, like the neutron flux seen by the vessel. The third project, 'ALEPH: An integrated Monte Carlo burn-up tool', a collaboration with Ghent University, treats the last phase of a core calculation: the depletion of the fuel during irradiation. For an experimental machine like MYRRHA it is of utmost importance to have a fast calculational tool to evaluate the incineration of both the isotopes present in the fuel and those present in experimental devices. The main objective is to improve the current quality of neutronics codes focused on ADS applications and to have this knowledge 'in-house'

  7. Organic analysis and analytical methods development: FY 1995 progress report

    Energy Technology Data Exchange (ETDEWEB)

    Clauss, S.A.; Hoopes, V.; Rau, J. [and others

    1995-09-01

    This report describes the status of organic analyses and developing analytical methods to account for the organic components in Hanford waste tanks, with particular emphasis on tanks assigned to the Flammable Gas Watch List. The methods that have been developed are illustrated by their application to samples obtained from Tank 241-SY-103 (Tank 103-SY). The analytical data are to serve as an example of the status of methods development and application. Samples of the convective and nonconvective layers from Tank 103-SY were analyzed for total organic carbon (TOC). The TOC value obtained for the nonconvective layer using the hot persulfate method was 10,500 µg C/g. The TOC value obtained from samples of Tank 101-SY was 11,000 µg C/g. The average value for the TOC of the convective layer was 6400 µg C/g. Chelator and chelator fragments in Tank 103-SY samples were identified using derivatization gas chromatography/mass spectrometry (GC/MS). Organic components were quantified using GC/flame ionization detection. Major components in both the convective and nonconvective-layer samples include ethylenediaminetetraacetic acid (EDTA), nitrilotriacetic acid (NTA), succinic acid, nitrosoiminodiacetic acid (NIDA), citric acid, and ethylenediaminetriacetic acid (ED3A). Preliminary results also indicate the presence of C16 and C18 carboxylic acids in the nonconvective-layer sample. Oxalic acid was one of the major components in the nonconvective layer as determined by derivatization GC/flame ionization detection.

  8. Volume, Variety and Veracity of Big Data Analytics in NASA's Giovanni Tool

    Science.gov (United States)

    Lynnes, C.; Hegde, M.; Smit, C.; Pan, J.; Bryant, K.; Chidambaram, C.; Zhao, P.

    2013-12-01

    Earth Observation data have posed challenges to NASA users ever since the launch of several satellites around the turn of the century, generating volumes now measured in petabytes, a volume growth further increased by models assimilating the satellite data. One important approach to bringing Big Data Analytic capabilities to bear on the Volume of data has been the provision of server-side analysis capabilities. For instance, the Geospatial Interactive Online Visualization ANd aNalysis (Giovanni) tool provides a web interface to large volumes of gridded data from several EOSDIS data centers. Giovanni's main objective is to allow the user to explore its data holdings using various forms of visualization and data summarization or aggregation algorithms, thus allowing the user to examine statistics and pictures for the overall data, while eventually acquiring only the most useful data. Thus much of the preprocessing and data reduction aspects can take place on the server, delivering manageable information quantities to the user. In addition to Volume, Giovanni uses open standards to tackle the Variety aspect of Big Data, incorporating data stored in several formats, from several data centers, and making them available in a uniform data format and structure to both the Giovanni algorithms and the end user. The Veracity aspect of Big Data, perhaps the stickiest of wickets, is enhanced through features that enable reproducibility (provenance and URL-driven workflows), and by a Help Desk staffed by scientists with expertise in the science data.

  9. Sugar Maple Pigments Through the Fall and the Role of Anthocyanin as an Analytical Tool

    Science.gov (United States)

    Lindgren, E.; Rock, B.; Middleton, E.; Aber, J.

    2008-12-01

    Sugar maple habitat is projected to almost disappear in future climate scenarios. In fact, many institutions state that these trees are already in decline. Being able to detect sugar maple health could prove to be a useful analytical tool to monitor changes in phenology. Anthocyanin, a red pigment found in sugar maples, is thought to be a universal indicator of plant stress. It is very prominent in the spring during the first flush of leaves, as well as in the fall as leaves senesce. Determining an anthocyanin index that could be used with satellite systems will provide a greater understanding of tree phenology and the distribution of plant stress, both over large areas and over time. The utilization of anthocyanin for one of its functions, prevention of oxidative stress, may fluctuate in response to changing climatic conditions that occur during senescence or vary from year to year. By monitoring changes in pigment levels and antioxidant capacity through the fall, one may be able to draw conclusions about the ability to detect anthocyanin remotely from space-based systems, and possibly determine a more specific function for anthocyanin during fall senescence. These results could then be applied to track changes in tree stress.

  10. Multiple Forces Driving China's Economic Development: A New Analytic Framework

    Institute of Scientific and Technical Information of China (English)

    Yahua Wang; Angang Hu

    2007-01-01

    Based on economic growth theory and the World Bank's analytical framework relating to the quality of growth, the present paper constructs a framework that encompasses physical, international, human, natural and knowledge capital to synthetically interpret economic development. After defining the five types of capital and total capital, we analyze the dynamic changes of these types of capital in China and in other countries. The results show that since China's reform and opening up, knowledge, international, human and physical capital have grown rapidly, with speeds of growth higher than that of economic growth. As the five types of capital have all increased at varying paces, the savings level of total capital in China has quadrupled in 25 years and overtook that of the USA in the 1990s. The changes in the five types of capital and total capital reveal that there are progressively multiple driving forces behind China's rapid economic development. Implications for China's long-term economic development are thereby raised.

  11. Development of the SOFIA Image Processing Tool

    Science.gov (United States)

    Adams, Alexander N.

    2011-01-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) is a Boeing 747SP carrying a 2.5 meter infrared telescope capable of operating at altitudes between twelve and fourteen kilometers, above more than 99 percent of the water vapor in the atmosphere. The ability to make observations above most water vapor, coupled with the ability to make observations from anywhere at any time, makes SOFIA one of the world's premier infrared observatories. SOFIA uses three visible light CCD imagers to assist in pointing the telescope. The data from these imagers are stored in archive files, as are housekeeping data, which contain information such as boresight and area-of-interest locations. A tool that could both extract and process data from the archive files was developed.

  12. Solar Array Verification Analysis Tool (SAVANT) Developed

    Science.gov (United States)

    Bailey, Sheila G.; Long, Kienwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment WorkBench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.

  13. Basic Conceptual Systems (BCSs)--Tools for Analytic Coding, Thinking and Learning: A Concept Teaching Curriculum in Norway

    Science.gov (United States)

    Hansen, Andreas

    2009-01-01

    The role of basic conceptual systems (for example, colour, shape, size, position, direction, number, pattern, etc.) as psychological tools for analytic coding, thinking, and learning is emphasised, and a proposal for a teaching order of BCSs in kindergarten and primary school is introduced. The first part of this article briefly explains main aspects…

  14. Life cycle assessment as an analytical tool in strategic environmental assessment. Lessons learned from a case study on municipal energy planning in Sweden

    International Nuclear Information System (INIS)

    Life cycle assessment (LCA) is explored as an analytical tool in strategic environmental assessment (SEA), illustrated by a case where a previously developed SEA process was applied to municipal energy planning in Sweden. The process integrated decision-making tools for scenario planning, public participation and environmental assessment. This article describes the use of LCA for environmental assessment in this context, with a focus on methodology and practical experiences. While LCA provides a systematic framework for the environmental assessment and a wider systems perspective than what is required in SEA, LCA cannot address all aspects of environmental impact required, and therefore needs to be complemented by other tools. The integration of LCA with tools for public participation and scenario planning posed certain methodological challenges, but provided an innovative approach to designing the scope of the environmental assessment and defining and assessing alternatives. - Research highlights: ► LCA was explored as an analytical tool in an SEA process for municipal energy planning. ► The process also integrated LCA with scenario planning and public participation. ► Benefits of using LCA were a systematic framework and a wider systems perspective. ► Integration of the tools required some methodological challenges to be solved. ► This proved an innovative approach to defining alternatives and the scope of assessment.

  15. Micro-fluidic tools for the liquid-liquid extraction of radionuclides in analytical procedures

    International Nuclear Information System (INIS)

    The analyses of radionuclides are in great demand, and a cost-effective technique for the separation of analytes is required. A micro-scale reactor, composed of microchannels fabricated in a microchip, was chosen to investigate liquid-liquid extraction reactions driven by three different families of metal extractants: neutral, acidic and ion-pair extractants. The extraction conditions in the micro-fluidic device were considered. These investigations demonstrated that the conventional methodology used for solvent extraction in macro-scale reactors is not directly transposable to micro liquid-liquid extraction systems. However, it is expected that understanding the chemical and physical phenomena involved in reference extraction systems studied in a selected lab-on-chip will lead us to develop and validate a methodology suitable for miniaturized reactors. (authors)

  16. saltPAD: A New Analytical Tool for Monitoring Salt Iodization in Low Resource Settings

    Directory of Open Access Journals (Sweden)

    Nicholas M. Myers

    2016-03-01

    Full Text Available We created a paper test card that measures a common iodizing agent, iodate, in salt. To test the analytical metrics, usability, and robustness of the paper test card when it is used in low resource settings, the South African Medical Research Council (SAMRC) and GroundWork performed independent validation studies of the device. The accuracy and precision metrics from both studies were comparable. In the SAMRC study, more than 90% of the test results (n = 1704) were correctly classified as corresponding to adequately or inadequately iodized salt. The cards are suitable for market and household surveys to determine whether salt is adequately iodized. Further development of the cards will improve their utility for monitoring salt iodization during production.

  17. An analytic framework for developing inherently-manufacturable pop-up laminate devices

    International Nuclear Information System (INIS)

    Spurred by advances in layered manufacturing technologies such as PC-MEMS, SCM, and printable robotics, we propose a new analytic framework for capturing the geometry of folded composite laminate devices and the mechanical processes used to manufacture them. These processes can be represented by combining a small set of geometric operations which are general enough to encompass many different manufacturing paradigms. Furthermore, such a formulation permits one to construct a variety of geometric tools which can be used to analyze common manufacturability concepts, such as tool access, part removability, and device support. In order to increase the speed of development, reduce the occurrence of manufacturing problems inherent in current design methods, and reduce the level of expertise required to develop new devices, the framework has been implemented in a new design tool called popupCAD, which is suited for the design and development of complex folded laminate devices. We conclude with a demonstration of the utility of the tools by creating a folded leg mechanism. (paper)

  18. ANALYTICAL MODEL OF CALCULUS FOR INFLUENCE THE TRANSLATION GUIDE WEAR OVER THE MACHINING ACCURACY ON THE MACHINE TOOL

    Directory of Open Access Journals (Sweden)

    Ivona PETRE

    2010-10-01

    Full Text Available The wear of machine-tool guides is a favorable condition for vibrations. As a result of guide wear, the initial trajectory of the cutting tool's motion is modified, generating dimensional accuracy discrepancies and deviations in the geometrical shape of the workpieces. As is already known, the wear of mobile and rigid guides is determined by many parameters (pressure, velocity, friction length, lubrication, material). The choice of one or another analytic and/or experimental model of the wear depends on the working conditions, assuming that the coupling material is known. The present work's goal is to establish an analytic model of calculus showing the influence of translation guide wear on the machining accuracy of machine tools.

  19. Applicability of bioanalysis of multiple analytes in drug discovery and development: review of select case studies including assay development considerations.

    Science.gov (United States)

    Srinivas, Nuggehally R

    2006-05-01

    The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development, culminating in marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches, including the various techniques, detection systems and automation tools that are available for effective separation, enhanced selectivity and sensitivity for quantitation of many analytes. The intention of this review is to cover various key areas where analytical method development becomes necessary during different stages of the drug discovery research and development process. The key areas covered in this article with relevant case studies include: (a) simultaneous assay for parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for determination of multiple drugs in combating a disease; (c) analytical measurement of chirality aspects in pharmacokinetics, metabolism and biotransformation investigations; (d) drug monitoring for therapeutic benefits and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism and permeability related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations: simultaneous determination of prodrug, parent and metabolites; (j) quantitative determination of parent compound and/or phase II metabolite(s) via direct or indirect approaches; (k) applicability in analysis of multiple compounds in select …

  20. Preparation of standard hair material and development of analytical methodology

    International Nuclear Information System (INIS)

    In 1976, Indian researchers suggested the possible use of hair as an indicator of environmental exposure and established, through a study of a country-wide student population and the general population of the metropolitan city of Bombay, that human scalp hair could indeed be an effective first-level monitor in a scheme of multilevel monitoring of environmental exposure to inorganic pollutants. It was in this context, and in view of the ready availability of large quantities of scalp hair subjected to minimal chemical treatment, that they proposed to participate in the preparation of a standard material of hair. It was also recognized that measurements of trace element concentrations at very low levels require cross-validation by different analytical techniques, even within the same laboratory. The programme of work carried out since the first meeting of the CRP has been aimed at two objectives: the preparation of a standard material of hair and the development of analytical methodologies for determination of the elements and species of interest. 1 refs., 3 tabs

  1. A Survey on Big Data Analytics: Challenges, Open Research Issues and Tools

    Directory of Open Access Journals (Sweden)

    D. P. Acharjya

    2016-02-01

    Full Text Available A huge repository of terabytes of data is generated each day from modern information systems and digital technologies such as the Internet of Things and cloud computing. Analysis of these massive data requires considerable effort at multiple levels to extract knowledge for decision making. Therefore, big data analysis is a current area of research and development. The basic objective of this paper is to explore the potential impact of big data challenges, open research issues, and the various tools associated with it. As a result, this article provides a platform to explore big data at numerous stages. Additionally, it opens a new horizon for researchers to develop solutions based on the challenges and open research issues.

  2. Development of Integrated Protein Analysis Tool

    Directory of Open Access Journals (Sweden)

    Poorna Satyanarayana Boyidi,

    2010-05-01

    Full Text Available We present an “Integrated Protein Analysis Tool” (IPAT) that is able to perform the following tasks in segregating and annotating genomic data. Protein Editor: enables the entry of nucleotide/amino acid sequences. Utilities: IPAT enables the conversion of a given nucleotide sequence to the equivalent amino acid sequence. Secondary Structure Prediction: possible using three algorithms, GOR-I, the Gibrat method and DPM (Double Prediction Method), with graphical display. Profiles and properties: allows calculating eight physico-chemical profiles and properties, viz. hydrophobicity, hydrophilicity, antigenicity, transmembranous regions, solvent accessibility, molecular weight, absorption factor and amino acid content. IPAT has a provision for viewing the helical-wheel projection of a selected region of a given protein sequence and a 2D representation of alpha carbons. IPAT was developed using UML (Unified Modeling Language) for modeling the project elements, coded in Java, and subjected to unit testing, path testing, and integration testing. This project mainly concentrates on butyrylcholinesterase to predict its secondary structure and physico-chemical profiles and properties.
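
    The record above mentions a utility that converts a nucleotide sequence into its equivalent amino acid sequence. A minimal sketch of that kind of translation routine is shown below (in Python rather than IPAT's Java, and with illustrative names, not IPAT's actual API); the codon table is the standard genetic code encoded compactly over the base order TCAG.

```python
# Standard genetic code, built from the conventional TCAG base ordering.
BASES = "TCAG"
AMINO = ("FFLLSSSSYY**CC*W"   # first base T
         "LLLLPPPPHHQQRRRR"   # first base C
         "IIIMTTTTNNKKSSRR"   # first base A
         "VVVVAAAADDEEGGGG")  # first base G
CODON_TABLE = {a + b + c: AMINO[16 * i + 4 * j + k]
               for i, a in enumerate(BASES)
               for j, b in enumerate(BASES)
               for k, c in enumerate(BASES)}

def translate(nucleotides):
    """Translate a DNA or RNA sequence into a one-letter amino acid string
    ('*' marks stop codons); trailing partial codons are ignored."""
    seq = nucleotides.upper().replace("U", "T")
    return "".join(CODON_TABLE[seq[i:i + 3]]
                   for i in range(0, len(seq) - len(seq) % 3, 3))
```

For example, `translate("ATGAAATAG")` yields methionine, lysine, then a stop codon.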

  3. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    Science.gov (United States)

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  4. Development of automated analytical system using robots. Development of automated spectrophotometer

    International Nuclear Information System (INIS)

    An automated analytical system for the measurement of U and Pu concentrations, acidity and radioactivity has been under development since 1993 at the Tokai Reprocessing Plant. The total system is composed of three units: spectrophotometry, titration and radioactivity counting. Each unit consists of a robot, an analytical instrument and a computer. The robot performs pretreatment of the samples and sets them into the analytical instrument. The personal computer is used to control the robot and the analytical instrument, and also sends the measurement results to the host computer. This report describes the present status of the development of the system and the results of basic tests of the spectrophotometry unit. (author)

  5. Implementing analytics a blueprint for design, development, and adoption

    CERN Document Server

    Sheikh, Nauman

    2013-01-01

    Implementing Analytics demystifies the concept, technology and application of analytics and breaks its implementation down into repeatable and manageable steps, making widespread adoption possible across all functions of an organization. The book simplifies and helps democratize a very specialized discipline to foster business efficiency and innovation without investing in multi-million dollar technology and manpower. It offers a technology-agnostic methodology that breaks down complex tasks like model design and tuning and emphasizes business decisions

  6. Using fuzzy analytical hierarchy process (AHP) to evaluate web development platform

    Directory of Open Access Journals (Sweden)

    Ahmad Sarfaraz

    2012-01-01

    Full Text Available Web development plays an important role in business plans and people's lives. One of the key decisions, on which both the short-term and long-term success of the project depends, is choosing the right development platform. Its criticality can be judged by the fact that once a platform is chosen, one has to live with it throughout the software development life cycle. The entire shape of the project depends on the language, operating system, tools, frameworks, etc.; in short, on the web development platform chosen. In addition, choosing the right platform is a multi-criteria decision making (MCDM) problem. We propose a fuzzy analytical hierarchy process model to solve the MCDM problem, tapping the real-life modeling potential of fuzzy logic and conjugating it with the commonly used, powerful AHP modeling method.
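
    The abstract does not give the authors' formulation, but the core computation in a fuzzy AHP is commonly a variant of Buckley's geometric-mean method: triangular fuzzy pairwise judgments (l, m, u) are aggregated per row, divided by the fuzzy total, and defuzzified into crisp criterion weights. A minimal sketch under those assumptions (the judgment values and criteria are invented for illustration):

```python
import numpy as np

# Hypothetical triangular fuzzy judgments (l, m, u) over 3 platform
# criteria, e.g. maturity vs tooling vs community support.
M = [[(1, 1, 1),       (2, 3, 4),     (4, 5, 6)],
     [(1/4, 1/3, 1/2), (1, 1, 1),     (1, 2, 3)],
     [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)]]

def fuzzy_ahp_weights(M):
    n = len(M)
    # Fuzzy geometric mean of each row (Buckley's method).
    g = [tuple(np.prod([row[j][k] for j in range(n)]) ** (1 / n)
               for k in range(3)) for row in M]
    total = tuple(sum(gi[k] for gi in g) for k in range(3))
    # Fuzzy division: inverting (L, M, U) reverses the bounds.
    w = [(gi[0] / total[2], gi[1] / total[1], gi[2] / total[0]) for gi in g]
    crisp = np.array([sum(wi) / 3 for wi in w])  # centroid defuzzification
    return crisp / crisp.sum()

weights = fuzzy_ahp_weights(M)  # normalized crisp priorities
```

With these judgments the first criterion dominates, as expected from its row of large comparison values.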

  7. Non-invasive tools for measuring metabolism and biophysical analyte transport: self-referencing physiological sensing.

    Science.gov (United States)

    McLamore, Eric S; Porterfield, D Marshall

    2011-11-01

    Biophysical phenomena related to cellular biochemistry and transport are spatially and temporally dynamic, and are directly involved in the regulation of physiology at the sub-cellular to tissue spatial scale. Real time monitoring of transmembrane transport provides information about the physiology and viability of cells, tissues, and organisms. Combining information learned from real time transport studies with genomics and proteomics allows us to better understand the functional and mechanistic aspects of cellular and sub-cellular systems. To accomplish this, ultrasensitive sensing technologies are required to probe this functional realm of biological systems with high temporal and spatial resolution. In addition to ongoing research aimed at developing new and enhanced sensors (e.g., increased sensitivity, enhanced analyte selectivity, reduced response time, and novel microfabrication approaches), work over the last few decades has advanced sensor utility through new sensing modalities that extend and enhance the data recorded by sensors. A microsensor technique based on phase sensitive detection of real time biophysical transport is reviewed here. The self-referencing technique converts non-invasive extracellular concentration sensors into dynamic flux sensors for measuring transport from the membrane to the tissue scale. In this tutorial review, we discuss the use of self-referencing micro/nanosensors for measuring the physiological activity of living cells and tissues in agricultural, environmental, and biomedical applications, in a manner comprehensible to any scientist or engineer. PMID:21761069
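
    The conversion from a concentration sensor to a flux sensor described above typically rests on Fick's first law: the sensor oscillates between two positions and the measured concentration difference over the known excursion distance yields a flux. A minimal sketch of that calculation (the diffusion coefficient, distances, and concentrations below are illustrative assumptions, not values from the review):

```python
def fick_flux(c_near, c_far, dx, D):
    """Fick's first law: flux (mol m^-2 s^-1) from concentrations
    (mol m^-3) measured at two sensor positions dx metres apart.
    Negative flux means net transport toward the near (surface) pole."""
    return -D * (c_far - c_near) / dx

# Illustrative oxygen measurement near a respiring cell: concentration
# rises away from the surface, so the computed flux points inward.
J = fick_flux(c_near=0.20,   # mol/m^3 at the surface-side position
              c_far=0.21,    # mol/m^3 at the bulk-side position
              dx=10e-6,      # 10 micrometre sensor excursion
              D=2.1e-9)      # O2 diffusion coefficient in water, m^2/s
```

Here `J` comes out to about -2.1e-6 mol m⁻² s⁻¹, the sign indicating consumption at the surface.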

  8. The Development of An Analytical Overlay Design Procedure

    Directory of Open Access Journals (Sweden)

    Djunaedi Kosasih

    2008-01-01

    Full Text Available Pavement structural evaluation using pavement modulus values, obtained by back-calculation from non-destructive deflection data, has been adopted to quantify objectively the condition of existing pavements under various traffic loading and environmental conditions. However, such an advanced technique is not yet matched by comparable advances in analytical overlay design procedures, perhaps because of the complex computations these require. A new module of the computer program BackCalc has been developed to perform that task based on the allowable maximum deflection criterion specified by the Asphalt Institute’83. The rationale is that an adequate overlay thickness is computed by iteration until the theoretical maximum deflection closely matches the specified allowable maximum deflection. This paper outlines the major components of the program module, illustrated with a practical example. The overlay thickness obtained was found to be comparable with that of the AASHTO’93 method
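
    The iteration described above, adjusting overlay thickness until the predicted maximum deflection matches the allowable value, can be sketched as a simple bisection, since deflection decreases monotonically with thickness. The deflection model below is a stand-in exponential, not BackCalc's back-calculated pavement response, and all numbers are illustrative:

```python
import math

def required_overlay(deflection_mm, d_allow_mm, h_lo=0.0, h_hi=500.0, tol=1e-4):
    """Bisect on overlay thickness (mm) until the predicted maximum
    deflection equals the allowable deflection within tolerance."""
    while h_hi - h_lo > tol:
        h_mid = 0.5 * (h_lo + h_hi)
        if deflection_mm(h_mid) > d_allow_mm:
            h_lo = h_mid   # still too flexible: need a thicker overlay
        else:
            h_hi = h_mid   # stiff enough: try a thinner overlay
    return 0.5 * (h_lo + h_hi)

# Stand-in deflection model: 1.2 mm unoverlaid, halving every ~83 mm.
model = lambda h: 1.2 * math.exp(-h / 120.0)
h_req = required_overlay(model, d_allow_mm=0.6)
```

For this toy model the closed-form answer is 120·ln 2 ≈ 83.2 mm, which the bisection reproduces.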

  9. Big data analytics from strategic planning to enterprise integration with tools, techniques, NoSQL, and graph

    CERN Document Server

    Loshin, David

    2013-01-01

    Big Data Analytics will assist managers in providing an overview of the drivers for introducing big data technology into the organization and for understanding the types of business problems best suited to big data analytics solutions, understanding the value drivers and benefits, strategic planning, developing a pilot, and eventually planning to integrate back into production within the enterprise. Guides the reader in assessing the opportunities and value proposition. Overview of big data hardware and software architectures. Presents a variety of te

  10. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation

    Science.gov (United States)

    Bergeron, Dominic; Tremblay, A.-M. S.

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ2 with respect to α , and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.

  11. Development of analytical orbit propagation technique with drag

    Science.gov (United States)

    1979-01-01

    Two orbit computation methods were used: (1) numerical method: the satellite differential equations were solved in a step-by-step manner, using a mathematical algorithm taken from numerical analysis; and (2) analytical method: the solution was expressed by explicit functions of the independent variable. Analytical drag modules, a tesseral terms initialization module, a second order and long period terms module, and verification testing of the ASOP program were also considered.
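
    The contrast between the two methods, a step-by-step numerical integration versus a closed-form analytical solution, can be illustrated on a toy ODE (an exponential decay stands in here for the satellite equations, which are far more complex; the fourth-order Runge-Kutta scheme below is a standard choice, not necessarily the algorithm the report used):

```python
import math

def rk4_step(f, t, y, dt):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + dt / 2, y + dt / 2 * k1)
    k3 = f(t + dt / 2, y + dt / 2 * k2)
    k4 = f(t + dt, y + dt * k3)
    return y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

k = 0.3
f = lambda t, y: -k * y            # toy stand-in dynamics
y_num, t, dt = 1.0, 0.0, 0.01
for _ in range(500):               # numerical method: march to t = 5
    y_num = rk4_step(f, t, y_num, dt)
    t += dt
y_analytic = math.exp(-k * 5.0)    # analytical method: explicit function
```

For this problem the two agree to roughly machine precision; for real satellite dynamics the analytical route trades some accuracy for speed and insight.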

  12. NASTRAN as an analytical research tool for composite mechanics and composite structures

    Science.gov (United States)

    Chamis, C. C.; Sinclair, J. H.; Sullivan, T. L.

    1976-01-01

    Selected examples are described in which NASTRAN is used as an analysis research tool for composite mechanics and for composite structural components. The examples were selected to illustrate the importance of using NASTRAN as an analysis tool in this rapidly advancing field.

  13. Monte Carlo simulation: tool for the calibration in analytical determination of radionuclides

    International Nuclear Information System (INIS)

    This work shows how the traceability of analytical determinations is established using this calibration method. It highlights the advantages offered by Monte Carlo simulation for applying corrections for differences in chemical composition, density and height of the samples analyzed. Likewise, the results obtained by the LVRA in two exercises organized by the International Atomic Energy Agency (IAEA) are presented. In these exercises (an intercomparison and a proficiency test), all reported analytical results were obtained from efficiency calibrations by Monte Carlo simulation using the DETEFF program
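
    DETEFF itself models the full detector response, but the flavour of a Monte Carlo efficiency calibration can be shown on its simplest ingredient: the geometric efficiency of an on-axis point source facing a disc detector, estimated by sampling isotropic emission directions and checked against the analytic solid-angle formula. The geometry values are illustrative assumptions:

```python
import math, random

def mc_geometric_efficiency(r_det, dist, n=200_000, seed=7):
    """Fraction of isotropic emissions from an on-axis point source
    that strike a disc detector of radius r_det at distance dist."""
    random.seed(seed)
    hits = 0
    for _ in range(n):
        cos_t = random.uniform(-1.0, 1.0)   # isotropic: cos(theta) uniform
        if cos_t <= 0.0:
            continue                        # emitted away from the detector
        # Radial offset where the ray crosses the detector plane.
        if dist * math.sqrt(1.0 - cos_t ** 2) / cos_t <= r_det:
            hits += 1
    return hits / n

eff_mc = mc_geometric_efficiency(r_det=2.5, dist=5.0)       # e.g. cm
eff_exact = 0.5 * (1.0 - 5.0 / math.sqrt(5.0**2 + 2.5**2))  # solid angle / 4*pi
```

The Monte Carlo estimate converges on the analytic value as n grows; a full calibration replaces the hit test with particle transport through the sample and detector materials.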

  14. Balanced Scorecard – Sustainable Development Tool

    OpenAIRE

    Leontina Beţianu; Sorin Briciu

    2011-01-01

    The sustainable management of a business requires the consideration of all the business components, both the economic activity and the aspects related to its impact on the environment and its social implications. The Balanced Scorecard (BSC) is a management tool supporting the successful implementation of corporate strategies. It helps connect operational and non-financial activities that have a significant impact on the economic success of a business. BSC is therefore a promising ...

  15. Evaluation And Selection Process of Suppliers Through Analytical Framework: An Empirical Evidence of Evaluation Tool

    Directory of Open Access Journals (Sweden)

    Imeri Shpend

    2015-09-01

    Full Text Available The supplier selection process is very important to companies, as selecting the right suppliers that fit a company's strategic needs brings drastic savings. Therefore, this paper seeks to address the key area of supplier evaluation from the supplier review perspective. The purpose was to identify the most important criteria for supplier evaluation and to develop an evaluation tool based on the surveyed criteria. The research was conducted through a structured questionnaire, and the sample focused on small to medium sized enterprises (SMEs) in Greece. In total, eighty companies participated in the survey, answering the full questionnaire, which asked whether these companies utilize supplier evaluation criteria and which criteria, if any, are applied. The main statistical instrument used in the study is Principal Component Analysis (PCA). The research has shown that the main criteria are: the attitude of the vendor towards the customer, supplier delivery time, product quality and price. Conclusions are made on the suitability and usefulness of supplier evaluation criteria and the way they are applied in enterprises.
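
    The PCA step named above reduces correlated questionnaire items to a few dominant components. A minimal sketch of that computation on an invented ratings matrix (the respondents, criteria labels, and values below are illustrative, not the study's survey data):

```python
import numpy as np

# Hypothetical ratings: 8 respondents x 4 criteria (vendor attitude,
# delivery time, quality, price) on a 1-5 importance scale.
X = np.array([[5, 4, 5, 3],
              [4, 4, 5, 2],
              [5, 5, 4, 3],
              [2, 3, 3, 5],
              [3, 2, 2, 4],
              [2, 2, 3, 5],
              [4, 5, 4, 2],
              [3, 3, 2, 4]], float)

def pca(X, k=2):
    """Principal components via the eigendecomposition of the covariance
    matrix; returns the k component scores and the variance ratios."""
    Xc = X - X.mean(axis=0)               # centre each criterion
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)      # eigh: covariance is symmetric
    order = np.argsort(vals)[::-1]        # sort by descending variance
    vals, vecs = vals[order], vecs[:, order]
    return Xc @ vecs[:, :k], vals / vals.sum()

scores, explained = pca(X)
```

On this toy data one component captures most of the variance, mirroring how PCA lets a survey's many items collapse to a few interpretable factors.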

  16. Computational Tools for Accelerating Carbon Capture Process Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David; Sahinidis, N V; Cozad, A; Lee, A; Kim, H; Morinelly, J; Eslick, J; Yuan, Z

    2013-06-04

    This presentation reports the development of advanced computational tools to accelerate next generation technology development. These tools support the development of optimized processes using rigorous models. They include: Process Models; Simulation-Based Optimization; Optimized Process; Uncertainty Quantification; Algebraic Surrogate Models; and Superstructure Optimization (Determine Configuration).

  17. The development of a visualization tool for displaying analysis and test results

    International Nuclear Information System (INIS)

    The evaluation and certification of packages for the transportation of radioactive materials is performed by analysis, testing, or a combination of both. Within the last few years, many transport packages have been certified using a combination of analysis and testing. The ability to combine and display both kinds of data with interactive graphical tools allows a faster and more complete understanding of the response of the package to these environments. Sandia National Laboratories has developed an initial version of a visualization tool that allows the comparison and display of test and analytical data, as part of a Department of Energy-sponsored program to support advanced analytical techniques and test methodologies. The capability of the tool extends to both mechanical (structural) and thermal data

  18. Development of microfluidic tools for cell analysis

    Czech Academy of Sciences Publication Activity Database

    Václavek, Tomáš; Křenková, Jana; Foret, František

    Brno: Ústav analytické chemie AV ČR, v. v. i, 2015 - (Foret, F.; Křenková, J.; Drobníková, I.; Klepárník, K.), s. 209-211 ISBN 978-80-904959-3-7. [CECE 2015. International Interdisciplinary Meeting on Bioanalysis /12./. Brno (CZ), 21.09.2015-23.09.2015] R&D Projects: GA ČR(CZ) GBP206/12/G014; GA ČR(CZ) GA14-06319S Institutional support: RVO:68081715 Keywords : microfluidic device * 3D- printing * single cell analysis Subject RIV: CB - Analytical Chemistry, Separation http://www.ce-ce.org/CECE2015/CECE%202015%20proceedings_full.pdf

  19. Solvent-free microwave extraction of bioactive compounds provides a tool for green analytical chemistry

    OpenAIRE

    Ying LI; Fabiano-Tixier, Anne-Sylvie; Vian, Maryline; Chemat, Farid

    2013-01-01

    We present an overview on solvent-free microwave-extraction techniques of bioactive compounds from natural products. This new technique is based on the concept of green analytical chemistry. It has proved to be an alternative to other techniques with the advantages of reducing extraction times, energy consumption, solvent use and CO2 emissions.

  20. Developing A SPOT CRM Debriefing Tool

    Science.gov (United States)

    Martin, Lynne; Villeda, Eric; Orasanu, Judith; Connors, Mary M. (Technical Monitor)

    1998-01-01

    In a study of CRM LOFT briefings published in 1997, Dismukes, McDonnell & Jobe reported that briefings were not being utilized as fully as they could be and that crews may not be getting the full benefit possible from LOFT. On the basis of their findings, they suggested a set of general briefing guidelines for the industry. Our work builds on this study to provide a specific debriefing tool that gives a focus for the strategies that Dismukes et al. suggest.

  1. Development of balanced key performance indicators for emergency departments strategic dashboards following analytic hierarchical process.

    Science.gov (United States)

    Safdari, Reza; Ghazisaeedi, Marjan; Mirzaee, Mahboobeh; Farzi, Jebrail; Goodini, Azadeh

    2014-01-01

    Dynamic reporting tools, such as dashboards, should be developed to measure emergency department (ED) performance. However, choosing an effective, balanced set of performance measures and key performance indicators (KPIs) is a main challenge in accomplishing this. The aim of this study was to develop a balanced set of KPIs for use in ED strategic dashboards following an analytic hierarchical process. The study was carried out in two phases: constructing ED performance measures based on balanced scorecard perspectives, and incorporating them into an analytic hierarchical process framework to select the final KPIs. The respondents placed most importance on the ED internal processes perspective, especially on measures related to the timeliness and accessibility of care in the ED. Some measures from the financial, customer, and learning and growth perspectives were also selected among the top KPIs. Measures of care effectiveness and care safety were placed as the next priorities. The respondents placed least importance on disease-/condition-specific "time to" measures. The methodology can be presented as a reference model for the development of KPIs in various performance-related areas based on a consistent and fair approach. Dashboards designed based on such a balanced set of KPIs will help to establish comprehensive performance measurement and fair benchmarks and comparisons. PMID:25350022
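
    The prioritization step in an analytic hierarchy process is classically Saaty's principal-eigenvector method: pairwise comparison judgments form a reciprocal matrix whose dominant eigenvector gives the weights, with a consistency ratio flagging incoherent judgments. A minimal sketch under that standard formulation (the pairwise values and perspective labels are invented for illustration, not the study's data):

```python
import numpy as np

def ahp_priorities(A):
    """Principal-eigenvector priorities and Saaty consistency ratio
    for a positive reciprocal pairwise comparison matrix."""
    A = np.asarray(A, float)
    n = A.shape[0]
    vals, vecs = np.linalg.eig(A)
    i = int(np.argmax(vals.real))          # dominant (Perron) eigenvalue
    w = np.abs(vecs[:, i].real)
    w /= w.sum()                           # normalized priority weights
    ci = (vals[i].real - n) / (n - 1)      # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]    # Saaty's random index
    return w, ci / ri                      # CR < 0.1 is conventionally OK

# Hypothetical judgments over three balanced scorecard perspectives:
# internal processes vs customer vs financial.
A = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 1 / 2, 1]]
w, cr = ahp_priorities(A)
```

Here the internal processes perspective receives the largest weight and the consistency ratio is well under 0.1, so the judgments would be accepted.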

  2. Artificial intelligence tool development and applications to nuclear power

    International Nuclear Information System (INIS)

    Two parallel efforts are being performed at the Electric Power Research Institute (EPRI) to help the electric utility industry take advantage of the expert system technology. The first effort is the development of expert system building tools, which are tailored to electric utility industry applications. The second effort is the development of expert system applications. These two efforts complement each other. The application development tests the tools and identifies additional tool capabilities that are required. The tool development helps define the applications that can be successfully developed. Artificial intelligence, as demonstrated by the developments described is being established as a credible technological tool for the electric utility industry. The challenge to transferring artificial intelligence technology and an understanding of its potential to the electric utility industry is to gain an understanding of the problems that reduce power plant performance and identify which can be successfully addressed using artificial intelligence

  3. Learning analytics as a tool for closing the assessment loop in higher education

    Directory of Open Access Journals (Sweden)

    Karen D. Mattingly

    2012-09-01

    This paper examines learning and academic analytics and their relevance to distance education in undergraduate and graduate programs as they impact students, teaching faculty, and academic institutions. The focus is to explore the measurement, collection, analysis, and reporting of data as predictors of student success and drivers of departmental process and program curriculum. Learning and academic analytics in higher education are used to predict student success by examining how and what students learn and how success is supported by academic programs and institutions. The paper examines what is being done to support students, whether or not it is effective, and if not, why, and what educators can do. The paper also examines how these data can be used to create new metrics and inform a continuous cycle of improvement. It presents examples of working models from a sample of institutions of higher education: the Graduate School of Medicine at the University of Wollongong, the University of Michigan, Purdue University, and the University of Maryland, Baltimore County. Finally, the paper identifies considerations and recommendations for using analytics and offers suggestions for future research.

  4. 100-F Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    Ovink, R.

    2012-09-18

    This report documents the process used to identify source area target analytes in support of the 100-F Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).

  5. 100-K Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    Ovink, R.

    2012-09-18

    This report documents the process used to identify source area target analytes in support of the 100-K Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).

  6. Apply Web-based Analytic Tool and Eye Tracking to Study The Consumer Preferences of DSLR Cameras

    Directory of Open Access Journals (Sweden)

    Jih-Syongh Lin

    2013-11-01

    Consumers' preferences and purchase motivation for products often lie in the purchasing behaviors generated by the synthetic evaluation of a product's form features, color, function, and price. If an enterprise can bring these criteria under control, it can grasp opportunities in the marketplace. In this study, the product form, brand, and prices of five DSLR digital cameras from Nikon, Lumix, Pentax, Sony, and Olympus were investigated through image evaluation and eye tracking. A web-based two-dimensional analytical tool was used to present information on three layers: Layer A provided information on product form and brand name; Layer B added product price for the evaluation of purchase intention (X axis) and product form attraction (Y axis); on Layer C, Nikon J1 image samples in five color series were presented for the evaluation of attraction and purchase intention. The results revealed that, among the five Japanese camera brands, the LUMIX GF3 is most preferred and serves as the major competitive product, at a price of US$630. Eye tracking of visual focus showed that the lens, the curved grip, the curved part and shutter button above the lens, and the flexible flash of the LUMIX GF3 are the parts that attract consumers' eyes. From the verbal descriptions, it was found that consumers emphasize the functions of 3D-support lens, continuous focusing while shooting video, iA intelligent scene mode, and full manual control support. In the color preferences for the Nikon J1, red and white are most preferred while pink is least favored. These findings can serve as references for designers and marketing personnel in new product design and development.

  7. Development of ecohydrological assessment tool and its application

    Institute of Scientific and Technical Information of China (English)

    LIU ChangMing; YANG ShengTian; WEN ZhiQun; WANG XueLei; WANG YuJuan; LI Qian; SHENG HaoRan

    2009-01-01

    The development of the Hydro-Informatic Modelling System (HIMS) provides an integrated platform for hydrological simulation. To extend the application of HIMS, an ecohydrological modeling system named the ecohydrological assessment tool (EcoHAT) has been developed. Integrating parameter-management tools, RS (remote sensing) inversion tools, module-design tools and GIS analysis tools, EcoHAT provides an integrated tool to simulate ecohydrological processes on the regional scale, developing a new approach to the sustainable use of water. EcoHAT has been applied to several case studies, such as the Yellow River Basin, the acid deposition area in Guizhou province and the riparian catchment of the Guanting reservoir in Beijing. Results prove that EcoHAT can efficiently simulate and analyze ecohydrological processes on the regional scale and provide technical support to integrated water resources management on the basin scale.

  9. Development of Regional Excel-Based Stormwater/Nutrient BMP Optimization Tool (Opti-Tool)

    Science.gov (United States)

    During 2014, EPA Region 1 contracted with Tetra Tech, Inc. to work with a regional technical Advisory Committee to develop an Excel-based stormwater/nutrient BMP optimization tool (Opti-Tool) using regional precipitation data and regionally calibrated BMP performance data from UN...

  10. Development of micro-grinding mechanics and machine tools

    Science.gov (United States)

    Park, Hyung Wook

    Micro-grinding with microscale machine tools is a micro-machining process for the precision manufacturing of microscale parts such as micro sensors, micro actuators, micro fluidic devices, and micro machine parts. Mechanical micro-machining generally consists of various material removal processes; among these, micro-grinding is typically the final process step, and it provides a competitive edge over other fabrication processes. The quality of the parts produced by this process is affected by process conditions, micro-grinding wheel properties, and the microstructure of the materials. Although a micro-grinding process resembles a traditional grinding process, it is distinctive due to the size effect in micro-machining, because the mechanical and thermal interactions between a single grit and a workpiece are related to the phenomena observed in micro-machining. However, there have not been enough modeling studies of the micro-grinding process and, as a result, little knowledge base in this area has been accumulated. In this study, a new predictive model for the micro-grinding process was developed by consolidating mechanical and thermal effects within the single-grit interaction model at microscale material removal. The size effect of micro-machining was also included in the proposed model. In order to assess thermal effects, the heat partition ratio was experimentally calibrated and compared with the prediction of the Hahn model. Then, on the basis of this predictive model, a comparison between experimental data and analytical predictions was conducted for the overall micro-grinding forces in the x and y directions. Although there are deviations in the predicted micro-grinding forces at low depths of cut, these differences are reduced as the depth of cut increases. Separately, the optimization of micro machine tools was performed on the basis of the proposed design strategy. Individual mathematical modeling of key parameters such as volumetric error

  11. Infrared Spectroscopy as a Versatile Analytical Tool for the Quantitative Determination of Antioxidants in Agricultural Products, Foods and Plants

    Directory of Open Access Journals (Sweden)

    Daniel Cozzolino

    2015-07-01

    Spectroscopic methods provide very useful qualitative and quantitative information about the biochemistry and chemistry of antioxidants. Near infrared (NIR) and mid infrared (MIR) spectroscopy are considered powerful, fast, accurate and non-destructive analytical tools that can replace traditional chemical analysis. In recent years, several reports in the literature have demonstrated the usefulness of these methods in the analysis of antioxidants in different organic matrices. This article reviews recent applications of infrared (NIR and MIR) spectroscopy in the analysis of antioxidant compounds in a wide range of samples such as agricultural products, foods and plants.

  12. Developing molecular tools for Chlamydomonas reinhardtii

    Science.gov (United States)

    Noor-Mohammadi, Samaneh

    Microalgae have garnered increasing interest over the years for their ability to produce compounds ranging from biofuels to nutraceuticals. A main focus of researchers has been to use microalgae as natural bioreactors for the production of valuable and complex compounds. Recombinant protein expression in the chloroplasts of green algae has recently become more routine; however, the heterologous expression of multiple proteins or complete biosynthetic pathways remains a significant challenge. To take full advantage of these organisms' natural abilities, sophisticated molecular tools are needed to introduce and functionally express multi-gene biosynthetic pathways in their genomes. To achieve this objective, we have sought to establish a method to construct, integrate and express multigene operons in the chloroplast and nuclear genomes of the model microalga Chlamydomonas reinhardtii. Here we show that a modified DNA Assembler approach can be used to rapidly assemble multiple-gene biosynthetic pathways in yeast and then integrate these assembled pathways at a site-specific location in the chloroplast, or by random integration in the nuclear genome, of C. reinhardtii. As a proof of concept, this method was used to successfully integrate and functionally express up to three reporter proteins (AphA6, AadA, and GFP) in the chloroplast of C. reinhardtii and up to three reporter proteins (Ble, AphVIII, and GFP) in its nuclear genome. An analysis of the relative gene expression of the engineered strains showed significant differences in the mRNA expression levels of the reporter genes, highlighting the importance of proper promoter/untranslated-region selection when constructing a target pathway. In addition, this work focuses on expressing the cofactor regeneration enzyme phosphite dehydrogenase (PTDH) in the chloroplast and nuclear genomes of C. reinhardtii. The PTDH enzyme converts phosphite into phosphate and NAD(P)+ into NAD(P)H. The reduced

  13. Analytical Method Development & Validation for Related Substances Method of Busulfan Injection by Ion Chromatography Method

    Directory of Open Access Journals (Sweden)

    Rewaria S

    2013-05-01

    A new, simple, accurate, precise and reproducible ion chromatography method has been developed for the estimation of methanesulfonic acid in Busulfan injectable dosage form. The method was validated in full compliance with current regulatory guidelines, covering the standard analytical method validation parameters: linearity, LOD and LOQ determination, accuracy, method precision, specificity, system suitability, robustness and ruggedness. Using the current method, the linearity obtained is close to 0.999, showing that the method is capable of giving a good detector response, and the calculated recovery was within the range of 85% to 115% of the specification limits.
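    The two acceptance criteria quoted, linearity near 0.999 and recovery within 85% to 115%, reduce to simple calculations. The sketch below checks them on invented calibration data; the concentrations and responses are placeholders, not the paper's measurements.

```python
import numpy as np

# Hypothetical calibration: spiked concentration (ppm) vs. detector response.
conc     = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
response = np.array([10.2, 20.1, 40.5, 80.9, 161.8])

# Linearity: Pearson correlation coefficient of the calibration data.
r = np.corrcoef(conc, response)[0, 1]

# Recovery: measured amount as a percentage of the spiked amount.
spiked, measured = 2.0, 1.92
recovery = 100 * measured / spiked

print(f"r = {r:.4f}, recovery = {recovery:.1f}%")
assert r > 0.999              # linearity acceptance criterion
assert 85 <= recovery <= 115  # accuracy acceptance criterion
```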

  14. Analytical developments in ICP-MS for arsenic and selenium speciation. Application to granitic waters

    International Nuclear Information System (INIS)

    Nuclear waste storage in geological formations requires an understanding of the physico-chemistry of groundwater interactions with the surrounding rocks. Redox potential measurements and speciation calculated from geochemical modelling are not significant for determining water reactivity. We have therefore chosen to carry out experimental speciation by developing sensitive analytical tools that respect the chemical identity of each species. We studied two redox indicators from reference sites (thermal waters of the Pyrenees, France): arsenic and selenium. First, we determined the concentrations of the major ions (sulphide, sulphate, chloride, fluoride, carbonate, Na, K, Ca). Speciation was performed by HPLC hyphenated to quadrupole ICP-MS and high-resolution ICP-MS. These analyses revealed the presence of two new arsenic species in solution, as well as a great reactivity of these waters during stability studies. A sampling, storage and analysis method is described. (author)

  15. [COMETE: a tool to develop psychosocial competences in patient education].

    Science.gov (United States)

    Saugeron, Benoit; Sonnier, Pierre; Marchais, Stéphanie

    2016-01-01

    This article presents a detailed description of the development and use of the COMETE tool. The COMETE tool is designed to help medical teams identify, develop or evaluate psychosocial skills in patient education and counselling. This tool, designed in the form of a briefcase, proposes methodological activities and cards that assess psychosocial skills during a shared educational assessment, group meetings or during an individual evaluation. This tool is part of a support approach for medical teams caring for patients with chronic diseases. PMID:27392049

  16. Developing a 300C Analog Tool for EGS

    Energy Technology Data Exchange (ETDEWEB)

    Normann, Randy

    2015-03-23

    This paper covers the development of a 300°C geothermal well monitoring tool to support future EGS (enhanced geothermal systems) power production. This is the first of three tools planned: an analog tool designed for monitoring well pressure and temperature. Three different circuit topologies are discussed, along with the development of the supporting surface electronics and software, and the testing of electronic circuits and components. One of the major components is the cable used to connect the analog tool to the surface.

  17. Towards a Process for Developing Maintenance Tools in Academia

    CERN Document Server

    Kienle, Holger M

    2008-01-01

    Building of tools--from simple prototypes to industrial-strength applications--is a pervasive activity in academic research. When proposing a new technique for software maintenance, effective tool support is typically required to demonstrate the feasibility and effectiveness of the approach. However, even though tool building is both pervasive and requiring significant time and effort, it is still pursued in an ad hoc manner. In this paper, we address these issues by proposing a dedicated development process for tool building that takes the unique characteristics of an academic research environment into account. We first identify process requirements based on a review of the literature and our extensive tool building experience in the domain of maintenance tools. We then outline a process framework based on work products that accommodates the requirements while providing needed flexibility for tailoring the process to account for specific tool building approaches and project constraints. The work products are...

  18. DEVELOPMENT OF SOLUBILITY PRODUCT VISUALIZATION TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    T.F. Turner; A.T. Pauli; J.F. Schabron

    2004-05-01

    Western Research Institute (WRI) has developed software for the visualization of data acquired from solubility tests. The work was performed in conjunction with AB Nynas Petroleum, Nynashamn, Sweden, which participated as the corporate cosponsor for this Jointly Sponsored Research (JSR) task. Efforts in this project were split between software development and solubility test development. The Microsoft Windows-compatible software accepts up to three solubility data sets, calculates the parameters of six solid-body types to fit the data, and interactively displays the results in three dimensions. Several infrared spectroscopy techniques were examined for potential use in determining bitumen solubility in various solvents. Reflectance, time-averaged absorbance, and transmittance techniques were applied to bitumen samples in single and binary solvent systems. None of the techniques was found to have wide applicability.

  19. PLS2 regression as a tool for selection of optimal analytical modality

    DEFF Research Database (Denmark)

    Madsen, Michael; Esbensen, Kim

    technologies and price classes are able to decipher relevant process information simultaneously. The question then is how to choose between the available technologies without compromising the quality and usability of the data. We apply PLS2 modelling to quantify the relative merits of competing, or complementing, analytical modalities. We here present results from a feasibility study in which Fourier Transform Near InfraRed (FT-NIR), Fourier Transform Mid InfraRed (FT-MIR), and Raman laser spectroscopy were applied to the same set of samples obtained from a pilot-scale beer brewing process. Quantitative PLS1 models

  20. Analytical and clinical performances of immunoradiometric assay of total and free PSA developed locally

    International Nuclear Information System (INIS)

    A specific assay was developed for total and free PSA (PSAt, PSAf). Both assays use a two-site IRMA with polyclonal anti-PSA antibodies coated on tubes. The polyclonal antibodies were obtained after immunisation of rabbits by subcutaneous injection of pure PSA at multiple sites. For quantification, two monoclonal antibodies were selected: the first highly specific to free PSA and the second recognising both free and bound PSA. A correlation study was performed against two commercial kits, from CIS Bio and Immunotech. For that purpose, 464 serum samples ranging from 0.5 ng/ml to 3399 ng/ml were used to characterise the analytical performance of the new test. The analytical detection limit of the new test was 0.05 ng/ml for total PSA and 0.02 ng/ml for free PSA. The within-run and between-day coefficients of variation were determined up to 20 ng/ml. For BPH, no significant difference was found between the three tests for the PSAf/PSAt ratio using a cut-off of 14% (all were >14%). For the 120 patients with PC, all PSAt values were >2 ng/ml. However, the mean PSAt value was higher for the commercial kits (14.74 ng/ml against 12.48 ng/ml for the new test), but all PSAf/PSAt ratios for the 120 newly diagnosed cancers were <14%. In conclusion, our locally developed immunoradiometric assay has good analytical performance and its outputs correlate well with clinical findings in prostate disease. Furthermore, a cut-off of 14% for the PSAf/PSAt ratio appears to be the most accurate tool to detect a prostate cancer

  1. Development of a fusion approach selection tool

    Science.gov (United States)

    Pohl, C.; Zeng, Y.

    2015-06-01

    During the last decades the number and quality of remote sensing satellite sensors available for Earth observation have grown significantly. The amount of available multi-sensor imagery, along with its increased spatial and spectral resolution, presents new challenges to Earth scientists. With a Fusion Approach Selection Tool (FAST) the remote sensing community would obtain access to an optimized and improved image processing technology. Remote sensing image fusion is a means to produce images containing information that is not inherent in any single image alone. Meanwhile, the user has access to sophisticated commercial image fusion techniques, plus the option to tune the parameters of each individual technique to match the anticipated application. This leaves the operator with an uncountable number of options for combining remote sensing images, not to mention the selection of the appropriate images, resolution and bands. Image fusion can be a machine- and time-consuming endeavour, and in addition it requires knowledge about remote sensing, image fusion, digital image processing and the application. FAST shall provide the user with a quick overview of processing flows to choose from to reach the target. FAST will ask for the available images, application parameters and desired information, and will process this input to produce a workflow that quickly obtains the best results. It will optimize data and image fusion techniques and provide an overview of the possible results from which the user can choose the best. FAST will enable even inexperienced users to use advanced processing methods to maximize the benefit of multi-sensor image exploitation.

  2. Tools for Nanotechnology Education Development Program

    Energy Technology Data Exchange (ETDEWEB)

    Dorothy Moore

    2010-09-27

    The overall focus of this project was the development of reusable, cost-effective educational modules for use with the table top scanning electron microscope (TTSEM). The goal of this project's outreach component was to increase students' exposure to the science and technology of nanoscience.

  3. Development and testing of analytical models for the pebble bed type HTRs

    International Nuclear Information System (INIS)

    The pebble bed type gas cooled high temperature reactor (HTR) appears to be a good candidate for the next generation nuclear reactor technology. These reactors have unique characteristics in terms of the randomness in geometry, and require special techniques to analyze their systems. This study includes activities concerning the testing of computational tools and the qualification of models. Indeed, it is essential that the validated analytical tools be available to the research community. From this viewpoint codes like MCNP, ORIGEN and RELAP5, which have been used in nuclear industry for many years, are selected to identify and develop new capabilities needed to support HTR analysis. The geometrical model of the full reactor is obtained by using lattice and universe facilities provided by MCNP. The coupled MCNP-ORIGEN code is used to estimate the burnup and the refuelling scheme. Results obtained from Monte Carlo analysis are interfaced with RELAP5 to analyze the thermal hydraulics and safety characteristics of the reactor. New models and methodologies are developed for several past and present experimental and prototypical facilities that were based on HTR pebble bed concepts. The calculated results are compared with available experimental data and theoretical evaluations showing very good agreement. The ultimate goal of the validation of the computer codes for pebble bed HTR applications is to acquire and reinforce the capability of these general purpose computer codes for performing HTR core design and optimization studies

  4. Selection of reference standard during method development using the analytical hierarchy process.

    Science.gov (United States)

    Sun, Wan-yang; Tong, Ling; Li, Dong-xiang; Huang, Jing-yi; Zhou, Shui-ping; Sun, Henry; Bi, Kai-shun

    2015-03-25

    Reference standards are critical for ensuring reliable and accurate method performance. One important issue is how to select the ideal one from the alternatives. Unlike the optimization of parameters, the criteria for a reference standard are often immeasurable. The aim of this paper is to recommend a quantitative approach to the selection of reference standards during method development, based on the analytic hierarchy process (AHP) as a decision-making tool. Six alternative single reference standards were assessed for the quantitative analysis of six phenolic acids from Salvia miltiorrhiza and its preparations by ultra-performance liquid chromatography. The AHP model simultaneously considered six criteria related to reference standard characteristics and method performance: feasibility of obtaining the standard, abundance in samples, chemical stability, accuracy, precision and robustness. The priority of each alternative was calculated using the standard AHP analysis method. The results showed that protocatechuic aldehyde is the ideal reference standard, with rosmarinic acid, at about 79.8% of its priority, as the second choice. The determination results successfully verified the evaluation ability of this model. The AHP allowed us to consider comprehensively the benefits and risks of the alternatives, and it proved an effective and practical tool for the optimization of reference standards during method development. PMID:25636165
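    A compressed sketch of the AHP mechanics described above, criterion weights from a pairwise matrix, a Saaty consistency check, then criterion-weighted scoring of alternatives, is given below; all matrices and scores are invented for illustration, not the paper's judgement data.

```python
import numpy as np

def principal_eigen(a):
    """Principal eigenvalue and normalised eigenvector of a
    reciprocal AHP comparison matrix."""
    vals, vecs = np.linalg.eig(np.asarray(a, dtype=float))
    i = np.argmax(vals.real)
    w = np.abs(vecs[:, i].real)
    return vals[i].real, w / w.sum()

# Hypothetical pairwise comparisons of three criteria
# (e.g. accuracy, precision, chemical stability).
criteria = [[1,   2,   4],
            [1/2, 1,   2],
            [1/4, 1/2, 1]]
lam, crit_w = principal_eigen(criteria)

# Saaty consistency ratio CR = CI / RI, with RI = 0.58 for n = 3;
# CR < 0.1 is the usual threshold for acceptable judgements.
n = 3
ci = (lam - n) / (n - 1)
cr = ci / 0.58
assert cr < 0.1, "judgements too inconsistent; revise comparisons"

# Local scores of two hypothetical reference standards on each criterion.
scores = np.array([[0.6, 0.5, 0.7],    # alternative A
                   [0.4, 0.5, 0.3]])   # alternative B
print(scores @ crit_w)  # global priorities; A ranks first here
```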

  5. The use of resonance scattering of capture gamma rays as an analytical tool

    International Nuclear Information System (INIS)

    The sensitivity for the resonance scattering of capture gamma rays as a tool to measure comparatively small concentrations of certain elements in bulk materials is investigated. Looking at the resonance for lead excited by iron capture gamma rays it is possible to measure concentrations down to less than 100 ppm. The advantages of the new technique are compared with other existing methods. The application of nuclear resonance scattering in prospecting for zirconium ores is emphasized

  6. Development of analytical techniques of vanadium isotope in seawater

    Science.gov (United States)

    Huang, T.; Owens, J. D.; Sarafian, A.; Sen, I. S.; Huang, K. F.; Blusztajn, J.; Nielsen, S.

    2015-12-01

    Vanadium (V) is a transition metal with isotopes 50V and 51V and oxidation states of +2, +3, +4 and +5. The average concentration in seawater is 1.9 ppb, which results in a marine residence time of ~50 kyrs. Its various oxidation states make it a potential tool for investigating redox conditions in the ocean and sediments, due to redox-related changes in the valence state of vanadium. In turn, chemical equilibrium between different oxidation states of V will likely cause isotopic fractionation that can potentially be utilized to quantify past ocean redox states. In order to apply V isotopes as a paleo-redox tracer, we must know the isotopic composition of seawater and its relation to the marine sources and sinks of V. We developed a novel method for pre-concentrating V and measuring its isotope ratio in seawater samples. In our method, we used four ion exchange chromatography columns to separate vanadium from seawater matrix elements, in particular titanium and chromium, which both have an isobaric interference on 50V. The first column uses the NOBIAS resin, which effectively separates V and other transition metals from the majority of the seawater matrix. Subsequent columns are identical to those utilized when separating V from silicate samples (Nielsen et al., Geostand. Geoanal. Res., 2011). The isotopic composition of the purified V is measured using a Thermo Scientific Neptune multiple-collector inductively coupled plasma mass spectrometer (MC-ICP-MS) in medium resolution mode. This setup resolves all molecular interferences from masses 49, 50, 51, 52 and 53, including S-O species on mass 50. To test the new method, we spiked an open ocean seawater sample from the Bermuda Atlantic Time Series (BATS) station with 10-25 μg of Alfa Aesar vanadium solution, which has an isotopic composition of δ51V = 0, where δ51V = 1000 × [(51V/50V)sample − (51V/50V)AA]/(51V/50V)AA. The average of six spiked samples is -0.03±0.19‰, which is within error of the true
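    The δ51V definition quoted in the abstract is a simple per-mil ratio deviation, which a short sketch makes concrete; the ratio values below are placeholders for illustration, not real Alfa Aesar or BATS measurements.

```python
def delta51V(ratio_sample, ratio_std):
    """Per-mil delta notation: deviation of a sample's 51V/50V ratio
    from the reference standard, multiplied by 1000."""
    return 1000 * (ratio_sample - ratio_std) / ratio_std

# Illustrative numbers only (not real measured ratios).
r_std = 240.0                       # hypothetical 51V/50V of the reference
r_sample = r_std * (1 - 0.03e-3)    # a sample 0.03 permil lighter

print(round(delta51V(r_sample, r_std), 2))  # -> -0.03
```

    A sample with exactly the standard's ratio gives δ51V = 0, which is why the spiked-seawater average of -0.03±0.19‰ counts as agreement within error.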

  7. Green certificates, a tool for market development

    International Nuclear Information System (INIS)

    To secure a place for renewable energy, the Government of the Netherlands has followed a market-oriented approach. In view of the rapidly emerging liberalized energy market, the government combined support to producers with a demand-driven approach. A market has developed through a fully liberalized market for green electricity with free consumer choice and tradable certificates for renewable energy. In view of the slow domestic growth in production, a new support mechanism for renewable electricity, called the environmental quality of power production (MEP), was introduced in the Netherlands in 2003. This paper evaluates the market development over recent years under the green certificate system and the rapidly growing market for green electricity. In 2004 the green certificate was replaced EU-wide by the Certificate of Origin. (author)

  8. Testing automation tools for secure software development

    OpenAIRE

    Eatinger, Christopher J.

    2007-01-01

    Software testing is a crucial step in the development of any software system, large or small. Testing can reveal the presence of logic errors and other flaws in the code that could cripple the system's effectiveness. Many flaws common in software today can also be exploited to breach the security of the system on which the software is running. These flaws can be subtle and difficult to find. Frequently it takes a combination of multiple events to bring them out. Traditional testing techni...

  9. Towards an interoperability ontology for software development tools

    OpenAIRE

    Hasni, Neji.

    2003-01-01

    Approved for public release; distribution is unlimited. The automation of software development has long been a goal of software engineering, to increase the efficiency of the development effort and improve the software product. This efficiency (high productivity with fewer software faults) results from best practices in building, managing and testing software projects via the use of these automated tools and processes. However, each software development tool has its own characteristics, semantic...

  10. Developing a coupled analytical model for analyzing salt intrusion in alluvial estuaries

    Science.gov (United States)

    Savenije, H.; CAI, H.; Gisen, J.

    2013-12-01

    A predictive assessment technique to estimate the salt intrusion length and longitudinal salinity distribution in estuaries is important for policy makers and managers to maintain a healthy estuarine environment. In this study, the salt intrusion model of Savenije (2005, 2012) is applied and coupled to an explicit solution for tidal dynamics developed by Cai and Savenije (2013). The objective of the coupling is to reduce the number of calibration parameters, which subsequently strengthens the reliability of the salt intrusion model. Moreover, the fully analytical treatment allows assessing the effect of model forcing (i.e., tide and river discharge) and geometry adjustments (e.g., by dredging) on system performance. The coupled model has been applied to a wide range of estuaries, and the results show that the correspondence between analytical estimations and observations is very good. As a result, the coupled model is a useful tool for decision makers to obtain first-order estimates of salt intrusion in estuaries from a minimum of required information. References: Savenije, H.H.G. (2005), Salinity and Tides in Alluvial Estuaries, Elsevier. Savenije, H.H.G. (2012), Salinity and Tides in Alluvial Estuaries, completely revised 2nd edition, www.salinityandtides.com. Cai, H., and H. H. G. Savenije (2013), Asymptotic behavior of tidal damping in alluvial estuaries, Journal of Geophysical Research, submitted.
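    As background to the coupled model above, a heavily simplified illustration (not the Savenije formulation, which uses geometry-dependent dispersion): the steady-state advection-dispersion balance with constant cross-section A, dispersion D, and river discharge Q gives an exponential salinity profile, from which a first-order intrusion length follows. A sketch, with all numbers hypothetical:

```python
import math

def salinity_profile(x, s0, q, d, a):
    """Steady-state salinity S(x) = S0 * exp(-Q x / (A D)) for constant
    cross-section a [m^2], dispersion d [m^2/s], discharge q [m^3/s].
    A textbook simplification, not the Savenije (2005) model."""
    return s0 * math.exp(-q * x / (a * d))

def intrusion_length(s0, s_limit, q, d, a):
    """Distance at which salinity falls from s0 to s_limit."""
    return (a * d / q) * math.log(s0 / s_limit)

# Hypothetical estuary: A = 5000 m^2, D = 200 m^2/s, Q = 100 m^3/s,
# mouth salinity 30 psu, fresh-water threshold 1 psu.
L = intrusion_length(30.0, 1.0, 100.0, 200.0, 5000.0)
```

    The actual coupled model replaces the constant D and A with along-channel functions tied to estuary geometry and tidal dynamics; the sketch only illustrates the kind of first-order estimate involved.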

  11. original: Multi-sectoral qualitative analysis: a tool for assessing the competitiveness of regions and formulating strategies for economic development

    OpenAIRE

    Brian Roberts; Stimson, Robert J

    1998-01-01

    Regional economic development strategy formulation relies heavily on analytical techniques such as shift-share, location quotients, input-output and SWOT analysis. However, many of these traditional tools are proving inadequate for understanding what makes regions competitive. New tools are required to evaluate the competitiveness of regional economies, how to gain competitive advantage, and what new management frameworks and enabling infrastructure are needed to drive economic development p...

  12. Role of analytical chemistry in the development of nuclear fuels

    International Nuclear Information System (INIS)

    Analytical chemistry is indispensable and plays a pivotal role in the entire gamut of nuclear fuel cycle activities, starting from ore refining, conversion, nuclear fuel fabrication, reactor operation and nuclear fuel reprocessing through to waste management. As the fuel is the most critical component of the reactor, where fissions take place to produce power, extreme care should be taken to qualify the fuel. For example, in nuclear fuel fabrication, the selection of nuclear fuel has to be made depending upon the reactor system. The fuel for thermal reactors is normally uranium oxide, either natural or slightly enriched. For research reactors it can be uranium metal or an alloy. The fuel for FBRs can be metal, alloy, oxide, carbide or nitride. India is planning an advanced heavy water reactor for utilization of the country's vast resources of thorium. Research is also under way to identify suitable metallic/alloy fuels for future fast reactors and possible use in the fast breeder test reactor. Other advanced fuel materials are also being investigated for thermal reactors to realize increased performance levels. For example, advanced fuels made from UO2 doped with Cr2O3 and Al2O3 are being suggested for LWR applications. These have been shown to facilitate pellet densification during sintering and to enlarge the pellet grain size. The chemistry of these materials has to be understood during preparation to the stringent specification. A number of analytical parameters need to be determined as part of the chemical quality control of nuclear materials. A myriad of analytical techniques, from the classical to sophisticated instrumental methods, are available for this purpose. The analytical chemist's constant drive for improvement leads to the devising and adoption of superior methodologies offering reduced analysis time, improved measurement precision and accuracy, simplicity of the technique itself, etc.
Chemical quality control provides a means to ensure that the quality

  13. Market research companies and new product development tools

    NARCIS (Netherlands)

    Nijssen, E.J.; Frambach, R.T.

    1998-01-01

    This research investigates (1) the share of new product development (NPD) research services in market research (MR) companies’ turnover, (2) MR companies’ awareness and use of NPD tools and the modifications made to these NPD tools, and (3) MR company managers’ perceptions of the influence of client

  14. Developing Tool Support for Problem Diagrams with CPN and VDM++

    DEFF Research Database (Denmark)

    Tjell, Simon; Lassen, Kristian Bisgaard

    2008-01-01

    In this paper, we describe ongoing work on the development of tool support for formal description of domains found in Problem Diagrams. The purpose of the tool is to handle the generation of a CPN model based on a collection of Problem Diagrams. The Problem Diagrams are used for representing the ...

  15. Evaluation and selection of CASE tool for SMART OTS development

    International Nuclear Information System (INIS)

    A CASE (Computer-Aided Software Engineering) tool is software that aids in software engineering activities such as requirements analysis, design, testing, configuration management, and project management. The evaluation and selection of commercial CASE tools for a specific software development project is not easy, as it demands technical ability from the evaluator and maturity from the software development organization. In this paper, we discuss selection strategies, a characteristics survey, evaluation criteria, and the result of CASE tool selection for the development of the SMART (System-integrated Modular Advanced ReacTor) OTS (Operator Training Simulator)
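    Evaluations like the one described are often reduced to a weighted-sum rating of candidate tools against the chosen criteria. A minimal sketch (criteria names, weights, and scores are hypothetical, not taken from the SMART OTS evaluation):

```python
def weighted_score(scores, weights):
    """Weighted-sum rating of one tool: criterion scores (0-10)
    times normalized criterion weights."""
    total_w = sum(weights.values())
    return sum(scores[c] * w / total_w for c, w in weights.items())

# Hypothetical criteria, weights, and candidate scores for illustration.
weights = {"requirements": 3, "design": 3, "testing": 2, "config_mgmt": 2}
tools = {
    "ToolA": {"requirements": 8, "design": 7, "testing": 6, "config_mgmt": 9},
    "ToolB": {"requirements": 6, "design": 9, "testing": 7, "config_mgmt": 5},
}
ranked = sorted(tools, key=lambda t: weighted_score(tools[t], weights),
                reverse=True)
```

    Real CASE tool evaluations usually add qualitative gates (vendor support, licensing) before any such scoring.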

  16. Newspaper Reading among College Students in Development of Their Analytical Ability

    Science.gov (United States)

    Kumar, Dinesh

    2009-01-01

    The study investigated the newspaper reading among college students in development of their analytical ability. Newspapers are one of the few sources of information that are comprehensive, interconnected and offered in one format. The main objective of the study was to find out the development of the analytical ability among college students by…

  17. Development of a Test to Evaluate Students' Analytical Thinking Based on Fact versus Opinion Differentiation

    Science.gov (United States)

    Thaneerananon, Taveep; Triampo, Wannapong; Nokkaew, Artorn

    2016-01-01

    Nowadays, one of the biggest challenges of education in Thailand is the development and promotion of the students' thinking skills. The main purposes of this research were to develop an analytical thinking test for 6th grade students and evaluate the students' analytical thinking. The sample was composed of 3,567 6th grade students in 2014…

  18. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    Science.gov (United States)

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
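    The core AHP computation referred to above derives criteria weights from a pairwise comparison matrix and checks its consistency. A minimal sketch using the geometric-mean approximation (the example matrix is hypothetical, not from the school inspection framework):

```python
import math

def ahp_weights(M):
    """Approximate AHP priority weights from a pairwise comparison
    matrix M via the geometric-mean (row) method."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_ratio(M, w):
    """Saaty consistency ratio CR = CI / RI, with
    CI = (lambda_max - n) / (n - 1); CR < 0.1 is usually acceptable."""
    n = len(M)
    # lambda_max estimated by averaging (M w)_i / w_i over rows
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i]
              for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random indices
    return ci / ri

# Hypothetical 3-criteria judgments: A is 3x as important as B,
# 5x as important as C; B is 2x as important as C.
M = [[1, 3, 5],
     [1 / 3, 1, 2],
     [1 / 5, 1 / 2, 1]]
w = ahp_weights(M)
```

    The eigenvector method of the full AHP gives near-identical weights for consistent matrices; the geometric mean is the common hand-calculable shortcut.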

  19. Nano-Scale Secondary Ion Mass Spectrometry - A new analytical tool in biogeochemistry and soil ecology

    Energy Technology Data Exchange (ETDEWEB)

    Herrmann, A M; Ritz, K; Nunan, N; Clode, P L; Pett-Ridge, J; Kilburn, M R; Murphy, D V; O' Donnell, A G; Stockdale, E A

    2006-10-18

    Soils are structurally heterogeneous across a wide range of spatio-temporal scales. Consequently, external environmental conditions do not have a uniform effect throughout the soil, resulting in a large diversity of micro-habitats. It has been suggested that soil function can be studied without explicit consideration of such fine detail, but recent research has indicated that the micro-scale distribution of organisms may be of importance for a mechanistic understanding of many soil functions. Due to a lack of techniques with adequate sensitivity for data collection at appropriate scales, the question 'How important are various soil processes acting at different scales for ecological function?' is challenging to answer. The nano-scale secondary ion mass spectrometer (NanoSIMS) represents the latest generation of ion microprobes, which link high-resolution microscopy with isotopic analysis. The main advantage of NanoSIMS over other secondary ion mass spectrometers is the ability to operate at high mass resolution, whilst maintaining both excellent signal transmission and spatial resolution (~50 nm). NanoSIMS has been used previously in studies focusing on presolar materials from meteorites, in material science, biology, geology and mineralogy. Recently, the potential of NanoSIMS as a new tool in the study of biophysical interfaces in soils has been demonstrated. This paper describes the principles of NanoSIMS and discusses the potential of this tool to contribute to the field of biogeochemistry and soil ecology. Practical considerations (sample size and preparation, simultaneous collection of isotopes, mass resolution, isobaric interference and quantification of the isotopes of interest) are discussed. Adequate sample preparation, to avoid biases in the interpretation of NanoSIMS data due to artifacts, and the identification of regions of interest are of most concern when using NanoSIMS as a new tool in biogeochemistry and soil ecology. Finally, we review

  20. Development of micro rotary swaging tools of graded tool steel via co-spray forming

    Directory of Open Access Journals (Sweden)

    Cui Chengsong

    2015-01-01

    Full Text Available In order to meet the requirements of micro rotary swaging, the local properties of the tools should be adjusted properly with respect to abrasive and adhesive wear, compressive strength, and toughness. These properties can be optimally combined by using different materials in specific regions of the tools, with a gradual transition in between to reduce critical stresses at the interface during heat treatment and in the rotary swaging process. In this study, a newly developed co-spray forming process was used to produce graded tool materials in the form of a flat product. The graded deposits were subsequently hot rolled and heat treated to achieve an optimal microstructure and advanced properties. Micro plunge rotary swaging tools with fine geometrical structures were machined from the hot rolled materials. The new forming tools were successfully applied in the micro plunge rotary swaging of wires of stainless steel.

  1. Second International Workshop on Teaching Analytics

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Reimann, Peter; Halb, Wolfgang;

    2013-01-01

    Teaching Analytics is conceived as a subfield of learning analytics that focuses on the design, development, evaluation, and education of visual analytics methods and tools for teachers in primary, secondary, and tertiary educational settings. The Second International Workshop on Teaching Analytics...

  2. External beam milli-PIXE as analytical tool for Neolithic obsidian provenance studies

    Energy Technology Data Exchange (ETDEWEB)

    Constantinescu, B.; Cristea-Stan, D. [National Institute for Nuclear Physics and Engineering Horia Hulubei, Bucharest-Magurele (Romania); Kovács, I.; Szõkefalvi-Nagy, Z. [Wigner Research Centre for Phyics, Institute for Particle and Nuclear Physics, Budapest (Hungary)

    2013-07-01

    Full text: Obsidian is the most important archaeological material used for tools and weapons before the appearance of metals. Its geological sources are limited and concentrated in a few geographical zones: Armenia, Eastern Anatolia, the Italian islands of Lipari and Sardinia, the Greek islands of Melos and Yali, and the Hungarian and Slovak Tokaj Mountains. Due to this fact, in the Mesolithic and Neolithic periods obsidian was the first archaeological material intensively traded, even over long distances. To determine the geological provenance of obsidian and to identify the prehistoric long-range trade routes and possible population migrations, elemental concentration ratios can help a lot, since each geological source has its 'fingerprints'. In this work the external milli-PIXE technique was applied for elemental concentration ratio determinations in some Neolithic tools found in Transylvania and in the Iron Gates region near the Danube, and on a few relevant geological samples (Slovak Tokaj Mountains, Lipari, Armenia). In Transylvania (the North-Western part of Romania, a region surrounded by the Carpathian Mountains), Neolithic obsidian tools were discovered mainly in three regions: North-West - Oradea (near the border with Hungary, Slovakia and Ukraine), Centre - Cluj, and Southwest - Banat (near the border with Serbia). A special case is the Iron Gates - Mesolithic and Early Neolithic sites, directly related to the appearance of agriculture replacing the Mesolithic economy based on hunting and fishing. Three long-distance trade routes could be considered: from the Caucasus Mountains via the north of the Black Sea, from the Greek islands or Asia Minor via the ex-Yugoslavia area or via Greece-Bulgaria, or, in the case of obsidian, from Central Europe - the Tokaj Mountains. As provenance 'fingerprints', we focused on Ti to Mn and Rb-Sr-Y-Zr ratios. The measurements were performed at the external milli-PIXE beamline of the 5 MV VdG accelerator of the Wigner RCP, with a proton energy of 3 MeV and beam currents in the range of 1 ±1 D

  3. Psychometric properties of a Mental Health Team Development Audit Tool.

    LENUS (Irish Health Repository)

    Roncalli, Silvia

    2013-02-01

    To assist in improving team working in Community Mental Health Teams (CMHTs), the Mental Health Commission formulated a user-friendly but yet-to-be validated 25-item Mental Health Team Development Audit Tool (MHDAT).

  4. Development of tools for automated physical weed control

    OpenAIRE

    Nørremark, Michael; Melander, Bo

    2009-01-01

    Tools are being developed for automated physical weed control in the close to crop area. The most promising weed control concepts are the so-called high precision tillage solutions and thermal weed control by pulsed lasers.

  5. Thermal Lens Spectroscopy as a 'new' analytical tool for actinide determination in nuclear reprocessing processes

    International Nuclear Information System (INIS)

    Thermal Lens Spectroscopy (TLS) consists of measuring the effects induced by the relaxation of molecules excited by photons. The CEA already worked on TLS twenty years ago, but technological limitations impeded progress at the time. Now, the need for sensitive analytical methods coupled with very low sample volumes (for example, traces of Np in the COEX(TM) process), together with the drive to reduce nuclear waste, encourages us to revisit this method, taking advantage of improvements in optoelectronic technologies. One can also imagine coupling TLS with micro-fluidic technologies, decreasing experimental costs significantly. Generally two laser beams are used for TLS: one for the selective excitation by molecular absorption (inducing the thermal lens) and one for probing the thermal lens. They can be coupled in different geometries, collinear or perpendicular, depending on the application and on the laser mode. Many possibilities have also been studied for detecting the thermal lens signal: interferometry, direct intensity variations, deflection, etc. In this paper, one geometrical configuration and two measurement schemes have been theoretically evaluated. For single-photodiode detection (z-scan), the limit of detection is calculated to be near 5×10⁻⁶ mol L⁻¹ for Np(IV) in dodecane. (authors)

  6. Assessment procedures and analytical tools for leak-before-break applications

    International Nuclear Information System (INIS)

    Leak-before-break assessment, as part of power plant pipeline strength analysis, uses either the yield stress criterion or fracture-mechanics methods via the FAD concept. In the latter case, fracture-mechanics and strength data of the material are required, as well as analytical equations for calculating the stress intensity factor KI and the plastic limit load Lr. The application of verified and generally valid KI and Lr solutions is of vast importance. The contribution compares selected advanced stress intensity factor solutions for cylinders with surface cracks and through-wall cracks. Apart from the limits of application of the solutions with respect to geometry and load parameters, their accuracy is also assessed. For this, a method for estimating the numerical errors of KI solutions is presented and applied to a series of solutions. Equations are presented for the plastic limit load and the parameter Lr, respectively. The application of the calculation methods is demonstrated for a pipeline using the current version of the failure assessment programme VERB. (orig.)

  7. Challenges in the development of analytical soil compaction models

    DEFF Research Database (Denmark)

    Keller, Thomas; Lamandé, Mathieu

    2010-01-01

    Soil compaction can cause a number of environmental and agronomic problems (e.g. flooding, erosion, leaching of agrochemicals to recipient waters, emission of greenhouse gases to the atmosphere, crop yield losses), resulting in significant economic damage to society and agriculture. Strategies and recommendations for the prevention of soil compaction often rely on simulation models. This paper highlights some issues that need further consideration in order to improve soil compaction modelling, with the focus on analytical models. We discuss the different issues based on comparisons between experimental data and model simulations. The upper model boundary condition (i.e. contact area and stresses at the tyre-soil interface) is highly influential in stress propagation, but knowledge on the effects of loading and soil conditions on the upper model boundary condition is inadequate. The accuracy of stress...

  8. Development of culturally sensitive dialog tools in diabetes education

    OpenAIRE

    Nana Folmann Hempler; Bettina Ewers

    2015-01-01

    Person-centeredness is a goal in diabetes education, and cultural influences are important to consider in this regard. This report describes the use of a design-based research approach to develop culturally sensitive dialog tools to support person-centered dietary education targeting Pakistani immigrants in Denmark with type 2 diabetes. The approach appears to be a promising method to develop dialog tools for patient education that are culturally sensitive, thereby increasing their acceptabil...

  9. Development of micro rotary swaging tools of graded tool steel via co-spray forming

    OpenAIRE

    Cui Chengsong; Schulz Alwin; Moumi Eric; Kuhfuss Bernd; Böhmermann Florian; Riemer Oltmann

    2015-01-01

    In order to meet the requirements of micro rotary swaging, the local properties of the tools should be adjusted properly with respect to abrasive and adhesive wear, compressive strength, and toughness. These properties can be optimally combined by using different materials in specific regions of the tools, with a gradual transition in between to reduce critical stresses at the interface during heat treatment and in the rotary swaging process. In this study, a newly developed co-spray forming ...

  10. Advanced organic analysis and analytical methods development: FY 1995 progress report. Waste Tank Organic Safety Program

    Energy Technology Data Exchange (ETDEWEB)

    Wahl, K.L.; Campbell, J.A.; Clauss, S.A. [and others

    1995-09-01

    This report describes the work performed during FY 1995 by Pacific Northwest Laboratory in developing and optimizing analysis techniques for identifying organics present in Hanford waste tanks. The main focus was to provide a means for rapidly obtaining the most useful information concerning the organics present in tank waste, with minimal sample handling and with minimal waste generation. One major focus has been to optimize analytical methods for organic speciation. Select methods, such as atmospheric pressure chemical ionization mass spectrometry and matrix-assisted laser desorption/ionization mass spectrometry, were developed to increase the speciation capabilities, while minimizing sample handling. A capillary electrophoresis method was developed to improve separation capabilities while minimizing additional waste generation. In addition, considerable emphasis has been placed on developing a rapid screening tool, based on Raman and infrared spectroscopy, for determining organic functional group content when complete organic speciation is not required. This capability would allow for a cost-effective means to screen the waste tanks to identify tanks that require more specialized and complete organic speciation to determine tank safety.

  11. Advanced organic analysis and analytical methods development: FY 1995 progress report. Waste Tank Organic Safety Program

    International Nuclear Information System (INIS)

    This report describes the work performed during FY 1995 by Pacific Northwest Laboratory in developing and optimizing analysis techniques for identifying organics present in Hanford waste tanks. The main focus was to provide a means for rapidly obtaining the most useful information concerning the organics present in tank waste, with minimal sample handling and with minimal waste generation. One major focus has been to optimize analytical methods for organic speciation. Select methods, such as atmospheric pressure chemical ionization mass spectrometry and matrix-assisted laser desorption/ionization mass spectrometry, were developed to increase the speciation capabilities, while minimizing sample handling. A capillary electrophoresis method was developed to improve separation capabilities while minimizing additional waste generation. In addition, considerable emphasis has been placed on developing a rapid screening tool, based on Raman and infrared spectroscopy, for determining organic functional group content when complete organic speciation is not required. This capability would allow for a cost-effective means to screen the waste tanks to identify tanks that require more specialized and complete organic speciation to determine tank safety

  12. Disaster Risk Finance as a Tool for Development

    OpenAIRE

    World Bank Group

    2016-01-01

    Since 2013 The World Bank Group has partnered with the Global Facility for Disaster Reduction and Recovery and the U.K. Department for International Development to address some of these gaps in evidence and methodologies. The Disaster Risk Finance Impact Analytics Project has made significant contributions to the understanding of how to monitor and evaluate existing or potential investment...

  13. VisTool: A user interface and visualization development system

    DEFF Research Database (Denmark)

    Xu, Shangjin

    Although software usability has long been emphasized, there is a lot of software with poor usability. In Usability Engineering, usability professionals prescribe a classical usability approach to improving software usability. It is essential to prototype and usability-test user interfaces before programming. However, in Software Engineering, software engineers who develop user interfaces do not follow it. In many cases, it is desirable to use graphical presentations, because a graphical presentation gives a better overview than text forms, and can improve task efficiency and user satisfaction...... cognitively simpler than the state-of-the-art tools. Usability testing shows that VisTool is accessible to designers. Furthermore, it indicates that expert designers can work faster than with other tools. Our comparison with the traditional rapid development approach shows that VisTool reduces development time by about...

  14. gOntt: a Tool for Scheduling Ontology Development Projects

    OpenAIRE

    A. GÓMEZ-PÉREZ; Suárez-Figueroa, Mari Carmen; Vigo, Martin

    2009-01-01

    The Ontology Engineering field lacks tools that guide ontology developers to plan and schedule their ontology development projects. gOntt helps ontology developers in two ways: (a) to schedule ontology projects; and (b) to execute such projects based on the schedule and using the NeOn Methodology.

  15. Analytical tool for the periodic safety analysis of NPP according to the PSA guideline. Vol. 1

    International Nuclear Information System (INIS)

    The SAIS (Safety Analysis and Information System) programme system is based on an integrated data base, which consists of a plant data part and a PSA-related data part. Using SAIS, analyses can be performed by special tools connected directly to the data base. Two main editors, RISA+ and DEDIT, are used for data base management. Access to the data base is done via different types of pages, which are displayed on a computer screen. The pages are called data sheets. Sets of input and output data sheets were implemented, such as system or component data sheets, fault trees or event trees. All input information, models and results needed for updated PSA results (Living PSA) can be stored in SAIS. The programme system contains the editor KVIEW, which guarantees consistency of the stored data, e.g. with respect to names and codes of components and events. The information contained in the data base is called up by a standardized user guide programme, called the Page Editor. (Brunsbuettel as reference NPP). (orig./HP)

  16. Incremental visual text analytics of news story development

    Science.gov (United States)

    Krstajic, Milos; Najm-Araghi, Mohammad; Mansmann, Florian; Keim, Daniel A.

    2012-01-01

    Online news sources produce thousands of news articles every day, reporting on local and global real-world events. New information quickly replaces the old, making it difficult for readers to put current events in the context of the past. Additionally, the stories have very complex relationships and characteristics that are difficult to model: they can be weakly or strongly connected, or they can merge or split over time. In this paper, we present a visual analytics system for exploration of news topics in dynamic information streams, which combines interactive visualization and text mining techniques to facilitate the analysis of similar topics that split and merge over time. We employ text clustering techniques to automatically extract stories from online news streams and present a visualization that: 1) shows temporal characteristics of stories in different time frames with different level of detail; 2) allows incremental updates of the display without recalculating the visual features of the past data; 3) sorts the stories by minimizing clutter and overlap from edge crossings. By using interaction, stories can be filtered based on their duration and characteristics in order to be explored in full detail with details on demand. To demonstrate the usefulness of our system, case studies with real news data are presented and show the capabilities for detailed dynamic text stream exploration.
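    The incremental story extraction described above can be illustrated with a bare-bones version: each incoming article joins the most cosine-similar existing story, or starts a new one, and story centroids are updated without recomputing past data. A sketch (the threshold and the toy articles are invented for illustration):

```python
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = sum(v * v for v in a.values()) ** 0.5
    nb = sum(v * v for v in b.values()) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def assign_incrementally(articles, threshold=0.3):
    """Assign each article to the most similar existing story
    (cosine over word counts) or start a new story."""
    stories = []  # each story: {"centroid": Counter, "members": [indices]}
    for i, text in enumerate(articles):
        vec = Counter(text.lower().split())
        best, best_sim = None, threshold
        for s in stories:
            sim = cosine(vec, s["centroid"])
            if sim > best_sim:
                best, best_sim = s, sim
        if best is None:
            stories.append({"centroid": vec, "members": [i]})
        else:
            best["centroid"].update(vec)  # incremental centroid update
            best["members"].append(i)
    return stories

# Toy stream: two articles about a flood, one about an election.
stream = [
    "river flood hits city center",
    "flood waters rise in city",
    "election results announced tonight",
]
stories = assign_incrementally(stream)
```

    The paper's system additionally models stories that split and merge over time; a single-assignment scheme like this cannot express that.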

  17. Comprehensive analytical strategy for biomarker identification based on liquid chromatography coupled to mass spectrometry and new candidate confirmation tools.

    Science.gov (United States)

    Mohamed, Rayane; Varesio, Emmanuel; Ivosev, Gordana; Burton, Lyle; Bonner, Ron; Hopfgartner, Gérard

    2009-09-15

    A comprehensive analytical LC-MS(/MS) platform for low-molecular-weight biomarker molecules in biological fluids is described. Two complementary retention mechanisms were used in HPLC by optimizing the chromatographic conditions for a reversed-phase column and a hydrophilic interaction chromatography column. LC separation was coupled to mass spectrometry using electrospray ionization operating in positive polarity mode. This strategy enables us to correctly retain and separate hydrophobic as well as polar analytes. For that purpose, artificial model study samples were generated with a mixture of 38 well-characterized compounds likely to be present in biofluids. The set of compounds was used as a standard aqueous mixture or was spiked into urine at different concentration levels to investigate the capability of the LC-MS(/MS) platform to detect variations across biological samples. Unsupervised data analysis by principal component analysis was performed and followed by principal component variable grouping to find correlated variables. This tool allows us to distinguish three main groups whose variables belong to (a) background ions (found in all types of samples), (b) ions distinguishing urine samples from aqueous standard and blank samples, and (c) ions related to the spiked compounds. Interpretation of these groups allows us to identify and eliminate isotopes, adducts, fragments, etc. and to generate a reduced list of m/z candidates. This list is then submitted to the prototype MZSearcher software tool, which simultaneously searches several lists of potential metabolites extracted from metabolomics databases (e.g., KEGG, HMDB, etc.) to propose biomarker candidates. Structural confirmation of these candidates was done off-line by fraction collection followed by nanoelectrospray infusion to provide high-quality MS/MS data for spectral database queries. PMID:19702294
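    The variable-grouping step serves to pull together m/z features that co-vary across samples (isotopes, adducts, fragments of the same compound). A much-simplified stand-in, grouping features by pairwise Pearson correlation rather than PCA loadings (all feature names and intensities are hypothetical):

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def group_correlated(features, threshold=0.9):
    """Greedy grouping: each feature joins the first existing group
    whose seed feature it correlates with above `threshold`."""
    groups = []  # list of lists of feature names
    for name, values in features.items():
        for g in groups:
            if abs(pearson(values, features[g[0]])) >= threshold:
                g.append(name)
                break
        else:
            groups.append([name])
    return groups

# Hypothetical intensities of four m/z features across 5 samples:
# mz_181.1 and mz_182.1 co-vary (ion plus isotope), the rest do not.
features = {
    "mz_181.1": [10, 20, 30, 40, 50],
    "mz_182.1": [11, 22, 29, 41, 52],   # isotope of mz_181.1
    "mz_304.2": [50, 10, 40, 20, 30],
    "mz_89.0":  [5, 5, 5, 5, 6],        # near-constant background
}
groups = group_correlated(features)
```

    Grouped features can then be collapsed to one representative m/z each, shrinking the candidate list much as the paper's PCA-based grouping does.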

  18. Electrochemical treatment of olive mill wastewater: Treatment extent and effluent phenolic compounds monitoring using some uncommon analytical tools

    Institute of Scientific and Technical Information of China (English)

    Chokri Belaid; Moncef Khadraoui; Salma Mseddi; Monem Kallel; Boubaker Elleuch; Jean Francois Fauvarque

    2013-01-01

    Problems related to industrial effluents can be divided into two parts: (1) their toxicity, associated with their chemical content, which should be removed before discharging the wastewater into the receiving media; and (2) the difficulty of characterising and monitoring the pollution, caused by the complexity of these matrixes. This investigation deals with both aspects: an electrochemical treatment method for olive mill wastewater (OMW) over platinized expanded titanium electrodes, using a modified Grignard reactor, for toxicity removal, as well as the exploration of some specific analytical tools to monitor the elimination of effluent phenolic compounds. The results showed that electrochemical oxidation is able to remove/mitigate the OMW pollution. Indeed, 87% of OMW color was removed and all aromatic compounds disappeared from the solution by anodic oxidation. Moreover, the chemical oxygen demand (COD) and the total organic carbon (TOC) were reduced by 55%. In addition, UV-Visible spectrophotometry, gas chromatography/mass spectrometry, cyclic voltammetry and 13C Nuclear Magnetic Resonance (NMR) showed that the treatment efficiently eliminates phenolic compounds from OMW. It was concluded that electrochemical oxidation in a modified Grignard reactor is a promising process for the destruction of all phenolic compounds present in OMW. Among the monitoring analytical tools applied, cyclic voltammetry and 13C NMR are introduced here for the first time to follow the progress of OMW treatment, and they gave a close insight into polyphenol disappearance.

  19. Development and Validation of a Learning Analytics Framework: Two Case Studies Using Support Vector Machines

    Science.gov (United States)

    Ifenthaler, Dirk; Widanapathirana, Chathuranga

    2014-01-01

    Interest in collecting and mining large sets of educational data on student background and performance to conduct research on learning and instruction has developed as an area generally referred to as learning analytics. Higher education leaders are recognizing the value of learning analytics for improving not only learning and teaching but also…

  20. The management and exploitation of naturally light-emitting bacteria as a flexible analytical tool: A tutorial.

    Science.gov (United States)

    Bolelli, L; Ferri, E N; Girotti, S

    2016-08-31

    Conventional detection of toxic contaminants on surfaces, in food, and in the environment takes time. Current analytical approaches to chemical detection can be of limited utility due to long detection times, high costs, and the need for a laboratory and trained personnel. A non-specific but easy, rapid, and inexpensive screening test can be useful to quickly classify a specimen as toxic or non-toxic, so that prompt appropriate measures can be taken exactly where required. Bioluminescent bacteria-based tests meet all these characteristics. Bioluminescence methods are extremely attractive because of their high sensitivity, speed, ease of implementation, and statistical significance. They are usually sensitive enough to detect the majority of pollutants toxic to humans and mammals. This tutorial provides practical guidelines for isolating, cultivating, and exploiting marine bioluminescent bacteria as a simple and versatile analytical tool. Although mostly applied to aqueous-phase samples and organic extracts, the test can also be conducted directly on soil and sediment samples so as to reflect the true toxicity due to the bioavailable fraction. Because tests can be performed with freeze-dried cell preparations, they could make a major contribution to field screening activity. They can be easily conducted in a mobile environmental laboratory and may be adaptable to miniaturized field instruments and field test kits. PMID:27506340
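
    The screening readout behind such tests is the drop in emitted light relative to a non-toxic control. A minimal sketch of that calculation (the 20% cut-off is a hypothetical choice for illustration, not a value from the tutorial):

```python
def light_inhibition(i_control, i_sample):
    """Percent inhibition of bacterial light emission relative to a
    non-toxic control; a higher value indicates a more toxic sample.
    (Generic readout formula, not tied to a specific standard.)"""
    return 100.0 * (i_control - i_sample) / i_control

def classify(i_control, i_sample, threshold=20.0):
    """Simple toxic / non-toxic screen on the inhibition value;
    the 20% threshold is hypothetical, for illustration only."""
    if light_inhibition(i_control, i_sample) >= threshold:
        return "toxic"
    return "non-toxic"
```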

  1. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    Science.gov (United States)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool in the analytical laboratory since they allow sensitive measurements of pollutants to be carried out with easily available instrumentation. There is a large number of such procedures involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very low amounts (only a few microliters) of organic solvents, or even none at all. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful for both inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since this instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology and gives examples of the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.

  2. New research directions in the development of analytical chemistry

    OpenAIRE

    Rema Matakova

    2016-01-01

    The article shows that discovering nanoscale elements made it possible to synthesize new chemical compounds without chemical reaction and defined the basis of effective development of nanoanalytical chemistry in the past two decades. The article focuses on the prospective development of bioanalytical chemistry, based on reagentless sensory methods of analysis of biochemical processes to cure fast dangerous infections of the century. Unusual opportunity of development of «green» chemistr...

  3. Development of an improved low profile hub seal refurbishment tool

    International Nuclear Information System (INIS)

    The hub seal area of a fuel channel feeder coupling can be exposed to oxygen in the atmosphere if protective measures are not taken during maintenance outages. Exposure to oxygen can lead to pitting of the hub seal area. Although this is a rare occurrence, the resulting possibility of feeder coupling leakage led to the development of a feeder hub refurbishment tool. To reduce time and man-rem exposure during feeder hub seal refurbishment, an improved low profile hub seal refurbishing tool has been developed. The improved tool design allows for quick and controlled removal of material, and the restoration of a roll-burnished finish equivalent to the original requirements. The new tool can be used in maintenance operations, with the end fitting present, as well as under retube-type circumstances, with the end fitting removed. (author)

  4. The environment power system analysis tool development program

    Science.gov (United States)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining the performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information; a data dictionary interpreter to coordinate analysis; and a data base for storing system designs and results of analysis.

  5. Developing e-marketing tools : Case company: CASTA Ltd.

    OpenAIRE

    Nguyen, Chi

    2014-01-01

    The topic of this Bachelor’s thesis is developing e-marketing tools for the B2C sector of CASTA Ltd. The final outcome is a set of online marketing tool guidelines that can improve business activities, especially marketing effectiveness. Given the company’s status as a novice in the online marketing field, the thesis focuses on the basic level of three specific online marketing tools instead of covering the whole e-marketing subject. The theoretical framework first describes the concept of e...

  6. Development of a data capture tool for researching tech entrepreneurship

    DEFF Research Database (Denmark)

    Andersen, Jakob Axel Bejbro; Howard, Thomas J.; McAloone, Tim C.

    2014-01-01

    Startups play a crucial role in exploiting the commercial advantages created by new, advanced technologies. Surprisingly, the processes by which the entrepreneur commercialises these technologies are largely undescribed - partly due to the absence of appropriate process data capture tools. This paper elucidates the requirements for such tools by drawing on knowledge of the entrepreneurial phenomenon and by building on the existing research tools used in design research. On this basis, the development of a capture method for tech startup processes is described and its potential discussed.

  7. Tools for Large-Scale Data Analytic Examination of Relational and Epistemic Networks in Engineering Education

    Science.gov (United States)

    Madhavan, Krishna; Johri, Aditya; Xian, Hanjun; Wang, G. Alan; Liu, Xiaomo

    2014-01-01

    The proliferation of digital information technologies and related infrastructure has given rise to novel ways of capturing, storing and analyzing data. In this paper, we describe the research and development of an information system called Interactive Knowledge Networks for Engineering Education Research (iKNEER). This system utilizes a framework…

  8. 105-KE Basin isolation barrier leak rate test analytical development. Revision 1

    International Nuclear Information System (INIS)

    This document provides an analytical development in support of the proposed leak rate test of the 105-KE Basin. The analytical basis upon which the K-basin leak test results will be used to determine the basin leakage rates is developed in this report. The leakage of the K-Basin isolation barriers under postulated accident conditions will be determined from the test results. There are two fundamental flow regimes that may exist in the postulated K-Basin leakage: viscous laminar and turbulent flow. An analytical development is presented for each flow regime. The basic geometry and nomenclature of the postulated leak paths are denoted
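
    The two flow regimes can be illustrated with the closed-form expressions such leakage is commonly modelled by: parallel-plate Poiseuille flow for the viscous laminar regime and discharge-coefficient orifice flow for the turbulent regime. The geometry and fluid values below are hypothetical, not the K-Basin parameters:

```python
import math

def laminar_leak_rate(dp, mu, length, gap, width):
    """Viscous laminar flow through a narrow rectangular crack
    (parallel-plate Poiseuille approximation), in m^3/s.
    Flow scales linearly with the pressure difference dp."""
    return (width * gap**3 * dp) / (12.0 * mu * length)

def turbulent_leak_rate(dp, rho, area, cd=0.6):
    """Turbulent orifice flow (discharge-coefficient model), in m^3/s.
    Flow scales with the square root of the pressure difference."""
    return cd * area * math.sqrt(2.0 * dp / rho)

# Hypothetical example: water, ~1 m head (9.8 kPa) across a 0.1 mm crack
dp = 9.8e3  # Pa
q_lam = laminar_leak_rate(dp, mu=1.0e-3, length=0.3, gap=1.0e-4, width=0.05)
q_turb = turbulent_leak_rate(dp, rho=1000.0, area=0.05 * 1.0e-4)
print(f"laminar: {q_lam:.3e} m^3/s, turbulent: {q_turb:.3e} m^3/s")
```

    Which regime applies would be decided from the Reynolds number of the leak path; the two formulas bracket the behaviour the test results must be interpreted against.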

  9. Development of a sustainability assessment tool for office buildings

    OpenAIRE

    Barbosa, José Amarilio; Mateus, Ricardo; Bragança, L.

    2012-01-01

    The few available sustainability assessment tools applicable in Portugal are oriented toward residential buildings. Nevertheless, the impacts of office buildings have been rising, mainly due to an increase in the energy consumption for cooling and heating. This way, due to the growing environmental impact of office buildings, the development of Building Sustainability Assessment (BSA) tools to assess the sustainability of this type of buildings is necessary and important to guide and to boost th...

  10. China adopts rural tourism as a development tool

    OpenAIRE

    Wo, Zhuo

    2006-01-01

    In recent years, rural tourism has become ever more prominent as a tool to increase visitors' awareness of and attraction to a destination, as well as a tool for economic development in the countryside of China. Rural tourism is a new type of tourism industry, which takes rural communities as its sites, and distinctive rural production, living styles and idyllic landscapes as its objects. The writer aims to analyze the theory of the tourism life cycle proposed by Butler, current problems, types, mod...

  11. Developing an Analytical Framework for Argumentation on Energy Consumption Issues

    Science.gov (United States)

    Jin, Hui; Mehl, Cathy E.; Lan, Deborah H.

    2015-01-01

    In this study, we aimed to develop a framework for analyzing the argumentation practice of high school students and high school graduates. We developed the framework in a specific context--how energy consumption activities such as changing diet, converting forests into farmlands, and choosing transportation modes affect the carbon cycle. The…

  12. Knowledge-based geographic information systems (KBGIS): new analytic and data management tools

    Energy Technology Data Exchange (ETDEWEB)

    Albert, T.M.

    1988-11-01

    In its simplest form, a geographic information system (GIS) may be viewed as a data base management system in which most of the data are spatially indexed, and upon which sets of procedures operate to answer queries about spatial entities represented in the data base. Utilization of artificial intelligence (AI) techniques can enhance greatly the capabilities of a GIS, particularly in handling very large, diverse data bases involved in the earth sciences. A KBGIS has been developed by the US Geological Survey which incorporates AI techniques such as learning, expert systems, new data representation, and more. The system, which will be developed further and applied, is a prototype of the next generation of GIS's, an intelligent GIS, as well as an example of a general-purpose intelligent data handling system. The paper provides a description of KBGIS and its application, as well as the AI techniques involved.

  13. Knowledge-based geographic information systems (KBGIS): New analytic and data management tools

    Science.gov (United States)

    Albert, T.M.

    1988-01-01

    In its simplest form, a geographic information system (GIS) may be viewed as a data base management system in which most of the data are spatially indexed, and upon which sets of procedures operate to answer queries about spatial entities represented in the data base. Utilization of artificial intelligence (AI) techniques can enhance greatly the capabilities of a GIS, particularly in handling very large, diverse data bases involved in the earth sciences. A KBGIS has been developed by the U.S. Geological Survey which incorporates AI techniques such as learning, expert systems, new data representation, and more. The system, which will be developed further and applied, is a prototype of the next generation of GIS's, an intelligent GIS, as well as an example of a general-purpose intelligent data handling system. The paper provides a description of KBGIS and its application, as well as the AI techniques involved. © 1988 International Association for Mathematical Geology.

  14. BAC-Dkk3-EGFP Transgenic Mouse: An In Vivo Analytical Tool for Dkk3 Expression

    Directory of Open Access Journals (Sweden)

    Yuki Muranishi

    2012-01-01

    Full Text Available Dickkopf (DKK) family proteins are secreted modulators of the Wnt signaling pathway and are capable of regulating the development of many organs and tissues. We previously identified Dkk3 as a molecule predominantly expressed in the mouse embryonic retina. However, which cells express Dkk3 in the developing and mature mouse retina remains to be elucidated. To examine the precise expression of the Dkk3 protein, we generated BAC-Dkk3-EGFP transgenic mice that express EGFP integrated into the Dkk3 gene in a BAC plasmid. Expression analysis using the BAC-Dkk3-EGFP transgenic mice revealed that Dkk3 is expressed in retinal progenitor cells (RPCs) at embryonic stages and in Müller glial cells in the adult retina. Since Müller glial cells may play a potential role in retinal regeneration, BAC-Dkk3-EGFP mice could be useful for retinal regeneration studies.

  15. Analytical tools and functions of GIS in the process control and decision support of mining company

    OpenAIRE

    Semrád Peter

    2001-01-01

    The development of computer techniques, together with increasing demands for professional and rapid data processing and for fluency and efficiency in gaining, exchanging and providing information, has strongly shaped a new generation of information technologies - Geographic Information Systems (GIS) - which arose in the second half of the twentieth century. Advancement in this area is still progressing, and GIS gradually find application in individual fields wher...

  16. Integrated modelling as an analytical and optimisation tool for urban watershed management.

    Science.gov (United States)

    Erbe, V; Frehmann, T; Geiger, W F; Krebs, P; Londong, J; Rosenwinkel, K H; Seggelke, K

    2002-01-01

    In recent years numerical modelling has become a standard procedure to optimise urban wastewater systems design and operation. Since the models were developed for the subsystems independently, they did not support an integrated view to the operation of the sewer system, the wastewater treatment plant (WWTP) and the receiving water. After pointing out the benefits of an integrated approach and the possible synergy effects that may arise from analysing the interactions across the interfaces, three examples of modelling case studies carried out in Germany are introduced. With these examples we intend to demonstrate the potential of integrated models, though their development cannot be considered completed. They are set up with different combinations of self-developed and commercially available software. The aim is to analyse fluxes through the total wastewater system or to integrate pollution-based control in the upstream direction, that is e.g. managing the combined water retention tanks as a function of state variables in the WWTP or the receiving water. Furthermore the interface between the sewer and the WWTP can be optimised by predictive simulations such that the combined water flow can be maximised according to the time- and dynamics-dependent state of the treatment processes. PMID:12380985

  17. Standard & Poor’s Small Business Portfolio Model introduces a potential new tool for community development loan risk analysis

    OpenAIRE

    Chen, Weili; Chang, Winston

    2007-01-01

    The Small Business Portfolio Evaluator™ analytical model helps issuers and underwriters to assess the gross default and prepayment risk of small business loan portfolios using a Monte Carlo simulation. This new tool provides an important first step to securitizing existing community development loan portfolios.
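
    The Monte Carlo idea behind such a portfolio evaluator can be sketched generically. This is not the S&P model; the loan-level probabilities, recovery rate, and horizon below are illustrative assumptions:

```python
import random

def simulate_portfolio_losses(n_loans, balance, p_default, p_prepay,
                              recovery, n_years, n_trials, seed=0):
    """Monte Carlo sketch of gross default losses on a loan pool.
    Each year a surviving loan may default (loss = balance * (1 - recovery))
    or prepay (exits at par, no loss); otherwise it survives to next year.
    Returns (mean loss, 95th-percentile loss) over the trials."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n_trials):
        loss = 0.0
        for _ in range(n_loans):
            for _ in range(n_years):
                u = rng.random()
                if u < p_default:
                    loss += balance * (1.0 - recovery)
                    break
                if u < p_default + p_prepay:
                    break  # prepaid: exits the pool with no loss
        losses.append(loss)
    losses.sort()
    mean = sum(losses) / n_trials
    var95 = losses[int(0.95 * n_trials)]
    return mean, var95
```

    A rating analysis would then size credit enhancement against a high percentile of the simulated loss distribution rather than against the mean.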

  18. Development and application of a multimedia assessment tool

    OpenAIRE

    Nixon, Daniel E.

    1997-01-01

    In the Naval Aviation community, interactive, multimedia computer-based training is being explored as a cost-effective alternative to traditional modes of training. This thesis develops an assessment tool for multimedia systems to be used in computer-based training by combining performance recommendations for multimedia hardware and software. It delivers a checklist for multimedia developers to assess the capability of proposed multimedia systems.

  19. Computer-based tools to support curriculum developers

    NARCIS (Netherlands)

    Nieveen, Nienke; Gustafson, Kent

    2000-01-01

    Since the start of the early 90’s, an increasing number of people are interested in supporting the complex tasks of the curriculum development process with computer-based tools. ‘Curriculum development’ refers to an intentional process or activity directed at (re) designing, developing and implement

  20. Development of parallel/serial program analyzing tool

    International Nuclear Information System (INIS)

    Japan Atomic Energy Research Institute has been developing 'KMtool', a parallel/serial program analyzing tool, in order to promote the parallelization of science and engineering computation programs. KMtool analyzes the performance of programs written in FORTRAN77 and MPI, and it reduces the effort required for parallelization. This paper describes the development purpose, design, utilization and evaluation of KMtool. (author)

  1. Analytical developments for definition and prediction of USB noise

    Science.gov (United States)

    Reddy, N. N.; Tam, C. K. W.

    1976-01-01

    A systematic acoustic data base and associated flow data are used in identifying the noise generating mechanisms of upper surface blown flap configurations of short takeoff and landing aircraft. A theory is developed for the radiated sound field of the highly sheared flow of the trailing edge wake. An empirical method is also developed, using extensive experimental data and physical reasoning, to predict the noise levels.

  2. A Decision-Analytic Feasibility Study of Upgrading Machinery at a Tools Workshop

    Directory of Open Access Journals (Sweden)

    M. L. Chew Hernandez

    2012-04-01

    Full Text Available This paper presents the evaluation, from a Decision Analysis point of view, of the feasibility of upgrading machinery at an existing metal-forming workshop. The Integral Decision Analysis (IDA methodology is applied to clarify the decision and develop a decision model. One of the key advantages of the IDA is its careful selection of the problem frame, allowing a correct problem definition. While following most of the original IDA methodology, an addition to this methodology is proposed in this work, that of using the strategic Means-Ends Objective Network as a backbone for the development of the decision model. The constructed decision model uses influence diagrams to include factual operator and vendor expertise, simulation to evaluate the alternatives and a utility function to take into account the risk attitude of the decision maker. Three alternatives are considered: Base (no modification, CNC (installing an automatic lathe and CF (installation of an automatic milling machine. The results are presented as a graph showing zones in which a particular alternative should be selected. The results show the potential of IDA to tackle technical decisions that are otherwise approached without the due care.
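
    The role of the utility function in such a model can be sketched with an exponential (constant risk aversion) utility applied to Monte Carlo profit samples. The alternative names match the paper, but the profit distributions and risk tolerance below are invented for illustration:

```python
import math
import random

def expected_utility(outcomes, risk_tolerance):
    """Exponential utility U(x) = 1 - exp(-x / rho) averaged over
    Monte Carlo outcome samples; rho encodes the decision maker's
    risk attitude (illustrative, not the paper's calibrated model)."""
    return sum(1.0 - math.exp(-x / risk_tolerance) for x in outcomes) / len(outcomes)

def best_alternative(samples_by_alt, risk_tolerance):
    """Pick the alternative with the highest expected utility.
    samples_by_alt: dict mapping alternative name -> sampled net profits."""
    return max(samples_by_alt,
               key=lambda k: expected_utility(samples_by_alt[k], risk_tolerance))

# Hypothetical profit samples for the three alternatives in the paper
rng = random.Random(1)
alts = {
    "Base": [rng.gauss(100.0, 10.0) for _ in range(5000)],
    "CNC":  [rng.gauss(130.0, 60.0) for _ in range(5000)],
    "CF":   [rng.gauss(120.0, 20.0) for _ in range(5000)],
}
print(best_alternative(alts, risk_tolerance=50.0))
```

    Varying the risk tolerance and the simulated profit distributions is what produces the paper's zones in which a particular alternative should be selected.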

  3. Analytical Tools to Predict Distribution Outage Restoration Load. Final Project Report.

    Energy Technology Data Exchange (ETDEWEB)

    Law, John

    1994-11-14

    The main activity of this project has been twofold: (1) development of a computer model to predict CLPU (cold load pickup) and (2) development of a field measurement and analysis method to obtain the input parameters of the CLPU model. The field measurement and analysis method is called the Step-Voltage-Test (STEPV). The Kootenai Electric Cooperative Appleway 51 feeder in Coeur d'Alene was selected for analysis in this project, and STEPV tests were performed in the winters of 1992 and 1993. The STEPV data was analyzed (method and results presented within this report) to obtain the Appleway 51 feeder parameters for prediction by the CLPU model. Only one CLPU record was obtained, in winter 1994. Unfortunately, the actual CLPU was not dramatic (short outage and moderate temperature) and did not display cyclic restoration current. A predicted Appleway 51 feeder CLPU was generated using the parameters obtained via the STEPV measurement/analysis/algorithm method at the same ambient temperature and outage duration as the measured actual CLPU. The predicted CLPU corresponds reasonably well with the single actual CLPU record obtained in winter 1994 on the Appleway 51 feeder.
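
    The report's model equations are not reproduced here; a common textbook form of a CLPU curve is a delayed exponential in which post-outage demand holds at the undiversified level for a while and then decays toward the diversified steady state:

```python
import math

def clpu_current(t, s_undiversified, s_diversified, t_decay, tau):
    """Delayed-exponential cold-load-pickup model (a common textbook
    form, not necessarily the project's model): demand holds at the
    undiversified level until t_decay, then decays exponentially with
    time constant tau toward the diversified (steady-state) level."""
    if t < t_decay:
        return s_undiversified
    return s_diversified + (s_undiversified - s_diversified) * \
        math.exp(-(t - t_decay) / tau)
```

    The STEPV measurements would supply the parameters of whatever curve is used: the undiversified and diversified demand levels and the decay behaviour as functions of outage duration and ambient temperature.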

  4. Analytical tools for the study of cellular glycosylation in the immune system

    Directory of Open Access Journals (Sweden)

    Yvette eVan Kooyk

    2013-12-01

    Full Text Available It is becoming increasingly clear that glycosylation plays an important role in intercellular communication within the immune system. Glycosylation-dependent interactions are crucial for the innate and adaptive immune system and regulate immune cell trafficking, synapse formation, activation, and survival. These functions take place through the cis or trans interaction of lectins with glycans. Classical immunological and biochemical methods have been used for the study of lectin function; however, the investigation of their counterparts, glycans, requires very specialized methodologies that have been extensively developed in the past decade within the glycobiology scientific community. This Mini-Review summarizes the available technology for the study of glycan biosynthesis, regulation, and characterization, as applied to the study of glycans in immunology.

  5. Twenty-one years of microemulsion electrokinetic chromatography (1991-2012): a powerful analytical tool.

    Science.gov (United States)

    Yang, Hua; Ding, Yao; Cao, Jun; Li, Ping

    2013-05-01

    Microemulsion electrokinetic chromatography (MEEKC) is a CE separation technique which utilizes buffered microemulsions as the separation media. In the past two decades, MEEKC has blossomed into a powerful separation technique for the analysis of a wide range of compounds. Pseudostationary phase composition is critical to successful resolution in EKC, and several variables can be optimized, including surfactant/co-surfactant/oil type and concentration, buffer content, and pH value. Additionally, MEEKC coupled with online sample preconcentration approaches can significantly improve detection sensitivity. This review comprehensively describes the development of MEEKC over the period 1991 to 2012. Areas covered include basic theory, microemulsion composition, methods for improving resolution and enhancing sensitivity, detection techniques, and applications of MEEKC. PMID:23463608

  6. Advanced Analytical/Physics Tools to Characterize Tire Materials and Behavior

    Science.gov (United States)

    Gerspacher, Michel

    2001-10-01

    Tires are assembled from common materials such as polymers, fillers, reinforcing fibers and various chemicals which are used to cure the rubber compound and to protect the finished tire from oxidative degradation. This is certainly more related to chemistry than to physics. Nevertheless, a finished tire on the road becomes a fascinating object of physics if one wants to understand its behavior. Indeed, it is its viscoelastic nature that confers on the tire its unique capabilities. The lecture will be centered on the use of physical methods, not only to study the viscoelasticity of the composite, but also the nature of the interactions between the materials composing the tire. It will be shown that the use of physics has tremendously helped in better understanding the tire and has also contributed to developing new generations of tires.

  7. Analytical tools for calculating the maximum heat transfer of annular stepped fins with internal heat generation and radiation effects

    International Nuclear Information System (INIS)

    ASFs (Annular stepped fins) require less material than ADFs (annular disc fins) while retaining the ability to produce the same cooling rate in a convection environment. A simple analysis was developed for ASFs that considered radiative heat transfer and heat generated by a nuclear reactor through linearization of the radiation terms. The linearized equations were solved by exact and approximate analytical methods. Without any linearization, a new closed-form analysis was established for the temperature profile with the help of the differential transform method. An integral differential transform method was introduced to determine the actual heat-transfer rate when heat was generated inside an ASF under nonlinear radiation surface conditions. The temperature results obtained using this analytical approach were compared with those obtained from a finite-difference analysis, and were in excellent agreement. The fin performance was defined as a function of the heat generated for a given set of design conditions. An optimization study with varying heat generation was carried out to compare the performance of ADFs and ASFs, which highlighted the superior aspects of an annular fin design. - Highlights: • A new analytical model for ASF under heat generation and radiation heat transfer. • Exact and approximate analysis for linearization of governing equation. • A high accuracy obtained from approximate analysis. • Fin heat transfer for nonlinear surface conditions and heat generation. • Calculation of maximum heat transfer as a function of heat generation parameter
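
    A sketch of the kind of finite-difference solution the analytical results were compared against, for a linearized annular-fin equation of the form θ'' + θ'/r − m²θ + G = 0 (θ = excess temperature) with a fixed base temperature and an insulated tip. The geometry and parameters are illustrative, not the paper's:

```python
def annular_fin_profile(r_in, r_out, m2, gen, theta_base, n=200):
    """Finite-difference solution of the linearized annular-fin
    equation  theta'' + theta'/r - m2*theta + gen = 0, with a fixed
    base temperature at r_in and an insulated tip at r_out.
    Tridiagonal (Thomas-algorithm) solve; illustrative only."""
    h = (r_out - r_in) / n
    # unknowns theta[1..n]; theta[0] = theta_base is known
    a = [0.0] * (n + 1); b = [0.0] * (n + 1)
    c = [0.0] * (n + 1); d = [0.0] * (n + 1)
    for i in range(1, n):
        r = r_in + i * h
        a[i] = 1.0 / h**2 - 1.0 / (2.0 * r * h)   # theta[i-1] coefficient
        b[i] = -2.0 / h**2 - m2                    # theta[i] coefficient
        c[i] = 1.0 / h**2 + 1.0 / (2.0 * r * h)   # theta[i+1] coefficient
        d[i] = -gen
    d[1] -= a[1] * theta_base                      # fold known base value in
    a[1] = 0.0
    b[n] = 1.0; a[n] = -1.0; c[n] = 0.0; d[n] = 0.0  # insulated tip (1st order)
    # Thomas algorithm: forward elimination, then back substitution
    for i in range(2, n + 1):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    theta = [0.0] * (n + 1)
    theta[n] = d[n] / b[n]
    for i in range(n - 1, 0, -1):
        theta[i] = (d[i] - c[i] * theta[i + 1]) / b[i]
    theta[0] = theta_base
    return theta
```

    The paper's integral differential transform method plays the same role as the sum over this profile: once θ(r) is known, the actual fin heat-transfer rate follows.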

  8. Integration of Environmental Analytical Chemistry with Environmental Law: The Development of a Problem-Based Laboratory.

    Science.gov (United States)

    Cancilla, Devon A.

    2001-01-01

    Introduces an undergraduate-level, problem-based analytical chemistry laboratory course integrated with an environmental law course. Aims to develop among students an understanding of the use of environmental indicators for environmental evaluation. (Contains 30 references.) (YDS)

  9. Implementing WAI Authoring Tool Accessibility Guidelines in Developing Adaptive Elearning

    Directory of Open Access Journals (Sweden)

    Mahieddine Djoudi

    2012-09-01

    Full Text Available Adaptive learning technology allows for the development of more personalized online learning experiences, with materials that adapt to student performance and skill level. The term “adaptive” is also used to describe Assistive Technologies that make online courses usable by learners with disabilities and special needs. Authoring tools can enable, encourage, and assist authors in the creation of elearning content. Because most of the content of Web-based adaptive learning is created using authoring tools, these tools should be accessible to authors regardless of disability, and they should support and encourage authors in creating accessible elearning content. This paper presents an authoring tool designed for developing accessible adaptive elearning. The authoring tool, dedicated to Algerian universities, is designed to satisfy the W3C/WAI Authoring Tool Accessibility Guidelines (ATAG) and to provide collaboration functionalities for teachers when building elearning courses. After presenting the W3C/WAI accessibility guidelines, the collaborative authoring tool is outlined.

  10. Filmes de metal-hexacianoferrato: uma ferramenta em química analítica / Metal-hexacyanoferrate films: a tool in analytical chemistry

    Directory of Open Access Journals (Sweden)

    Ivanildo Luiz de Mattos

    2001-04-01

    Full Text Available Chemically modified electrodes based on hexacyanometalate films are presented as a tool in analytical chemistry. The use of amperometric sensors and/or biosensors based on metal-hexacyanoferrate films is a growing trend. This article reviews some applications of these films for the analytical determination of both inorganic (e.g. As3+, S2O3(2-)) and organic (e.g. cysteine, hydrazine, ascorbic acid, glutathione, glucose, etc.) compounds.

  11. Development and Validation of HPLC Methods for Analytical and Preparative Purposes

    OpenAIRE

    Lindholm, Johan

    2004-01-01

    This thesis concerns the development and validation of high performance liquid chromatography (HPLC) methods aimed for two industrially important areas: (i) analysis of biotechnological synthesis and (ii) determination of adsorption isotherm parameters. There is today a lack of detailed recommendations for analytical procedures in the field of biotechnological production of drugs. Therefore, guidelines were given for analytical development and validation in this field; the production of 9α-hy...

  12. On the Development of Parameterized Linear Analytical Longitudinal Airship Models

    Science.gov (United States)

    Kulczycki, Eric A.; Johnson, Joseph R.; Bayard, David S.; Elfes, Alberto; Quadrelli, Marco B.

    2008-01-01

    In order to explore Titan, a moon of Saturn, airships must be able to traverse the atmosphere autonomously. To achieve this, an accurate model and accurate control of the vehicle must be developed so that it is understood how the airship will react to specific sets of control inputs. This paper explains how longitudinal aircraft stability derivatives can be used with airship parameters to create a linear model of the airship solely by combining geometric and aerodynamic airship data. This method does not require system identification of the vehicle. All of the required data can be derived from computational fluid dynamics and wind tunnel testing. This alternate method of developing dynamic airship models will reduce time and cost. Results are compared to other stable airship dynamic models to validate the methods. Future work will address a lateral airship model using the same methods.
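
    The assembly of a linear longitudinal model from stability derivatives can be sketched in the classical four-state aircraft form; airship-specific terms such as added-mass corrections and buoyancy effects, which the paper accounts for, are omitted in this simplified illustration:

```python
def longitudinal_A(Xu, Xw, Zu, Zw, Mu, Mw, Mq, U0, g=9.81):
    """Assemble the classical four-state longitudinal system matrix
    for x = [u, w, q, theta] from dimensional stability derivatives,
    linearized about level flight at speed U0.  (Standard flight-
    dynamics form; the paper's airship-specific terms, e.g. added
    mass, are not included in this sketch.)"""
    return [
        [Xu,  Xw,  0.0, -g ],   # u-dot:     axial-force derivatives, gravity
        [Zu,  Zw,  U0,  0.0],   # w-dot:     normal-force derivatives
        [Mu,  Mw,  Mq,  0.0],   # q-dot:     pitching-moment derivatives
        [0.0, 0.0, 1.0, 0.0],   # theta-dot: kinematic relation
    ]
```

    With the derivatives obtained from CFD and wind-tunnel data, as the paper proposes, no system identification of the flight vehicle is needed to populate this matrix.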

  13. The simple analytics of oligopoly banking in developing economies

    OpenAIRE

    Khemraj, Tarron

    2010-01-01

    Previous studies have documented the tendency for the commercial banking sector of many developing economies to be highly liquid and be characterised by a persistently high interest rate spread. This paper embeds these stylised facts in an oligopoly model of the banking firm. The paper derives both the loan and deposit rates as a mark up rate over a relatively safe foreign interest rate. Then, using a diagrammatic framework, the paper provides an analysis of: (i) the distribution of financ...
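
    One generic way to express a loan rate as a mark-up over a safe foreign rate is the Cournot form r_L = r_f / (1 − 1/(n·ε)); this is an illustrative textbook mark-up, not necessarily the paper's derivation:

```python
def cournot_loan_rate(r_foreign, n_banks, elasticity):
    """Generic Cournot mark-up over the safe foreign rate:
    r_L = r_f / (1 - 1/(n * eps)).  Fewer banks or less elastic loan
    demand imply a larger spread.  (Illustrative textbook form, not
    the paper's model.)"""
    markup_term = 1.0 - 1.0 / (n_banks * elasticity)
    if markup_term <= 0.0:
        raise ValueError("demand too inelastic for an interior solution")
    return r_foreign / markup_term
```

    The qualitative prediction matches the stylised facts the paper starts from: concentrated banking sectors sustain persistently high spreads over the safe rate.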

  14. AN ANALYTICAL STUDY ON HUMAN RESOURCE DEVELOPMENT AND EDUCATION

    OpenAIRE

    Rajkumar Rathod

    2014-01-01

    Education is a continuous and never-ending process. The concept of “Human Resource Development” is of high value in business and industry and has been used and applied for years. In industry and business the human element is considered as a resource, and hence its development and protection is essential and inevitable. Of all the factors of production, human resources are the only factor having a rational faculty and therefore must be handled with utmost care. Right recruitm...

  15. Using competences and competence tools in workforce development.

    Science.gov (United States)

    Green, Tess; Dickerson, Claire; Blass, Eddie

    The NHS Knowledge and Skills Framework (KSF) has been a driving force in the move to competence-based workforce development in the NHS. Skills for Health has developed national workforce competences that aim to improve behavioural performance, and in turn increase productivity. This article describes five projects established to test Skills for Health national workforce competences, electronic tools and products in different settings in the NHS. Competences and competence tools were used to redesign services, develop job roles, identify skills gaps and develop learning programmes. Reported benefits of the projects included increased clarity and a structured, consistent and standardized approach to workforce development. Findings from the evaluation of the tools were positive in terms of their overall usefulness and provision of related training/support. Reported constraints of using the competences and tools included issues relating to their availability, content and organization. It is recognized that a highly skilled and flexible workforce is important to the delivery of high-quality health care. These projects suggest that Skills for Health competences can be used as a 'common currency' in workforce development in the UK health sector. This would support the need to adapt rapidly to changing service needs. PMID:21072016

  16. Human Rights as a Tool for Sustainable Development

    OpenAIRE

    Manuel Couret Branco; Pedro Damião Henriques

    2009-01-01

    In poor as well as in rich countries there is a fear that environmentally sustainable development might be contradictory to development in general and equitable development in particular. There could indeed be a contradiction between environmental and social sustainability, too much care for the environment eventually leading to forgetting about the people. The purpose of this paper is to explore institutional principles and tools that allow the conciliation between environmental and social s...

  17. Phonological assessment and analysis tools for Tagalog: Preliminary development.

    Science.gov (United States)

    Chen, Rachelle Kay; Bernhardt, B May; Stemberger, Joseph P

    2016-01-01

    Information and assessment tools concerning Tagalog phonological development are minimally available. The current study thus sets out to develop elicitation and analysis tools for Tagalog. A picture elicitation task was designed with a warm-up, screener and two extension lists, one with more complex and one with simpler words. A nonlinear phonological analysis form was adapted from English (Bernhardt & Stemberger, 2000) to capture key characteristics of Tagalog. The tools were piloted on a primarily Tagalog-speaking 4-year-old boy living in a Canadian-English-speaking environment. The data provided initial guidance for revision of the elicitation tool (available at phonodevelopment.sites.olt.ubc.ca). The analysis provides preliminary observations about possible expectations for primarily Tagalog-speaking 4-year-olds in English-speaking environments: Lack of mastery for tap/trill 'r', and minor mismatches for vowels, /l/, /h/ and word stress. Further research is required in order to develop the tool into a norm-referenced instrument for Tagalog in both monolingual and multilingual environments. PMID:27096390

  18. Work and Learner Identity -Developing an analytical framework

    DEFF Research Database (Denmark)

    Kondrup, Sissel

    The paper addresses the need to develop a theoretical framework able to grasp how engagement in work forms certain conditions for workers to meet the obligation to form a pro-active learner identity, position themselves as educable subjects and engage in lifelong learning. An obligation that has......, their life situation and how they formulate their life strategies, e.g. how they orientate toward different learning activities and form certain learner identities. The paper outlines how the relation between work and identity can be conceptualised and provides a theoretical framework enabling researchers...

  19. The Development of a Tool for Sustainable Building Design:

    DEFF Research Database (Denmark)

    Tine Ring Hansen, Hanne; Knudstrup, Mary-Ann

    2009-01-01

    The understanding of sustainable building has changed over time along with the architectural interpretation of sustainability. The paper presents the results of a comparative analysis of the indicators found in different internationally acclaimed and Danish certification schemes and standards for sustainable buildings, as well as an analysis of the relationship between the different approaches (e.g. low-energy, environmental, green building, solar architecture, bio-climatic architecture etc.) to sustainable building design and these indicators. The paper discusses how sustainable architecture will gain more focus in the coming years, thus establishing the need for the development of a new tool and methodology. The paper furthermore describes the background and considerations involved in the development of a design support tool for sustainable building design. A tool which considers...

  20. Development of culturally sensitive dialog tools in diabetes education

    Directory of Open Access Journals (Sweden)

    Nana Folmann Hempler

    2015-01-01

    Person-centeredness is a goal in diabetes education, and cultural influences are important to consider in this regard. This report describes the use of a design-based research approach to develop culturally sensitive dialog tools to support person-centered dietary education targeting Pakistani immigrants in Denmark with type 2 diabetes. The approach appears to be a promising method to develop dialog tools for patient education that are culturally sensitive, thereby increasing their acceptability among ethnic minority groups. The process also emphasizes the importance of adequate training and competencies in the application of dialog tools and of alignment between researchers and health care professionals with regards to the educational philosophy underlying their use.

  1. Development of culturally sensitive dialog tools in diabetes education.

    Science.gov (United States)

    Hempler, Nana Folmann; Ewers, Bettina

    2015-01-01

    Person-centeredness is a goal in diabetes education, and cultural influences are important to consider in this regard. This report describes the use of a design-based research approach to develop culturally sensitive dialog tools to support person-centered dietary education targeting Pakistani immigrants in Denmark with type 2 diabetes. The approach appears to be a promising method to develop dialog tools for patient education that are culturally sensitive, thereby increasing their acceptability among ethnic minority groups. The process also emphasizes the importance of adequate training and competencies in the application of dialog tools and of alignment between researchers and health care professionals with regards to the educational philosophy underlying their use. PMID:25593850

  2. Development of a green remediation tool in Japan.

    Science.gov (United States)

    Yasutaka, Tetsuo; Zhang, Hong; Murayama, Koki; Hama, Yoshihito; Tsukada, Yasuhisa; Furukawa, Yasuhide

    2016-09-01

    The green remediation assessment tool for Japan (GRATJ) presented in this study is a spreadsheet-based software package developed to facilitate comparisons of the environmental impacts associated with various countermeasures against contaminated soil in Japan. This tool uses a life-cycle assessment-based model to calculate inventory inputs/outputs throughout the activity life cycle during remediation. Processes of 14 remediation methods for heavy metal contamination and 12 for volatile organic compound contamination are built into the tool. This tool can evaluate 130 inventory inputs/outputs and easily integrate those inputs/outputs into 9 impact categories, 4 integrated endpoints, and 1 index. Comparative studies can be performed by entering basic data associated with a target site. The integrated results can be presented in a simpler and clearer manner than the results of an inventory analysis. As a case study, an arsenic-contaminated soil remediation site was examined using this tool. Results showed that the integrated environmental impacts were greater with onsite remediation methods than with offsite ones. Furthermore, the contributions of CO2 to global warming, SO2 to urban air pollution, and crude oil to resource consumption were greater than other inventory inputs/outputs. The GRATJ has the potential to improve green remediation and can serve as a valuable tool for decision makers and practitioners in selecting countermeasures in Japan. PMID:26803220
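    The layered roll-up described above (inventory inputs/outputs aggregated into impact categories, then into a single index) can be sketched as two weighted sums. All characterization factors, category weights and inventory amounts below are invented placeholders for illustration, not values from the GRATJ.

    ```python
    # Hypothetical life-cycle inventory for a remediation option.
    inventory = {"CO2_kg": 1200.0, "SO2_kg": 3.5, "crude_oil_kg": 40.0}

    # Hypothetical characterization: inventory item -> (impact category, factor).
    characterization = {
        "CO2_kg": ("global_warming", 1.0),        # kg CO2-eq per kg
        "SO2_kg": ("urban_air_pollution", 1.0),
        "crude_oil_kg": ("resource_consumption", 1.0),
    }

    # Hypothetical weighting of categories into one index.
    weights = {"global_warming": 0.5, "urban_air_pollution": 0.3,
               "resource_consumption": 0.2}

    # Step 1: characterize inventory items into impact categories.
    impacts = {}
    for item, amount in inventory.items():
        category, factor = characterization[item]
        impacts[category] = impacts.get(category, 0.0) + amount * factor

    # Step 2: weight categories into a single comparable index.
    index = sum(weights[c] * v for c, v in impacts.items())
    print(round(index, 2))  # 609.05
    ```

    Running this for each candidate countermeasure and comparing the indices mirrors how such a tool lets decision makers compare onsite and offsite options on one scale.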

  3. THz spectroscopy: An emerging technology for pharmaceutical development and pharmaceutical Process Analytical Technology (PAT) applications

    Science.gov (United States)

    Wu, Huiquan; Khan, Mansoor

    2012-08-01

    As an emerging technology, THz spectroscopy has gained increasing attention in the pharmaceutical area during the last decade. This attention is due to the fact that (1) it provides a promising alternative approach for in-depth understanding of both intermolecular interactions among pharmaceutical molecules and pharmaceutical product quality attributes; (2) it provides a promising alternative approach for enhanced process understanding of certain pharmaceutical manufacturing processes; and (3) it aligns with the FDA pharmaceutical quality initiatives, most notably the Process Analytical Technology (PAT) initiative. In this work, the current status and progress made so far on using THz spectroscopy for pharmaceutical development and pharmaceutical PAT applications are reviewed. In the spirit of demonstrating the utility of a first-principles modeling approach for addressing the model validation challenge and reducing unnecessary model validation "burden" to facilitate THz pharmaceutical PAT applications, two scientific case studies based on published THz spectroscopy measurement results are created and discussed. Furthermore, other technical challenges and opportunities associated with adopting THz spectroscopy as a pharmaceutical PAT tool are highlighted.

  4. Quality management in development of hard coatings on cutting tools

    Directory of Open Access Journals (Sweden)

    M. Soković

    2007-09-01

    Purpose: In this paper, an attempt is made to apply a general model of quality management to the field of development and introduction of hard coatings on cutting tools. Design/methodology/approach: The conventional PVD and CVD methods have their limitations, and innovative processes are essential within the framework of an environmentally oriented quality management system. Meeting the requirements of the ISO 9000 and ISO 14000 standards, the proposed model ensures the fulfilment of the basic requirements leading to the required quality of preparation processes and the quality of end products (hard coatings). Findings: One of the main pre-requisites for successful industrial production is the use of quality coated cutting tools with defined mechanical and technological properties. Therefore, for the development and introduction of a new coated cutting tool (a new combination of cutting material and hard coating), it is necessary to carry out a number of studies with the purpose of optimizing the coating composition and processing procedures, and also to test new tools in working conditions. Research limitations/implications: The requirements from industry (produce faster, better, more safely and more ecologically) force us to develop new effective tools and innovative technologies. This provides a technological challenge to scientists and engineers and increases the importance of knowing several scientific disciplines. Practical implications: The quality of a company’s product directly affects its competitive position, profitability and credibility in the market. A quality management system must undergo a process of continuous improvement, which extends from the deployment of preventive quality assurance methods to the application of closed-loop quality circuits. Originality/value: Design of an original and structured model of a quality management system for the successful development, production and introduction of new coated tools in practice.

  5. ProteoLens: a visual analytic tool for multi-scale database-driven biological network data mining

    OpenAIRE

    Sivachenko Andrey Y; Huan Tianxiao; Harrison Scott H; Chen Jake Y

    2008-01-01

    Abstract Background New systems biology studies require researchers to understand how interplay among myriads of biomolecular entities is orchestrated in order to achieve high-level cellular and physiological functions. Many software tools have been developed in the past decade to help researchers visually navigate large networks of biomolecular interactions with built-in template-based query capabilities. To further advance researchers' ability to interrogate global physiological states of c...

  6. Assessment Tool Development for Extracurricular Smet Programs for Girls

    Science.gov (United States)

    House, Jody; Johnson, Molly; Borthwick, Geoffrey

    Many different programs have been designed to increase girls' interest in and exposure to science, mathematics, engineering, and technology (SMET). Two of these programs are discussed and contrasted in the dimensions of length, level of science content, pedagogical approach, degree of self- vs. parent-selected participants, and amount of community-building content. Two different evaluation tools were used. For one program, a modified version of the University of Pittsburgh's undergraduate engineering attitude assessment survey was used. Program participants' responses were compared to those from a fifth-grade, mixed-sex science class. The only gender difference found was in the area of parental encouragement: the girls in the special class were more encouraged to participate in SMET areas. For the second program, a new age-appropriate tool developed specifically for these types of programs was used, and the tool itself was evaluated. The results indicate that the new tool has construct validity. On the basis of these preliminary results, a long-term plan for the continued development of the assessment tool is outlined.

  7. Developing shape analysis tools to assist complex spatial decision making

    Energy Technology Data Exchange (ETDEWEB)

    Mackey, H.E. [Westinghouse Savannah River Company, AIKEN, SC (United States); Ehler, G.B.; Cowen, D. [South Carolina Univ., Columbia, SC (United States)

    1996-05-31

    The objective of this research was to develop and implement a shape identification measure within a geographic information system, specifically one that incorporates analytical modeling for site location planning. The application that was developed incorporated a location model within a raster-based GIS, which helped address critical performance issues for the decision support system. Binary matrices, which approximate the object's geometrical form, are passed over the gridded data structure and allow identification of irregularly and regularly shaped objects. Lastly, the issue of shape rotation is addressed and resolved by constructing unique matrices corresponding to the object's orientation.

  8. Developing shape analysis tools to assist complex spatial decision making

    International Nuclear Information System (INIS)

    The objective of this research was to develop and implement a shape identification measure within a geographic information system, specifically one that incorporates analytical modeling for site location planning. The application that was developed incorporated a location model within a raster-based GIS, which helped address critical performance issues for the decision support system. Binary matrices, which approximate the object's geometrical form, are passed over the gridded data structure and allow identification of irregularly and regularly shaped objects. Lastly, the issue of shape rotation is addressed and resolved by constructing unique matrices corresponding to the object's orientation.
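    The binary-matrix idea in the two records above can be sketched as template matching over a raster grid, with rotation handled by matching each distinct 90-degree orientation of the template. This is a minimal illustration of the technique, not the authors' implementation; the grid, template and function names are all assumptions.

    ```python
    import numpy as np

    def match_template(grid, template):
        """Slide a binary template over the grid; return exact-match positions."""
        gh, gw = grid.shape
        th, tw = template.shape
        hits = []
        for r in range(gh - th + 1):
            for c in range(gw - tw + 1):
                if np.array_equal(grid[r:r + th, c:c + tw], template):
                    hits.append((r, c))
        return hits

    def match_all_orientations(grid, template):
        """Match the template in each of its distinct 90-degree rotations."""
        seen, hits = set(), []
        for k in range(4):
            rot = np.rot90(template, k)
            key = rot.tobytes() + bytes(rot.shape)  # shape disambiguates symmetric cases
            if key in seen:
                continue  # skip orientations identical to one already tried
            seen.add(key)
            hits.extend(match_template(grid, rot))
        return hits

    grid = np.zeros((5, 5), dtype=int)
    grid[1:3, 1:4] = 1                       # a 2x3 rectangular "object"
    template = np.ones((2, 3), dtype=int)
    print(match_all_orientations(grid, template))  # [(1, 1)]
    ```

    Precomputing one matrix per unique orientation, as the records describe, avoids re-rotating the template at every grid cell.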

  9. Enterprise Portal Development Tools: Problem-Oriented Approach

    OpenAIRE

    Zykov, Sergey V.

    2006-01-01

    The paper deals with problem-oriented visual information system (IS) engineering for enterprise Internet-based applications, which is a vital part of the whole development process. The suggested approach is based on semantic network theory and a novel ConceptModeller CASE tool.

  10. iMarine - Applications and tools development plan

    OpenAIRE

    Ellenbroek, Anton; Candela, Leonardo

    2012-01-01

    This report documents the strategy and plan leading to the development of specific applications and tools that in tandem with the rest of gCube technology will be used to realize the Virtual Research Environments that are expected to serve the needs of the Ecosystem Approach Community of Practice.

  11. Engineer develops DynaPro, a production planning tool

    OpenAIRE

    Virginia Tech News

    2005-01-01

    Manufacturers have long been plagued with planning problems related to production and inventory decisions, labor requirements and capacity adjustments. DynaPro, a new software tool developed by Subhash Sarin, an engineering faculty member at Virginia Tech's Center for High Performance Manufacturing (CHPM), could help manufacturers make those types of decisions.

  12. ENVIRONMENTAL ACCOUNTING: A MANAGEMENT TOOL FOR SUSTAINABLE DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Nicolae Virag

    2014-12-01

    The paper aims to analyze the ways in which accounting as a social science and management information tool can contribute to sustainable development. The paper highlights the emergence of the environmental accounting concept, the applicability of the environmental accounting, types of environmental accounting, scope and benefits of environmental accounting.

  13. Developing an Intranet: Tool Selection and Management Issues.

    Science.gov (United States)

    Chou, David C.

    1998-01-01

    Moving corporate systems onto an intranet increases data traffic within the corporate network, which necessitates high-quality management of the intranet. Discusses costs and benefits of adopting an intranet, tool availability and selection criteria, and management issues in developing an intranet. (Author/AEF)

  14. DEVELOPING A TOOL FOR ENVIRONMENTALLY PREFERABLE PURCHASING: JOURNAL ARTICLE

    Science.gov (United States)

    NRMRL-CIN-1246 Curran*, M.A. Developing a Tool for Environmentally Preferable Purchasing. Environmental Management and Health (Filho, W.L. (Ed.), MCB University Press) 12 (3):244-253 (2001). EPA/600/J-02/238, http://www.emerald-library.com/ft. 12/04/2000 LCA-based guidance wa...

  15. ENVIRONMENTAL ACCOUNTING: A MANAGEMENT TOOL FOR SUSTAINABLE DEVELOPMENT

    OpenAIRE

    Nicolae Virag; Dorel Mates; Doru Ioan Ardelean; Claudiu Gheorghe Feies

    2014-01-01

    The paper aims to analyze the ways in which accounting as a social science and management information tool can contribute to sustainable development. The paper highlights the emergence of the environmental accounting concept, the applicability of the environmental accounting, types of environmental accounting, scope and benefits of environmental accounting.

  16. Developing a Decision Support System: The Software and Hardware Tools.

    Science.gov (United States)

    Clark, Phillip M.

    1989-01-01

    Describes some of the available software and hardware tools that can be used to develop a decision support system implemented on microcomputers. Activities that should be supported by software are discussed, including data entry, data coding, finding and combining data, and data compatibility. Hardware considerations include speed, storage…

  17. HyperDev: Hypertext tool to support object-oriented software development

    International Nuclear Information System (INIS)

    The authors propose a software tool, based on hypertext techniques, to support the object-oriented development of scientific applications. Within HyperDev, all kinds of software information such as plain text, formatted text, graphics and code are connected through links allowing for different views of the same object and, consequently, achieving a better understanding of the software components

  18. Method and Tools for Development of Advanced Instructional Systems

    NARCIS (Netherlands)

    Arend, J. van der; Riemersma, J.B.J.

    1994-01-01

    The application of advanced instructional systems (AISs), like computer-based training systems, intelligent tutoring systems and training simulators, is widespread within the Royal Netherlands Army. As a consequence, there is a growing interest in methods and tools to develop effective and efficie

  19. Developing mobile educational apps: development strategies, tools and business models

    Directory of Open Access Journals (Sweden)

    Serena Pastore

    The mobile world is a growing and evolving market in all its aspects, from hardware, networks and operating systems to applications. Mobile applications, or apps, are becoming the new frontier of software development, since today's digital users use mobile devi ...

  20. Review of the Development of Learning Analytics Applied in College-Level Institutes

    Directory of Open Access Journals (Sweden)

    Ken-Zen Chen

    2014-07-01

    This article focuses on the recent development of Learning Analytics using higher education institutional big data. It addresses the current state of Learning Analytics, creates a shared understanding, and clarifies misconceptions about the field. The article also reviews prominent examples from peer institutions that are conducting analytics, identifies their data and methodological frameworks, and comments on market vendors and not-for-profit initiatives. Finally, it suggests an implementation agenda for potential institutions and their stakeholders by drafting necessary preparations and creating iterative implementation flows.

  1. Recent developments in radio-analytic methods for radiation protection. A resume

    International Nuclear Information System (INIS)

    The review covers recent developments in radio-analytical methods for radiation protection. The validation and verification of radio-analytical methodologies for radionuclide determination in a variety of matrices depends on appropriate sample preparation. Modern techniques are described, ranging from rotary evaporators and microwave systems to automated separation columns. New testing technologies extend the range of detectable radionuclides and reduce detection limits significantly. These techniques are important for incorporation surveillance purposes, but also for emergency management in case of radiation accidents. Modern radio-analytical techniques allow sophisticated quality assurance in the frame of international cooperation and projects. A specific contribution covers the capabilities of alpha spectrometry.

  2. Analytical support of plant specific SAMG development validation of SAMG using MELCOR 1.8.5

    International Nuclear Information System (INIS)

    There are two NPPs in operation in the Czech Republic. Both NPPs have already implemented EOPs, developed in collaboration with WESE. The project on SAMG development has started and follows the previous one for EOPs, also with WESE as the leading organization. Plant-specific SAMGs for the Temelin and Dukovany NPPs are based on the WOG generic SAMGs. Analytical support of plant-specific SAMG development is performed by NRI Rez within the validation process. Basic conditions, and how NRI Rez fulfils them, concern the analysts, the analytical tools and their applications. A more detailed description is devoted to the approach of preparing the MELCOR code application for the evaluation of hydrogen risk, validation of the recent set of hydrogen passive autocatalytic recombiners, and definition of proposals to amend the hydrogen removal system. Such parametric calculations require a very wide set of runs. This is not feasible with the whole-plant model; the only practical way is to decouple the calculation by storing mass and energy sources into the containment. An example of this decoupling for a LOCA scenario is shown. It includes seven sources: heat losses from the primary and secondary circuits, fluid blowdown through the cold leg break, fission products blowdown through the cold leg break, fluid blowdown through a break in the reactor pressure vessel bottom head, fission products through a break in the reactor pressure vessel bottom head, melt ejection from the reactor pressure vessel to the cavity, and gas masses and heat losses from corium in the cavity. The stand-alone containment analysis was tested in two configurations, with and without taking fission products into account. Testing showed very good agreement of all calculations until lower head failure and acceptable agreement after that. Some problematic features also appeared. The stand-alone test with fission products was possible only after changes in the source code

  3. Development of the Operational Events Groups Ranking Tool

    International Nuclear Information System (INIS)

    Both because of complexity and ageing, facilities like nuclear power plants require feedback from operating experience in order to further improve safety and operational performance. That is why significant effort is dedicated to operating experience feedback. This paper describes the specification and development of a software tool for ranking operating events. A robust and consistent way of selecting the most important events for detailed investigation is important because it is not feasible, or even useful, to investigate all of them. Development of the tool is based on comprehensive event characterisation and methodical prioritization. This includes a rich set of event parameters which allows top-level preliminary analysis, different ways of grouping, and even evaluation of uncertainty propagation to the ranking results. One distinct feature of the implemented method is that the user (i.e., an expert) can determine how important a particular ranking parameter is, based on pairwise comparisons. For tool demonstration and usability it is crucial that a sample database is also created. For useful analysis, the whole set of events for 5 years was selected and characterised. Based on the preliminary results, this tool seems valuable for a new preliminary perspective on the data as a whole, and especially for the identification of event groups which should have priority in the more detailed assessment. The results consist of different informative views on the importance of event groups and related sensitivity and uncertainty results. This presents a valuable tool for improving the overall picture of specific operating experience and also for helping identify the most important event groups for further assessment. It is clear that completeness and consistency of the input data characterisation is very important to get a full and valuable importance ranking. 
    Method and tool development described in this paper is part of a continuous effort of
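    The pairwise-comparison weighting mentioned in this record can be illustrated with a common AHP-style geometric-mean approximation. The tool's actual method is not specified in the abstract, so both the approximation and the 3-parameter comparison matrix below are assumptions made for illustration.

    ```python
    import math

    def weights_from_pairwise(matrix):
        """Normalized weights from a reciprocal pairwise-comparison matrix.

        Uses the geometric mean of each row, a standard AHP-style
        approximation of the principal-eigenvector weights.
        """
        n = len(matrix)
        gm = [math.prod(row) ** (1.0 / n) for row in matrix]
        total = sum(gm)
        return [g / total for g in gm]

    # Hypothetical expert judgments: parameter 0 is 3x as important as
    # parameter 1 and 5x as important as parameter 2, etc.
    comparison = [
        [1.0, 3.0, 5.0],
        [1 / 3, 1.0, 2.0],
        [1 / 5, 1 / 2, 1.0],
    ]
    w = weights_from_pairwise(comparison)
    print([round(x, 3) for x in w])
    ```

    Varying the entries of the comparison matrix and re-deriving the weights is one simple way to propagate expert uncertainty into the event-group ranking, as the record describes.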

  4. Developing a Conceptual Design Engineering Toolbox and its Tools

    Directory of Open Access Journals (Sweden)

    R. W. Vroom

    2004-01-01

    In order to develop a successful product, a design engineer needs to pay attention to all relevant aspects of that product. Many tools are available: software, books, websites, and commercial services. To unlock these potentially useful sources of knowledge, we are developing C-DET, a toolbox for conceptual design engineering. The idea of C-DET is that designers are supported by a system that provides them with a knowledge portal on the one hand, and a system to store their current work on the other. The knowledge portal is to help the designer find the most appropriate sites, experts, tools etc. at short notice. Such a toolbox offers opportunities to incorporate extra functionalities to support design engineering work. One of these functionalities could be to help the designer reach a balanced comprehension in his work. Furthermore, C-DET enables researchers in the area of design engineering and design engineers themselves to find each other or their work earlier and more easily. Newly developed design tools that can be used by design engineers but have not yet been developed to a commercial level can be linked to by C-DET. In this way these tools can be evaluated at an early stage by design engineers who would like to use them. This paper describes the first prototypes of C-DET, an example of the development of a design tool that enables designers to forecast the use process, and an example of the future functionalities of C-DET such as balanced comprehension.

  5. The development, qualification and availability of AECL analytical, scientific and design codes

    International Nuclear Information System (INIS)

    Over the past several years, AECL has embarked on a comprehensive program to develop, qualify and support its key safety and licensing codes, and to make executable versions of these codes available to the international nuclear community. To this end, we have instituted a company-wide Software Quality Assurance (SQA) Program for Analytical, Scientific and Design Computer Programs to ensure that the design, development, maintenance, modification, procurement and use of computer codes within AECL is consistent with today's quality assurance standards. In addition, we have established a comprehensive Code Validation Project (CVP) with the goal of qualifying AECL's 'front-line' safety and licensing codes by 2001 December. The outcome of this initiative will be qualified codes, which are properly verified and validated for the expected range of applications, with associated statements of accuracy and uncertainty for each application. The code qualification program, based on the CSA N286.7 standard, is intended to ensure (1) that errors are not introduced into safety analyses because of deficiencies in the software, (2) that an auditable documentation base is assembled that demonstrates to the regulator that the codes are of acceptable quality, and (3) that these codes are formally qualified for their intended applications. Because AECL and the Canadian nuclear utilities (i.e., Ontario Power Generation, Bruce Power, Hydro Quebec and New Brunswick Power) generally use the same safety and licensing codes, the nuclear industry in Canada has agreed to work cooperatively together towards the development, qualification and maintenance of a common set of analysis tools, referred to as the Industry Standard Toolset (IST). 
This paper provides an overview of the AECL Software Quality Assurance Program and the Code Validation Project, and their associated linkages to the Canadian nuclear community's Industry Standard Toolset initiative to cooperatively qualify and support commonly

  6. XSC plasma control: Tool development for the session leader

    Energy Technology Data Exchange (ETDEWEB)

    Ambrosino, G. [Associazione Euratom-ENEA-CREATE, University Napoli Federico II (Italy)]. E-mail: ambrosin@unina.it; Albanese, R. [Associazione Euratom-ENEA-CREATE, University Mediterranea Reggio Calabria (Italy); Ariola, M. [Associazione Euratom-ENEA-CREATE, University Napoli Federico II (Italy); Cenedese, A. [Associazione Euratom-ENEA-Consorzio RFX (Italy); Crisanti, F. [Associazione Euratom-ENEA-Frascati (Italy); Tommasi, G. De [Associazione Euratom-ENEA-CREATE, University Napoli Federico II (Italy); Mattei, M. [Associazione Euratom-ENEA-CREATE, University Mediterranea Reggio Calabria (Italy); Piccolo, F. [EURATOM-UKAEA Fusion Association (United Kingdom); Pironti, A. [Associazione Euratom-ENEA-CREATE, University Napoli Federico II (Italy); Sartori, F. [EURATOM-UKAEA Fusion Association (United Kingdom); Villone, F. [Associazione Euratom-ENEA-CREATE, University Cassino (Italy)

    2005-11-15

    A new model-based shape controller (XSC, i.e., eXtreme Shape Controller) able to operate with high elongation and triangularity plasmas has been designed and implemented at JET in 2003. Using the XSC requires a number of steps which at present are not automated and therefore involve several experts. To help the session leader in preparing an experiment, a number of software tools are needed. The paper describes the software tools currently under development and the new framework for the preparation of a JET experiment.

  7. XSC plasma control: Tool development for the session leader

    International Nuclear Information System (INIS)

    A new model-based shape controller (XSC, i.e., eXtreme Shape Controller) able to operate with high elongation and triangularity plasmas has been designed and implemented at JET in 2003. Using the XSC requires a number of steps which at present are not automated and therefore involve several experts. To help the session leader in preparing an experiment, a number of software tools are needed. The paper describes the software tools currently under development and the new framework for the preparation of a JET experiment.

  8. Tool for test driven development of JavaScript applications

    OpenAIRE

    Stamać, Gregor

    2015-01-01

    The thesis describes the implementation of a tool for testing JavaScript code. The tool is designed to support test-driven development of JavaScript-based applications; it is therefore important to display test results as quickly as possible. The thesis is divided into four parts. The first part describes the JavaScript environment: it contains a brief history of the JavaScript language, its prevalence, strengths and weaknesses. This section also describes the TypeScript programming language, which is a super...

  9. Developing a financial simulation tool as a web application

    OpenAIRE

    Neupane, Suraj

    2015-01-01

    “Kunnan Taitoa Oy”, a Finnish municipal financial services expert, commissioned the upgrade of its financial simulation tool from its current spreadsheet form to a web application. The principles of open source served as the foundation of software development for a team of Haaga-Helia students who participated in the project ‘Taitoa’. The project aimed to deliver a working version of the web application. This thesis documents the process of application development, and the thesis itself is a project-b...

  10. Fiscal Policy as a Tool for Stabilization in Developing Countries

    OpenAIRE

    Kraay, Aart; Serven, Luis

    2013-01-01

    The financial crisis of 2007/2008, the subsequent great recession in rich countries and its propagation to developing countries has sparked a renewed interest in the role of fiscal policy as a potential countercyclical tool among policymakers and researchers. This paper reviews the state of empirical evidence on the effectiveness of discretionary countercyclical fiscal policy by placing a particular emphasis on developing countries. On the whole, successful fiscal interventions of this type h...

  11. Solid-phase development of a 1-hydroxybenzotriazole linker for heterocycle synthesis using analytical constructs.

    Science.gov (United States)

    Scicinski, J J; Congreve, M S; Jamieson, C; Ley, S V; Newman, E S; Vinader, V M; Carr, R A

    2001-01-01

    The development of a 1-hydroxybenzotriazole linker for the synthesis of heterocyclic derivatives is described, utilizing analytical construct methodology to facilitate the analysis of resin samples. A UV-chromophore-containing analytical construct enabled the accurate determination of resin loading and the automated monitoring of key reactions using only small quantities of resin. The syntheses of an array of isoxazole derivatives are reported. PMID:11442396

  12. Development of novel analytical methods to study the metabolism of coumarin

    OpenAIRE

    Deasy, Brian

    1996-01-01

    The research in this thesis revolves around developing analytical methods for the determination of coumarin and 7-hydroxycoumarin for various applications. The techniques used in this work were capillary electrophoresis, immunosensing and electrochemistry. Chapter 1 serves as a general review of the analysis of coumarin and 7-hydroxycoumarin, including the many different types of analytical technique which have been used to analyse this drug. Capillary electrophoresis was used as the basis of a me...

  13. Analytical Validation of Accelerator Mass Spectrometry for Pharmaceutical Development: the Measurement of Carbon-14 Isotope Ratio.

    Energy Technology Data Exchange (ETDEWEB)

    Keck, B D; Ognibene, T; Vogel, J S

    2010-02-05

    Accelerator mass spectrometry (AMS) is an isotope-based measurement technology that utilizes carbon-14 labeled compounds in the pharmaceutical development process to measure compounds at very low concentrations, empowers microdosing as an investigational tool, and extends the utility of {sup 14}C labeled compounds to dramatically lower levels. It is a form of isotope ratio mass spectrometry that can provide either measurements of total compound equivalents or, when coupled to separation technology such as chromatography, quantitation of specific compounds. The properties of AMS as a measurement technique are investigated here, and the parameters of method validation are shown. AMS, independent of any separation technique to which it may be coupled, is shown to be accurate, linear, precise, and robust. As the sensitivity and universality of AMS are constantly being explored and expanded, this work underpins many areas of pharmaceutical development, including drug metabolism as well as absorption, distribution and excretion of pharmaceutical compounds, as a fundamental step in drug development. The validation parameters for pharmaceutical analyses were examined for the accelerator mass spectrometry measurement of the {sup 14}C/C ratio, independent of chemical separation procedures. The isotope ratio measurement was specific (owing to the {sup 14}C label), stable across sample storage conditions for at least one year, and linear over 4 orders of magnitude, with an analytical range from one tenth Modern to at least 2000 Modern (instrument specific). Further, accuracy was excellent, between 1 and 3 percent, while precision, expressed as coefficient of variation, was between 1 and 6%, determined primarily by radiocarbon content and the time spent analyzing a sample. Sensitivity, expressed as LOD and LLOQ, was 1 and 10 attomoles of carbon-14 (which can be expressed as compound equivalents), and for a typical small molecule labeled at 10% incorporation with {sup 14}C corresponds to 30 fg
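    The precision figure quoted in this record (a coefficient of variation between 1 and 6%) is straightforward to compute from replicate measurements. The sketch below uses invented replicate {sup 14}C/C ratios purely for illustration; they are not data from the study.

```python
import statistics

def coefficient_of_variation(measurements):
    """Precision as CV (%): sample standard deviation relative to the mean."""
    return 100.0 * statistics.stdev(measurements) / statistics.mean(measurements)

# Invented replicate 14C/C isotope ratios, in units of Modern (illustrative only).
replicates = [1.02, 0.98, 1.01, 0.99, 1.00]
cv = coefficient_of_variation(replicates)
print(f"CV = {cv:.2f}%")  # falls inside the 1-6% precision range cited above
```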

  14. Searching for Sentient Design Tools for Game Development

    DEFF Research Database (Denmark)

    Liapis, Antonios

    a large volume of game content or to reduce designer effort by automating the mechanizable aspects of content creation, such as feasibility checking. However elaborate the type of content such tools can create, they remain subservient to their human developers/creators (who have tightly designed all...... their generative algorithms) and to their human users (who must take all design decisions), respectively. This thesis argues that computers can be creative partners to human designers rather than mere slaves; game design tools can be aware of designer intentions, preferences and routines, and can accommodate them...... or even subvert them. This thesis presents Sentient Sketchbook, a tool for designing game level abstractions of different game genres, which assists the level designer as it automatically tests maps for playability constraints, evaluates and displays the map's gameplay properties and creates alternatives...

  15. Development of Interpretive Simulation Tool for the Proton Radiography Technique

    CERN Document Server

    Levy, M C; Wilks, S C; Ross, J S; Huntington, C M; Fiuza, F; Baring, M G; Park, H- S

    2014-01-01

    Proton radiography is a useful diagnostic of high energy density (HED) plasmas under active theoretical and experimental development. In this paper we describe a new simulation tool that interacts realistic laser-driven point-like proton sources with three dimensional electromagnetic fields of arbitrary strength and structure and synthesizes the associated high resolution proton radiograph. The present tool's numerical approach captures all relevant physics effects, including effects related to the formation of caustics. Electromagnetic fields can be imported from PIC or hydrodynamic codes in a streamlined fashion, and a library of electromagnetic field `primitives' is also provided. This latter capability allows users to add a primitive, modify the field strength, rotate a primitive, and so on, while quickly generating a high resolution radiograph at each step. In this way, our tool enables the user to deconstruct features in a radiograph and interpret them in connection to specific underlying electromagneti...

  16. Ongoing development of digital radiotherapy plan review tools

    International Nuclear Information System (INIS)

    Full text: To describe ongoing development of software to support the review of radiotherapy treatment planning system (TPS) data. The 'SWAN' software program was conceived in 2000 and initially developed for the RADAR (TROG 03.04) prostate radiotherapy trial. Validation of the SWAN program has been occurring via implementation by TROG in support of multiple clinical trials. Development has continued and the SWAN software program is now supported by modular components which comprise the 'SWAN system'. This provides a comprehensive set of tools for the review, analysis and archive of TPS exports. The SWAN system has now been used in support of over 20 radiotherapy trials and to review the plans of over 2,000 trial participants. The use of the system for the RADAR trial is now culminating in the derivation of dose-outcomes indices for prostate treatment toxicity. Newly developed SWAN tools include enhanced remote data archive/retrieval, display of dose in both relative and absolute modes, and interfacing to a Matlab-based add-on ('VAST') that allows quantitative analysis of delineated volumes including regional overlap statistics for multi-observer studies. Efforts are continuing to develop the SWAN system in the context of international collaboration aimed at harmonising the quality-assurance activities of collaborative trials groups. Tools such as the SWAN system are essential for ensuring the collection of accurate and reliable evidence to guide future radiotherapy treatments. One of the principal challenges of developing such a tool is establishing a development path that will ensure its validity and applicability well into the future.

  17. National Energy Audit Tool for Multifamily Buildings Development Plan

    Energy Technology Data Exchange (ETDEWEB)

    Malhotra, Mini [ORNL; MacDonald, Michael [Sentech, Inc.; Accawi, Gina K [ORNL; New, Joshua Ryan [ORNL; Im, Piljae [ORNL

    2012-03-01

    The U.S. Department of Energy's (DOE's) Weatherization Assistance Program (WAP) enables low-income families to reduce their energy costs by providing funds to make their homes more energy efficient. In addition, the program funds Weatherization Training and Technical Assistance (T and TA) activities to support a range of program operations. These activities include measuring and documenting performance, monitoring programs, promoting advanced techniques and collaborations to further improve program effectiveness, and training, including developing tools and information resources. The T and TA plan outlines the tasks, activities, and milestones to support the weatherization network with the program implementation ramp-up efforts. Weatherization of multifamily buildings has been recognized as an effective way to ramp up weatherization efforts. To support this effort, the 2009 National Weatherization T and TA plan includes the task of expanding the functionality of the Weatherization Assistant, a DOE-sponsored family of energy audit computer programs, to perform audits for large and small multifamily buildings. This report describes the planning effort for a new multifamily energy audit tool for DOE's WAP. The functionality of the Weatherization Assistant is being expanded to also perform energy audits of small multifamily and large multifamily buildings. The process covers an assessment of needs that includes input from national experts during two national Web conferences. The assessment of needs is then translated into capability and performance descriptions for the proposed new multifamily energy audit, with some description of what might or should be provided in the new tool. The assessment of needs is combined with our best judgment to lay out a strategy for development of the multifamily tool that proceeds in stages, with features of an initial tool (version 1) and a more capable version 2 handled with currently available resources. Additional

  18. Analytical QbD: development of a native gel electrophoresis method for measurement of monoclonal antibody aggregates.

    Science.gov (United States)

    Pathak, Mili; Dutta, Debayon; Rathore, Anurag

    2014-08-01

    This paper presents a quality by design (QbD) based development of a novel native PAGE (N-PAGE) method as a low-cost analytical tool for analysis of aggregates of monoclonal antibodies. Comparability to the present gold standard of SEC has been established. The motivation is the fact that SEC requires relatively expensive equipment and consumables, thus making N-PAGE relevant to those academicians and other small companies involved in early-stage development of biotherapeutics that do not have access to SEC, especially in developing countries. Furthermore, SEC suffers from certain disadvantages including the possibility of secondary interactions between the stationary phase and analyte resulting in higher elution time and therefore underestimation of the analyte size. The proposed N-PAGE method can also serve as an orthogonal analytical method for aggregate analysis. A QbD-based approach has been used for development and optimization of the protocol. First, initial screening studies were carried out with parameters including the running buffer pH, running buffer molarity, gel buffer pH, loading dye, sample concentration, and running voltage. Next, optimization of operating parameters was performed using principles of design of experiments. The final optimized protocol was compared to the traditional SEC method and the results were found to be comparable. While N-PAGE has been in use for protein analysis for several decades, use of N-PAGE for analysis of mAb aggregates with data comparable to SEC such as the case presented here is novel. PMID:24643784
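    The DoE-based optimization step mentioned in this record can be illustrated with a two-level full-factorial screening design. The factor names and levels below are invented for illustration and are not taken from the paper.

```python
from itertools import product

# Two-level full factorial over three hypothetical N-PAGE parameters.
factors = {
    "running_buffer_pH": (8.3, 8.8),
    "gel_buffer_pH": (8.8, 9.3),
    "running_voltage_V": (90, 120),
}
# One run per combination of factor levels: 2**3 = 8 screening runs.
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(runs))
```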

  19. Development of cutting tools for the dismantling of nuclear facilities

    International Nuclear Information System (INIS)

    The purpose of this study, carried out by the Atomic Energy Commission - Nuclear Installations Dismantling Unit (CEA-UDIN), is to test cutting apparatuses and tools suitable for use in dismantling operations on CEA sites and to do such development work as is necessary to make these apparatuses and tools more efficient and better adapted to remote-controlled operation. The work, carried out over the last three years, can be split into two categories. Comparative trials of five tools carried out in standardized inactive conditions: reciprocating saw (hacksaw), grinder, plasma torch, arc air and arc saw; comparisons have been made of performance and of the production of secondary wastes (in mass and grain sizes), and the improvements envisaged concern mainly the plasma torch and the arc saw. Development of two tools for concrete cutting: a diamond saw used in the stage 3 dismantling of the AT1 installation at La Hague for the remote-controlled cutting of a 200 mm thick reinforced concrete wall, work that was completed with good results; and a shot-blasting machine intended for the decontamination of plain or resin-coated concrete walls and of the stainless steel cladding of rooms in the AT1 installation, for which the active trials have proved positive. 17 figs, 9 tabs

  20. Waste Tank Organic Safety Program: Analytical methods development. Progress report, FY 1994

    International Nuclear Information System (INIS)

    The objectives of this task are to develop and document extraction and analysis methods for organics in waste tanks, and to extend these methods to the analysis of actual core samples to support the Waste Tank Organic Safety Program. This report documents progress at Pacific Northwest Laboratory during FY 1994 on methods development, the analysis of waste from Tank 241-C-103 (Tank C-103) and T-111, and the transfer of documented, developed analytical methods to personnel in the Analytical Chemistry Laboratory (ACL) and 222-S laboratory. This report is intended as an annual report, not a completed work

  1. Development of tools for automatic generation of PLC code

    CERN Document Server

    Koutli, Maria; Rochez, Jacques

    This Master's thesis was performed at CERN, specifically in the EN-ICE-PLC section. The thesis describes the integration of two PLC platforms, both based on the CODESYS development tool, into the CERN-defined industrial framework, UNICOS. CODESYS is a development tool for PLC programming, based on the IEC 61131-3 standard, and is adopted by many PLC manufacturers. The two PLC development environments are SoMachine from Schneider and TwinCAT from Beckhoff. The two CODESYS-compatible PLCs should be controlled by the Siemens SCADA system, WinCC OA. The framework includes a library of Function Blocks (objects) for the PLC programs and a software tool for automatic generation of the PLC code based on this library, called UAB. The integration aimed to provide a solution shared by both PLC platforms and was based on the PLCopen XML scheme. The developed tools were demonstrated by creating a control application for both PLC environments and testing the behavior of the library code.
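    The automatic code generation described here targets the PLCopen XML exchange format. Below is a heavily simplified sketch of emitting such a skeleton; the element names are reduced for illustration and do not follow the full PLCopen TC6 schema.

```python
import xml.etree.ElementTree as ET

def make_pou(name, structured_text):
    """Emit a simplified PLCopen-style XML skeleton for one POU
    (program organization unit) whose body is IEC 61131-3 Structured Text."""
    project = ET.Element("project")
    pou = ET.SubElement(project, "pou", {"name": name, "pouType": "program"})
    body = ET.SubElement(pou, "body")
    ET.SubElement(body, "ST").text = structured_text
    return ET.tostring(project, encoding="unicode")

print(make_pou("Main", "counter := counter + 1;"))
```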

  2. Microsystem design framework based on tool adaptations and library developments

    Science.gov (United States)

    Karam, Jean Michel; Courtois, Bernard; Rencz, Marta; Poppe, Andras; Szekely, Vladimir

    1996-09-01

    Besides foundry facilities, Computer-Aided Design (CAD) tools are also required to move microsystems from research prototypes to an industrial market. This paper describes a Computer-Aided-Design Framework for microsystems, based on selected existing software packages adapted and extended for microsystem technology, assembled with libraries where models are available in the form of standard cells described at different levels (symbolic, system/behavioral, layout). In microelectronics, CAD has already attained a highly sophisticated and professional level, where complete fabrication sequences are simulated and the device and system operation is completely tested before manufacturing. In comparison, the art of microsystem design and modelling is still in its infancy. However, at least for the numerical simulation of the operation of single microsystem components, such as mechanical resonators, thermo-elements, elastic diaphragms, reliable simulation tools are available. For the different engineering disciplines (like electronics, mechanics, optics, etc) a lot of CAD-tools for the design, simulation and verification of specific devices are available, but there is no CAD-environment within which we could perform a (micro-)system simulation due to the different nature of the devices. In general there are two different approaches to overcome this limitation: the first possibility would be to develop a new framework tailored for microsystem-engineering. The second approach, much more realistic, would be to use the existing CAD-tools which contain the most promising features, and to extend these tools so that they can be used for the simulation and verification of microsystems and of the devices involved. These tools are assembled with libraries in a microsystem design environment allowing a continuous design flow. The approach is driven by the wish to make microsystems accessible to a large community of people, including SMEs and non-specialized academic institutions.

  3. The development and application of advanced analytical methods to commercial ICF reactor chambers. Final report

    International Nuclear Information System (INIS)

    Progress is summarized in this report for each of the following tasks: (1) multi-dimensional radiation hydrodynamics computer code development; (2) 2D radiation-hydrodynamic code development; (3) ALARA: analytic and Laplacian adaptive radioactivity analysis -- a complete package for analysis of induced activation; (4) structural dynamics modeling of ICF reactor chambers; and (5) analysis of self-consistent target chamber clearing

  4. Flammable gas safety program. Analytical methods development: FY 1994 progress report

    International Nuclear Information System (INIS)

    This report describes the status of developing analytical methods to account for the organic components in Hanford waste tanks, with particular focus on tanks assigned to the Flammable Gas Watch List. The methods that have been developed are illustrated by their application to samples obtained from Tank 241-SY-101 (Tank 101-SY)

  5. The development and application of advanced analytical methods to commercial ICF reactor chambers. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Cousseau, P.; Engelstad, R.; Henderson, D.L. [and others

    1997-10-01

    Progress is summarized in this report for each of the following tasks: (1) multi-dimensional radiation hydrodynamics computer code development; (2) 2D radiation-hydrodynamic code development; (3) ALARA: analytic and Laplacian adaptive radioactivity analysis -- a complete package for analysis of induced activation; (4) structural dynamics modeling of ICF reactor chambers; and (5) analysis of self-consistent target chamber clearing.

  6. ANALYTICAL, CRITICAL AND CREATIVE THINKING DEVELOPMENT OF THE GIFTED CHILDREN IN THE USA SCHOOLS

    Directory of Open Access Journals (Sweden)

    Anna Yurievna Kuvarzina

    2013-11-01

    Full Text Available Teachers of gifted students should not only devise enrichment and acceleration programs for them but also pay attention to the development of analytical, critical and creative thinking skills. Despite great interest in this issue in recent years, the topic of analytical and creative thinking is poorly covered in textbooks for the gifted. This article describes methods, materials and programs for developing analytical, critical and creative thinking skills that are used in the USA. The author analyses and systematizes these methods and also suggests ways of using them in the Russian educational system. Purpose: to analyze and systematize the methods, materials and programs used in the USA for teaching gifted children analytical, critical and creative thinking, and for developing their problem-solving and decision-making capacities. Methods and methodology of the research: analysis, comparison, and the principle of the unity of the historical and logical approaches. Results: positive results of employing methods for developing analytical, critical and creative thinking were shown in the practical experience of teaching and educating gifted children in the USA educational system. Field of application of the results: the educational system of the Russian Federation: schools, special classes and courses for gifted children. DOI: http://dx.doi.org/10.12731/2218-7405-2013-7-42

  7. Data-infilling in daily mean river flow records: first results using a visual analytics tool (gapIT)

    Science.gov (United States)

    Giustarini, Laura; Parisot, Olivier; Ghoniem, Mohammad; Trebs, Ivonne; Médoc, Nicolas; Faber, Olivier; Hostache, Renaud; Matgen, Patrick; Otjacques, Benoît

    2015-04-01

    Missing data in river flow records represent a loss of information and a serious drawback in water management. An incomplete time series prevents the computation of hydrological statistics and indicators. Also, records with data gaps are not suitable as input or validation data for hydrological or hydrodynamic modelling. In this work we present a visual analytics tool (gapIT), which supports experts in finding the most adequate data-infilling technique for daily mean river flow records. The tool performs an automated calculation of river flow estimates using different data-infilling techniques. Donor station(s) are automatically selected based on Dynamic Time Warping, geographical proximity and upstream/downstream relationships. For each gap the tool computes several flow estimates through various data-infilling techniques, including interpolation, multiple regression, regression trees and neural networks. The visual application also allows the user to select donor station(s) other than those automatically selected. The gapIT software was applied to 24 daily time series of river discharge recorded in Luxembourg over the period 01/01/2007 - 31/12/2013. The method was validated by randomly creating artificial gaps of different lengths and positions along the entire records. Using the RMSE and the Nash-Sutcliffe (NS) coefficient as performance measures, the method is evaluated based on a comparison with the actual measured discharge values. The application of the gapIT software to artificial gaps led to satisfactory results in terms of performance indicators (NS>0.8 for more than half of the artificial gaps). A case-by-case analysis revealed that the small number of reconstructed record gaps characterized by high RMSE values (NS<0.8) was caused by the temporary unavailability of the most appropriate donor station.
On the other hand, some of the gaps characterized by a high accuracy of the reconstructed record were filled by using the data from
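    The two performance measures used in this validation, the RMSE and the Nash-Sutcliffe (NS) coefficient, can be sketched as follows; the function names and the short synthetic discharge series are illustrative and are not part of the gapIT software.

```python
import math

def rmse(observed, estimated):
    """Root-mean-square error between measured and infilled flow values."""
    n = len(observed)
    return math.sqrt(sum((o - e) ** 2 for o, e in zip(observed, estimated)) / n)

def nash_sutcliffe(observed, estimated):
    """NS efficiency: 1.0 is a perfect fit; the record above treats NS > 0.8
    as a satisfactory reconstruction."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - e) ** 2 for o, e in zip(observed, estimated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

obs = [10.0, 12.0, 14.0, 13.0, 11.0]   # synthetic daily mean discharge
est = [10.2, 11.8, 14.1, 12.7, 11.3]   # synthetic infilled estimates
print(round(rmse(obs, est), 3), round(nash_sutcliffe(obs, est), 3))
```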

  8. Patient safety culture improvement tool: development and guidelines for use.

    Science.gov (United States)

    Fleming, Mark; Wentzell, Natasha

    2008-01-01

    The Patient Safety Culture Improvement Tool (PSCIT) was developed to assist healthcare organizations in identifying practical actions to improve their culture. This article describes the development process of the PSCIT and provides a guide to using the PSCIT. The tool is based on a safety culture maturity model, which describes five stages of cultural evolution, from pathological to generative. The PSCIT consists of nine elements that cover five patient safety culture dimensions, namely, leadership, risk analysis, workload management, sharing and learning and resource management. Each element describes the systems in place at each level of maturity, enabling organizations to identify their current level of maturity and actions to move to the next level. The PSCIT should be used with caution as there is currently a lack of reliability and validity data. PMID:18382154

  9. Oral Development for LSP via Open Source Tools

    Directory of Open Access Journals (Sweden)

    Alejandro Curado Fuentes

    2015-11-01

    Full Text Available For the development of oral abilities in LSP, few computer-based teaching and learning resources have actually focused intensively on web-based listening and speaking; many more do so for reading, writing, vocabulary and grammar activities. Our aim in this paper is to approach oral communication in the online environment of Moodle by striving to make it suitable for a learning project which incorporates oral skills. The paper describes a blended process in which both individual and collaborative learning strategies can be combined and exploited through the implementation of specific tools and resources which may go hand in hand with traditional face-to-face conversational classes. The challenge with this new perspective is, ultimately, to provide effective tools for oral LSP development in an apparently writing-skill-focused medium.

  10. Space mission scenario development and performance analysis tool

    Science.gov (United States)

    Kordon, Mark; Baker, John; Gilbert, John; Hanks, David

    2004-01-01

    This paper discusses a new and innovative approach for a rapid spacecraft multi-disciplinary performance analysis using a tool called the Mission Scenario Development Workbench (MSDW). To meet the needs of new classes of space missions, analysis tools with proven models were developed and integrated into a framework to enable rapid trades and analyses between spacecraft designs and operational scenarios during the formulation phase of a mission. Generally speaking, spacecraft resources are highly constrained on deep space missions and this approach makes it possible to maximize the use of existing resources to attain the best possible science return. This approach also has the potential benefit of reducing the risk of costly design changes made later in the design cycle necessary to meet the mission requirements by understanding system design sensitivities early and adding appropriate margins. This paper will describe the approach used by the Mars Science Laboratory Project to accomplish this result.

  11. Development of an Analytical System for Determination of Free Acid via a Joint Method Combining Density and Conductivity Measurement

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Determination of free acid plays an important role in spent nuclear fuel reprocessing. It is necessary to develop a rapid analytical device and method for measuring free acid. A novel analytical system and method was studied to monitor the acidity
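    The idea of a joint density-conductivity method can be illustrated as a two-measurement inversion: if both readings are (locally) linear in the acid and salt concentrations, the pair of measurements determines both unknowns. Every coefficient below is invented for illustration and is not taken from the record.

```python
def solve_2x2(a11, a12, a21, a22, b1, b2):
    """Solve the linear system [[a11, a12], [a21, a22]] x = [b1, b2] by Cramer's rule."""
    det = a11 * a22 - a12 * a21
    return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det

# Invented calibration: density  rho   = rho0 + a*c_acid + b*c_salt   (g/mL),
#                       conduct. kappa = ka*c_acid + kb*c_salt        (mS/cm),
# with concentrations c in mol/L.
rho0 = 1.000
a, b = 0.033, 0.040
ka, kb = 35.0, 10.0

rho_meas, kappa_meas = 1.073, 55.0
c_acid, c_salt = solve_2x2(a, b, ka, kb, rho_meas - rho0, kappa_meas)
print(round(c_acid, 3), round(c_salt, 3))
```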

  12. Facilitating management learning: Developing critical reflection through reflective tools

    OpenAIRE

    Gray, David E

    2007-01-01

    The aim of this article is to explore how the practice of critical reflection within a management learning process can be facilitated through the application of reflective processes and tools. A distinction is drawn between reflection as a form of individual development (of, say, the reflective practitioner), and critical reflection as a route to collective action and a component of organizational learning and change. Critical reflection, however, is not a process that comes naturally to many...

  13. A Health Diagnostic Tool for Public Development Banks

    OpenAIRE

    Diana Smallridge; Fernando de Olloqui

    2011-01-01

    This study introduces a diagnostic tool for determining the health of a public development bank (PDB). It defines in normative terms what good health looks like across various dimensions, which allows a PDB to delineate how it can improve its overall performance and achieve its financial and developmental goals. Due to the variety of mandates and business models used by PDBs, there is no single definition of what constitutes perfect health for a PDB; nonetheless there are common parameters and featu...

  14. Developing a Grid-Based Search and Categorization Tool

    OpenAIRE

    Haya, Glenn; Scholze, Frank; Vigen, Jens

    2003-01-01

    Grid technology has the potential to improve the accessibility of digital libraries. The participants in Project GRACE (Grid Search And Categorization Engine) are in the process of developing a search engine that will allow users to search through heterogeneous resources stored in geographically distributed digital collections. What differentiates this project from current search tools is that GRACE will be run on the European Data Grid, a large distributed network, and will not have a single...

  15. Application of comprehensive engineering and training simulator development tools

    International Nuclear Information System (INIS)

    Recent developments affecting the use of simulation include: 1) the availability of high-productivity simulation tools, 2) the ever-increasing performance of low-cost computers, 3) the workforce's growing aptitude for using personal computers, 4) the rising cost of plant outages. With these conditions, simulation provides more advantages than ever before to assist in selecting from an increasing number of control hardware and software products to resolve issues in plant design, control and operation. (author) 1 figs, 7 refs

  16. Developing security tools of WSN and WBAN networks applications

    CERN Document Server

    A M El-Bendary, Mohsen

    2015-01-01

    This book focuses on two of the most rapidly developing areas in wireless technology (WT) applications, namely, wireless sensors networks (WSNs) and wireless body area networks (WBANs). These networks can be considered smart applications of the recent WT revolutions. The book presents various security tools and scenarios for the proposed enhanced-security of WSNs, which are supplemented with numerous computer simulations. In the computer simulation section, WSN modeling is addressed using MATLAB programming language.

  17. MOOCs as a Professional Development Tool for Librarians

    Directory of Open Access Journals (Sweden)

    Meghan Ecclestone

    2013-11-01

    This article explores how reference and instructional librarians taking over new areas of subject responsibility can develop professional expertise using new eLearning tools called MOOCs. MOOCs – Massive Open Online Courses – are a new online learning model that offers free higher education courses to anyone with an Internet connection and a keen interest to learn. As MOOCs proliferate, librarians have the opportunity to leverage this technology to improve their professional skills.

  18. MOOCs as a Professional Development Tool for Librarians

    OpenAIRE

    Meghan Ecclestone

    2013-01-01

    This article explores how reference and instructional librarians taking over new areas of subject responsibility can develop professional expertise using new eLearning tools called MOOCs. MOOCs – Massive Open Online Courses – are a new online learning model that offers free higher education courses to anyone with an Internet connection and a keen interest to learn. As MOOCs proliferate, librarians have the opportunity to leverage this technology to improve their professional skills.

  19. Structural and compositional changes of dissolved organic matter upon solid-phase extraction tracked by multiple analytical tools.

    Science.gov (United States)

    Chen, Meilian; Kim, Sunghwan; Park, Jae-Eun; Jung, Heon-Jae; Hur, Jin

    2016-09-01

    Although PPL-based solid-phase extraction (SPE) has been widely used before dissolved organic matter (DOM) analyses via advanced measurements such as ultrahigh-resolution Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR-MS), much is still unknown about the structural and compositional changes the DOM pool undergoes through SPE. In this study, selected DOM from various sources were tested to elucidate the differences before and after SPE, utilizing multiple analytical tools including fluorescence spectroscopy, FT-ICR-MS, and size exclusion chromatography with organic carbon detection (SEC-OCD). Changes in specific UV absorbance indicated a decrease in aromaticity after SPE, suggesting a preferential exclusion of aromatic DOM structures, which was also confirmed by the substantial reduction of fluorescent DOM (FDOM). Furthermore, SEC-OCD results exhibited very low recoveries (1-9%) for the biopolymer fraction, implying that PPL sorbent materials need to be used cautiously in SPE when treating high-molecular-weight compounds (i.e., polysaccharides, proteins, and amino sugars). A careful examination via FT-ICR-MS revealed that the formulas lost through SPE might all be DOM source-dependent. Nevertheless, the dominant missing compound groups were identified as the tannins group with high O/C ratios (>0.7), lignins/carboxyl-rich alicyclic molecules (CRAM), aliphatics with H/C >1.5, and heteroatomic formulas, all of which were dominated by pseudo-analogous molecular formula families differing by methylene (-CH2) units. Our findings shed new light on potential changes in the compound composition and the molecular weight of DOM upon SPE, implying that precautions are needed in data interpretation. Graphical Abstract: Tracking the characteristics of DOM from various origins upon PPL-based SPE utilizing EEM-PARAFAC, SEC-OCD, and FT-ICR-MS. PMID:27387996
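
    The O/C and H/C cutoffs cited in the abstract (tannins at O/C > 0.7, aliphatics at H/C > 1.5) lend themselves to a simple van Krevelen-style screen of assigned molecular formulas. The sketch below is illustrative only, assuming formulas are supplied as C/H/O atom counts; the lignin/CRAM fallback collapses several of the paper's categories into one.

    ```python
    def compound_class(c, h, o):
        """Rough van Krevelen-style class assignment from C/H/O atom counts.

        Thresholds follow the abstract's cited ratios; the fallback label
        lumps the remaining region into lignins/CRAM for illustration.
        """
        oc = o / c  # oxygen-to-carbon ratio
        hc = h / c  # hydrogen-to-carbon ratio
        if oc > 0.7:
            return "tannin-like"
        if hc > 1.5:
            return "aliphatic-like"
        return "lignin/CRAM-like"
    ```

    Plotting H/C against O/C for every assigned formula before and after SPE would make the loss of the high-O/C corner visible at a glance.
    
    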

  20. Support Tools in Formulation Development for Poorly Soluble Drugs.

    Science.gov (United States)

    Fridgeirsdottir, Gudrun A; Harris, Robert; Fischer, Peter M; Roberts, Clive J

    2016-08-01

    The need for solubility enhancement through formulation is a well-known but still problematic issue because of the number of poorly water-soluble drugs in development. There are several possible routes that can be taken to increase the bioavailability of drugs intended for immediate-release oral formulation. The best formulation strategy for any given drug will depend on numerous factors, including required dose, shelf life, manufacturability, and the properties of the active pharmaceutical ingredient (API). Choosing an optimal formulation and manufacturing route for a new API is therefore not a straightforward process. Currently, several approaches are used in the pharmaceutical industry to select the best formulation strategy. These differ in complexity and efficiency, but most try to predict which route will best suit the API based on selected molecular parameters such as molecular weight, lipophilicity (logP), and solubility. The approaches range from using no tools at all (trial and error), through intermediate tools such as small in vitro or in vivo experiments, high-throughput screening, guidance maps, and decision trees, to the most complex methods based on computational modelling tools. This review aims to list the available support tools and explain how they are used. PMID:27368122

  1. Feasibility assessment tool for urban anaerobic digestion in developing countries.

    Science.gov (United States)

    Lohri, Christian Riuji; Rodić, Ljiljana; Zurbrügg, Christian

    2013-09-15

    This paper describes a method developed to support feasibility assessments of urban anaerobic digestion (AD). The method goes beyond technical assessment criteria, taking a broader sustainability perspective that integrates technical-operational, environmental, financial-economic, socio-cultural, institutional, policy and legal criteria into the assessment tool. Use of the tool can support decision-makers in selecting the most suitable set-up for a given context. The tool consists of a comprehensive set of questions, structured along four distinct yet interrelated dimensions of sustainability factors, all of which influence the success of an urban AD project. Each dimension answers a specific question: I) WHY? What are the driving forces and motivations behind the initiation of the AD project? II) WHO? Who are the stakeholders and what are their roles, power, interests and means of intervention? III) WHAT? What are the physical components of the proposed AD chain and the respective mass and resource flows? IV) HOW? What are the key features of the enabling or disabling environment (sustainability aspects) affecting the proposed AD system? Disruptive conditions within these four dimensions are detected. Multi-Criteria Decision Analysis is used to guide the process of translating the answers from six sustainability categories into scores and combining them with the relative importance (weights) attributed by the stakeholders. A risk assessment further evaluates the probability that certain aspects develop differently than originally planned and assesses data reliability (uncertainty factors). The use of the tool is demonstrated in a case study of Bahir Dar, Ethiopia. PMID:23722149
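
    The scoring step described above - translating category answers into scores and combining them with stakeholder weights - is, in its simplest form, a weighted sum. A minimal sketch, with hypothetical category names and 1-5 scores (the paper's actual scoring scheme is not reproduced here):

    ```python
    def mcda_score(scores, weights):
        """Weighted-sum MCDA: both dicts are keyed by sustainability category."""
        assert set(scores) == set(weights), "each category needs a score and a weight"
        total_weight = sum(weights.values())
        return sum(scores[c] * weights[c] for c in scores) / total_weight

    # Hypothetical scores (1-5) for six sustainability categories
    scores = {"technical": 4, "environmental": 3, "financial": 2,
              "socio-cultural": 5, "institutional": 3, "legal": 4}
    # Hypothetical stakeholder-assigned weights
    weights = {"technical": 3, "environmental": 2, "financial": 3,
               "socio-cultural": 1, "institutional": 1, "legal": 2}
    overall = mcda_score(scores, weights)
    ```

    The normalization by total weight keeps the overall score on the same 1-5 scale as the category scores, so results remain comparable across stakeholder weightings.
    
    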

  2. Developing Tools and Technologies to Meet MSR Planetary Protection Requirements

    Science.gov (United States)

    Lin, Ying

    2013-01-01

    This paper describes the tools and technologies that need to be developed for a Caching Rover mission in order to meet the overall Planetary Protection requirements for future Mars Sample Return (MSR) campaign. This is the result of an eight-month study sponsored by the Mars Exploration Program Office. The goal of this study is to provide a future MSR project with a focused technology development plan for achieving the necessary planetary protection and sample integrity capabilities for a Mars Caching Rover mission.

  3. Development and Validation of Analytical Method for Determination of Uranium Enrichment in Standard Nuclear Material

    International Nuclear Information System (INIS)

    A non-destructive analytical method has been developed for determination of uranium enrichment using a hyper-pure germanium (HPGe) gamma-ray detector. The absolute efficiency was measured experimentally with different radioactive sources at different distances from the detector crystal. The variation of the absolute peak efficiency at 185.7 keV with source-to-detector distance was determined and used to estimate the absolute peak efficiency for the nuclear material samples at 185 keV with the help of the analytical method. The validity of the developed analytical method has been confirmed by measuring the 235U enrichment of uranium oxide powders (U3O8) contained in cylindrical aluminum cans
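
    The 185.7 keV line of 235U underlies the common "enrichment meter" principle: for samples of effectively infinite thickness measured in identical geometry, the net 185.7 keV count rate is proportional to the enrichment. A minimal single-point calibration sketch (the method in the record additionally folds in the measured efficiency curve, which is omitted here):

    ```python
    def calibrate(rate_std_cps, enrichment_std_pct):
        """Proportionality constant from a standard of known enrichment."""
        return enrichment_std_pct / rate_std_cps

    def enrichment_pct(rate_sample_cps, k):
        """Estimate % 235U from the sample's net 185.7 keV count rate."""
        return k * rate_sample_cps
    ```

    Counting geometry, container wall attenuation and dead time must match between standard and sample for the proportionality to hold.
    
    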

  4. Process-Based Quality (PBQ) Tools Development

    International Nuclear Information System (INIS)

    The objective of this effort is to benchmark the development of process-based quality tools for application in CAD (computer-aided design) model-based applications. The processes of interest are design, manufacturing, and quality process applications. A study was commissioned addressing the impact, current technologies, and known problem areas in the application of 3D MCAD (3-dimensional mechanical computer-aided design) models and model integrity on downstream manufacturing and quality processes. The downstream manufacturing and product quality processes are profoundly influenced by, and dependent on, model quality and modeling process integrity. The goal is to illustrate and expedite modeling and downstream model-based technologies, for available or conceptual methods and tools, to achieve maximum economic advantage and advance process-based quality concepts

  5. Development of requirements for a procedures software tool

    International Nuclear Information System (INIS)

    In 1989, the Electric Power Research Institute (EPRI) and the Central Research Institute of the Electric Power Industry (CRIEPI) in Japan initiated a joint research program to investigate various interventions to reduce personnel errors and inefficiencies in the maintenance of nuclear power plants. This program, consisting of several interrelated projects, was initiated because of the mutual recognition of the importance of the human element in the efficient and safe operation of utilities and the continuing need to enhance personnel performance to sustain plant safety and availability. This paper summarizes one of the projects, jointly funded by EPRI and CRIEPI, to analyze the requirements for, and prepare a functional description of, a procedures software tool (PST). The primary objective of this project was to develop a description of the features and functions of a software tool that would help procedure writers to improve the quality of maintenance and testing procedures, thereby enhancing the performance of both procedure writers and maintenance personnel

  6. Developments in the tools and methodologies of synthetic biology

    Directory of Open Access Journals (Sweden)

    Richard eKelwick

    2014-11-01

    Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices or systems. However, biological systems are generally complex and unpredictable and are therefore intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a ‘body of knowledge’ from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled and its functionality tested. At each stage of the design cycle an expanding repertoire of tools is being developed. In this review we highlight several of these tools in terms of their applications and benefits to the synthetic biology community.

  7. Web-based information systems development and dynamic organisational change: the need for emergent development tools

    OpenAIRE

    Ramrattan, M; Patel, NV

    2009-01-01

    This paper considers contextual issues relating to the problem of developing web-based information systems in and for emergent organisations. It postulates that the methods available suffer because of sudden and unexpected changing characteristics within the organisation. The Theory of Deferred Action is used as the basis for the development of an emergent development tool. Many tools for managing change in a continuously changing organisation are susceptible to inadequacy. The in...

  8. Continued development of modeling tools and theory for RF heating

    International Nuclear Information System (INIS)

    Mission Research Corporation (MRC) is pleased to present the Department of Energy (DOE) with its renewal proposal to the Continued Development of Modeling Tools and Theory for RF Heating program. The objective of the program is to continue and extend the earlier work done by the proposed principal investigator in the field of modeling radio frequency (RF) heating experiments in the large tokamak fusion experiments, particularly the Tokamak Fusion Test Reactor (TFTR) device located at Princeton Plasma Physics Laboratory (PPPL). An integral part of this work is the investigation and, in some cases, resolution of theoretical issues which pertain to accurate modeling. MRC is nearing the successful completion of the specified tasks of the Continued Development of Modeling Tools and Theory for RF Heating project. The following tasks are either completed or nearing completion: (1) anisotropic temperature and rotation upgrades; (2) modeling for relativistic ECRH; (3) further documentation of SHOOT and SPRUCE. As a result of the progress achieved under this project, MRC has been urged to continue this effort. Specifically, during the performance of this project two topics were identified by PPPL personnel as new applications of the existing RF modeling tools. These two topics concern (a) future fast-wave current drive experiments on the large tokamaks, including TFTR, and (b) the interpretation of existing and future RF probe data from TFTR. To address each of these topics requires some modification or enhancement of the existing modeling tools, and the first topic requires resolution of certain theoretical issues to produce self-consistent results. This work falls within the scope of the original project and is more suited to the project's renewal than to the initiation of a new project

  9. ROLE OF EXTERNAL COMMERCIAL BORROWINGS IN DEVELOPING ECONOMY: AN ANALYTICAL STUDY WITH REFERENCE TO INDIAN ECONOMY

    OpenAIRE

    Vivek Sharma; Budheshwar Prasad Singhraul

    2014-01-01

    The prominent question of debate in the current scenario is the role of international finance in the economic development of India. The objective of this analytical study is to evaluate the performance and impact of international finance on developing economies, with special consideration of the Indian economy. The study was conducted by collecting both primary and secondary data. In developing economies there has been a large need for foreign investment to increase ...

  10. Evaluation of Cross-Platform Mobile Development ToolsDevelopment of an Evaluation Framework

    OpenAIRE

    Öberg, Linus

    2016-01-01

    The aim of this thesis is to determine which cross-platform mobile development tool is best suited for Vitec and their mobile application ”Teknisk Förvaltning”. More importantly, in this thesis we develop a generic evaluation framework for assessing cross-platform mobile development tools, with the purpose of making it easy to select the most appropriate tool for a specific mobile application. This was achieved by first, in consultation with Vitec, selecting Cordova + Ionic and X...

  11. Demonstration of Decision Support Tools for Sustainable Development

    Energy Technology Data Exchange (ETDEWEB)

    Shropshire, David Earl; Jacobson, Jacob Jordan; Berrett, Sharon; Cobb, D. A.; Worhach, P.

    2000-11-01

    The Demonstration of Decision Support Tools for Sustainable Development project integrated the Bechtel/Nexant Industrial Materials Exchange Planner and the Idaho National Engineering and Environmental Laboratory System Dynamics models, demonstrating their capabilities on alternative fuel (AF) applications in the Greater Yellowstone-Teton Park system. The combined model, called the Dynamic Industrial Material Exchange, was used on selected test cases in the Greater Yellowstone-Teton Parks region to evaluate the economic, environmental, and social implications of alternative fuel applications and to identify primary and secondary industries. The test cases included looking at compressed natural gas applications in Teton National Park and Jackson, Wyoming, and studying ethanol use in Yellowstone National Park and gateway cities in Montana. With further development, the system could be used to assist decision-makers (local government, planners, vehicle purchasers, and fuel suppliers) in selecting alternative fuels and vehicles and in developing AF infrastructures. The system could become a regional AF market assessment tool that could help decision-makers understand the behavior of the AF market and the conditions in which the market would grow. Based on this high-level market assessment, investors and decision-makers would become more knowledgeable about the AF market opportunity before developing detailed plans and preparing financial analyses.

  12. Assessment of COTS IR image simulation tools for ATR development

    Science.gov (United States)

    Seidel, Heiko; Stahl, Christoph; Bjerkeli, Frode; Skaaren-Fystro, Paal

    2005-05-01

    Following the trend of increased use of imaging sensors in military aircraft, future fighter pilots will need onboard artificial intelligence, e.g. automatic target recognition (ATR), to aid them in image interpretation and target designation. The European Aeronautic Defence and Space Company (EADS) in Germany has developed an advanced method for ATR which is based on adaptive neural networks. This ATR method can assist the crew of military aircraft like the Eurofighter in sensor image monitoring, and thereby reduce the workload in the cockpit and increase mission efficiency. The EADS ATR approach can be adapted to imagery from visual, infrared and SAR sensors because of the training-based classifiers of the ATR method. For the optimal adaptation of these classifiers they have to be trained with appropriate and sufficient image data. The training images must show the target objects from different aspect angles, ranges, environmental conditions, etc. Incomplete training sets lead to a degradation of classifier performance. Additionally, ground truth information, i.e. scenario conditions like the class type and position of targets, is necessary for the optimal adaptation of the ATR method. In summer 2003, EADS started a cooperation with Kongsberg Defence & Aerospace (KDA) of Norway. The EADS/KDA approach is to provide additional image data sets for training-based ATR through IR image simulation. The joint study aims to investigate the benefits of enhancing incomplete training sets for classifier adaptation with simulated synthetic imagery. EADS/KDA identified the requirements of a commercial-off-the-shelf IR simulation tool capable of delivering appropriate synthetic imagery for ATR development. A market study of available IR simulation tools and suppliers was performed. After that the most promising tool was benchmarked according to several criteria, e.g. thermal emission model, sensor model, target model, non-radiometric image features etc., resulting in a

  13. Polar Bears or People?: How Framing Can Provide a Useful Analytic Tool to Understand & Improve Climate Change Communication in Classrooms

    Science.gov (United States)

    Busch, K. C.

    2014-12-01

    Not only will young adults bear the brunt of climate change's effects, they are also the ones who will be required to take action - to mitigate and to adapt. The Next Generation Science Standards include climate change, ensuring the topic will be covered in U.S. science classrooms in the near future. Additionally, school is a primary source of information about climate change for young adults. The larger question, though, is how can the teaching of climate change be done in such a way as to ascribe agency - a willingness to act - to students? Framing - as both a theory and an analytic method - has been used to understand how language in the media can affect the audience's intention to act. Frames function as a two-way filter, affecting both the message sent and the message received. This study adapted both the theory and the analytic methods of framing, applying them to teachers in the classroom to answer the research question: How do teachers frame climate change in the classroom? To answer this question, twenty-five lessons from seven teachers were analyzed using semiotic discourse analysis methods. It was found that the teachers' frames overlapped to form two distinct discourses: a Science Discourse and a Social Discourse. The Science Discourse, which was dominant, can be summarized as: Climate change is a current scientific problem that will have profound global effects on the Earth's physical systems. The Social Discourse, used much less often, can be summarized as: Climate change is a future social issue because it will have negative impacts at the local level on people. While it may not be surprising that the Science Discourse was most often heard in these science classrooms, it is possibly problematic if it were the only discourse used. The research literature on framing indicates that the frames found in the Science Discourse - global scale, scientific statistics and facts, and impact on the Earth's systems - are not likely to inspire action-taking. 

  14. Development of analytical methodologies to assess recalcitrant pesticide bioremediation in biobeds at laboratory scale.

    Science.gov (United States)

    Rivero, Anisleidy; Niell, Silvina; Cerdeiras, M Pía; Heinzen, Horacio; Cesio, María Verónica

    2016-06-01

    To assess recalcitrant pesticide bioremediation it is necessary to gradually increase the complexity of the biological system used in order to design an effective biobed assembly. Each step towards this effective biobed design needs a suitable, validated analytical methodology that allows a correct evaluation of the dissipation and bioconversion. Low-recovery methods could give a false idea of a successful biodegradation process. To address this situation, different methods were developed and validated for the simultaneous determination of endosulfan, its three main metabolites, and chlorpyrifos in increasingly complex matrices where the bioconvertor basidiomycete Abortiporus biennis could grow. The matrices were culture media, bran, and finally a laboratory biomix composed of bran, peat and soil. The methodology for the analysis of the first matrix has already been reported; the methodologies developed for the other two systems are presented in this work. The targeted analytes were extracted from fungi growing over bran in semisolid YNB (Yeast Nitrogen Base) medium with acetonitrile using shaker-assisted extraction. The salting-out step was performed with MgSO4 and NaCl, and the extracts were analyzed by GC-ECD. The best methodology was fully validated for all the evaluated analytes at 1 and 25 mg kg(-1), yielding recoveries between 72% and 109% with acceptable RSDs. For these pesticides, the next step was the development and validation of an analytical procedure to evaluate the analytes in a laboratory-scale biobed composed of 50% bran, 25% peat and 25% soil together with fungal mycelium. Of the different procedures assayed, only ultrasound-assisted extraction with ethyl acetate allowed recoveries between 80% and 110% with RSDs <18%. Linearity, recovery, precision, matrix effect and LODs/LOQs of each method were studied for all the analytes: endosulfan isomers (α & β) and its metabolites (endosulfan sulfate, ether and diol) as well as chlorpyrifos.
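
    The validation figures quoted (recoveries of 72-109%, RSDs below 18%) follow from the standard definitions, which can be sketched as:

    ```python
    from statistics import mean, stdev

    def recovery_pct(measured_conc, spiked_conc):
        """Percent recovery of an analyte spiked into the matrix."""
        return 100.0 * measured_conc / spiked_conc

    def rsd_pct(replicate_concs):
        """Relative standard deviation (precision) across replicate measurements."""
        return 100.0 * stdev(replicate_concs) / mean(replicate_concs)
    ```

    A method is typically accepted when recovery falls within an agreed window (often 70-120%) and the RSD stays below a set ceiling at each spiking level.
    
    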

  15. Combining Multiple Measures of Students' Opportunities to Develop Analytic, Text-Based Writing Skills

    Science.gov (United States)

    Correnti, Richard; Matsumura, Lindsay Clare; Hamilton, Laura S.; Wang, Elaine

    2012-01-01

    Guided by evidence that teachers contribute to student achievement outcomes, researchers have been reexamining how to study instruction and the classroom opportunities teachers create for students. We describe our experience measuring students' opportunities to develop analytic, text-based writing skills. Utilizing multiple methods of data…

  16. Development of an Analytical System for Rapid, Remote Determining Concentration and Valence of Uranium and Plutonium

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Concentrations and valence of U and Pu directly show whether the Purex process is under normal conditions or not. It is necessary to monitor the concentrations and valence of U and Pu in real time. The purpose of this work is to develop an analytical

  17. Analytical approach to developing the transport threshold models of neoclassical tearing modes in tokamaks

    International Nuclear Information System (INIS)

    Analytical solutions of the stationary conduction equation are obtained. The solutions are used for developing transport threshold models (TTM) of neoclassical tearing modes (NTM) in tokamaks. The following TTM are considered: collisional, convective, inertial and rotational. These TTM may be fragments of more general models of NTM

  18. Development of analytical methods for the separation of plutonium, americium, curium and neptunium from environmental samples

    OpenAIRE

    Salminen, Susanna

    2009-01-01

    In this work, separation methods have been developed for the analysis of anthropogenic transuranium elements plutonium, americium, curium and neptunium from environmental samples contaminated by global nuclear weapons testing and the Chernobyl accident. The analytical methods utilized in this study are based on extraction chromatography. Highly varying atmospheric plutonium isotope concentrations and activity ratios were found at both Kurchatov (Kazakhstan), near the former Semipalatinsk...

  19. COSTMODL: An automated software development cost estimation tool

    Science.gov (United States)

    Roush, George B.

    1991-01-01

    The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, in both the public and private sectors. As this trend continues, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms, including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
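
    The abstract does not publish COSTMODL's five equations, but cost models of this family typically take a COCOMO-like power-law form, effort = a * KLOC^b, and "recalibration" amounts to refitting the multiplier against an organization's own completed projects. A hedged sketch (the coefficients shown are basic COCOMO organic-mode defaults, not COSTMODL's actual values):

    ```python
    def effort_person_months(kloc, a=2.4, b=1.05):
        """COCOMO-style power-law effort estimate; a and b are placeholders."""
        return a * kloc ** b

    def recalibrate_a(observed_efforts, klocs, b=1.05):
        """Refit the multiplicative coefficient a by least squares through
        the origin, using historical (effort, size) pairs."""
        num = sum(e * k ** b for e, k in zip(observed_efforts, klocs))
        den = sum(k ** (2 * b) for k in klocs)
        return num / den
    ```

    Feeding the refit coefficient back into the estimator is exactly the kind of environment-specific customization the abstract describes.
    
    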

  20. Recent developments in short-lived nuclide activation analysis and analytical efficiency

    International Nuclear Information System (INIS)

    In various applications of neutron activation analysis, wide ranges of element concentrations and nuclide half-lives, overlapping peaks and other interferences in the gamma spectrum, and requests for isotope abundance determination, as well as other special problems, are encountered. These led to the development of a flexible analytical system for optimizing and differentiating the experimental conditions in order to solve these multiparameter problems properly. The new features were introduced mainly in the analysis of short-lived nuclides with high throughput capability, also enhancing the analytical efficiency and broadening the application range of neutron activation analysis. (author) 4 refs.; 6 figs
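
    Analysis of short-lived nuclides hinges on correcting measured peak areas for decay during the wait before counting and over the counting interval itself. A standard correction, sketched under the assumption of negligible dead time:

    ```python
    import math

    def decay_corrected_counts(net_counts, half_life_s, t_wait_s, t_count_s):
        """Refer measured counts back to the end of irradiation.

        Corrects for decay during the wait before counting and for the
        average decay over the counting window.
        """
        lam = math.log(2.0) / half_life_s
        wait_factor = math.exp(-lam * t_wait_s)
        # expm1 keeps the counting-window factor numerically stable
        # when the nuclide is long-lived relative to the count time
        count_factor = -math.expm1(-lam * t_count_s) / (lam * t_count_s)
        return net_counts / (wait_factor * count_factor)
    ```

    For a nuclide with a 60 s half-life counted for 60 s after a 60 s wait, the correction roughly triples the raw counts; for long-lived nuclides both factors approach 1.
    
    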

  1. Development of Distributed Cache Strategy for Analytic Cluster in an Internet of Things System

    OpenAIRE

    Yang ZHOU

    2016-01-01

    This thesis discusses the development of a distributed cache strategy for an analytic cluster in an IoT system. LRU, proactive caching and essential distributed-system concepts are discussed, and approaches to performance optimization and to distributing nodes and data in the IoT system are also studied. In the IoT system, the cluster for data analysis involves a large volume of data, and some specific processes such as stream processing raise a need ...
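
    LRU, one of the caching strategies the abstract names, can be sketched in a few lines with an ordered map. This is a generic illustration of the eviction policy, not the thesis's distributed implementation:

    ```python
    from collections import OrderedDict

    class LRUCache:
        """Fixed-capacity cache that evicts the least recently used entry."""

        def __init__(self, capacity):
            self.capacity = capacity
            self._store = OrderedDict()

        def get(self, key):
            if key not in self._store:
                return None
            self._store.move_to_end(key)  # mark as most recently used
            return self._store[key]

        def put(self, key, value):
            if key in self._store:
                self._store.move_to_end(key)
            self._store[key] = value
            if len(self._store) > self.capacity:
                self._store.popitem(last=False)  # evict least recently used
    ```

    In a distributed setting the same policy is applied per node, with a partitioning scheme (e.g. consistent hashing) deciding which node holds which key.
    
    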

  2. Selenium isotope studies in plants. Development and validation of a novel geochemical tool and its application to organic samples

    Energy Technology Data Exchange (ETDEWEB)

    Banning, Helena

    2016-03-12

    Selenium (Se), being an essential nutrient and a toxin, enters the food chain mainly via plants. Selenium isotope signatures were proved to be an excellent redox tracer, making it a promising tool for the exploration of the Se cycle in plants. The analytical method is sensitive on organic samples and requires particular preparation methods, which were developed and validated in this study. Plant cultivation setups revealed the applicability of these methods to trace plant internal processes.

  3. Selenium Isotope Studies in Plants - Development and Validation of a Novel Geochemical Tool and its Application to Organic Samples

    OpenAIRE

    Banning, Helena

    2016-01-01

    Selenium (Se), being an essential nutrient and a toxin, enters the food chain mainly via plants. Selenium isotope signatures have proved to be an excellent redox tracer, making them a promising tool for exploring the Se cycle in plants. The analytical method is sensitive to organic sample matrices and requires particular preparation methods, which were developed and validated in this study. Plant cultivation setups demonstrated the applicability of these methods for tracing plant-internal processes.

  4. Selenium isotope studies in plants. Development and validation of a novel geochemical tool and its application to organic samples

    International Nuclear Information System (INIS)

    Selenium (Se), being an essential nutrient and a toxin, enters the food chain mainly via plants. Selenium isotope signatures have proved to be an excellent redox tracer, making them a promising tool for exploring the Se cycle in plants. The analytical method is sensitive to organic sample matrices and requires particular preparation methods, which were developed and validated in this study. Plant cultivation setups demonstrated the applicability of these methods for tracing plant-internal processes.

  5. The Development of a Climate Time Line Information Tool

    Science.gov (United States)

    Kowal, D.; McCaffery, M.; Anderson, D.; Habermann, D. E.

    2001-12-01

    The "Climate Time Line" (CTL) tool currently in development at the National Geophysical Data Center will provide a climatic and "place-based" context for current weather patterns and a pre-instrumental context for current climate trends. Two audiences, GLOBE students and water managers involved with the Western Water Assessment, are targeted in the pilot project phase to test the CTL as a learning and decision-making support tool. Weather, climate and paleoclimatic observations will be integrated through a web-based interface that can be used for comparing data collected over 10-year, 100-year and 1000+-year periods, and made accessible and meaningful to non-technical users. The Climate Time Line prototype will include the following features: 1) access to diverse data sets such as NCDC's Historic Climate Network, the GLOBE Student Data Archive, the World Data Center for Paleoclimatology and historical streamflow data from the USGS; 2) a map locator/search utility for regional inquiries and comparison views; 3) varying temporal and spatial displays; 4) tutorial and help sections to guide and support users; 5) supporting materials, including a "Powers of Ten" primer examining variability at various timescales; and 6) statistical assessment tools. The CTL prototype offers a novel approach to the scientific analysis of climate and hydrology data. It will facilitate inquiries by simplifying access to environmental data. Additionally, it will provide historical timelines that let the intended user compare the development of human cultures with climate trends and variability, promoting an inquiry-rich learning environment. Throughout the pilot project phase, the CTL will undergo evaluation, particularly of its usability, followed by a pre- and post-assessment of its educational impact on the targeted non-technical audience. A HyperNews workspace has been created to facilitate the development of the CTL: http://HyperNews.ngdc.noaa.gov/HyperNews/get/ClimateTimelineProject.html

  6. Analytical quality assurance procedures developed for the IAEA's Reference Asian Man Project (Phase 2)

    International Nuclear Information System (INIS)

    Analytical quality assurance procedures adopted for use in the IAEA Co-ordinated Research Project on Ingestion and Organ Content of Trace Elements of Importance in Radiological Protection are designed to ensure comparability of the analytical results for Cs, I, Sr, Th, U and other elements in human tissues and diets collected and analysed in nine participating countries. The main analytical techniques are NAA and ICP-MS. For sample preparation, all participants are using identical food blenders which have been centrally supplied after testing for contamination. For quality control of the analyses, six NIST SRMs covering a range of matrices with certified and reference values for the elements of interest have been distributed. A new Japanese reference diet material has also been developed. These quality assurance procedures are summarized here and new data are presented for Cs, I, Sr, Th and U in the NIST SRMs. (author)

  7. NEEMO 20: Science Training, Operations, and Tool Development

    Science.gov (United States)

    Graff, T.; Miller, M.; Rodriguez-Lanetty, M.; Chappell, S.; Naids, A.; Hood, A.; Coan, D.; Abell, P.; Reagan, M.; Janoiko, B.

    2016-01-01

    The 20th mission of the National Aeronautics and Space Administration (NASA) Extreme Environment Mission Operations (NEEMO) was a highly integrated evaluation of operational protocols and tools designed to enable future exploration beyond low-Earth orbit. NEEMO 20 was conducted from the Aquarius habitat off the coast of Key Largo, FL in July 2015. The habitat and its surroundings provide a convincing analog for space exploration. A crew of six (comprised of astronauts, engineers, and habitat technicians) lived and worked in and around the unique underwater laboratory over a mission duration of 14 days. Incorporated into NEEMO 20 was a diverse Science Team (ST) comprised of geoscientists from the Astromaterials Research and Exploration Science (ARES/XI) Division of the Johnson Space Center (JSC), as well as marine scientists from the Department of Biological Sciences at Florida International University (FIU). This team trained the crew on the science to be conducted, defined sampling techniques and operational procedures, and planned and coordinated the science-focused Extra Vehicular Activities (EVAs). The primary science objective of NEEMO 20 was to study planetary sampling techniques and tools in partial-gravity environments under realistic mission communication time delays and operational pressures. To meet this objective, two types of science sites were employed: 1) geoscience sites with available rocks and regolith for testing sampling procedures and tools, and 2) marine science sites dedicated to specific research focused on assessing the photosynthetic capability of corals and their genetic connectivity between deep and shallow reefs. These marine sites and their associated research objectives included deployment of handheld instrumentation, context descriptions, imaging, and sampling, and thus acted as a suitable proxy for planetary surface exploration activities. This abstract briefly summarizes the scientific training, scientific operations, and tool

  8. WP3 Prototype development for operational planning tool

    Energy Technology Data Exchange (ETDEWEB)

    Kristoffersen, T.; Meibom, P. (Technical Univ. of Denmark. Risoe DTU (Denmark)); Apfelbeck, J.; Barth, R.; Brand, H. (IER, Univ. of Stuttgart (Germany))

    2008-04-15

    This report documents the model development carried out in work package 3 of the SUPWIND project. It was decided to focus on the estimation of the need for reserve power, and on the reservation of reserve power by TSOs. Reserve power is needed to cover deviations from the day-ahead forecasts of electricity load and wind power production, and to cover forced outages of power plants and transmission lines. Work has been carried out to include load uncertainty and forced outages in the two main components of the Wilmar Planning tool, namely the Scenario Tree Tool and the Joint Market Model; this work is documented in chapters 1 and 2. The inclusion of load uncertainty and forced outages in the Scenario Tree Tool enables calculation of the demand for reserve power depending on the forecast horizon. The algorithm is given in Section 3.1. The design of a modified version of the Joint Market Model, enabling estimation of the optimal amount of reserve power to reserve day-ahead before the actual operation hour, is documented in Section 3.2. When evaluating a power system, it is crucial to investigate its ability to cope with extreme events. Chapter 4 gives a definition of such extreme events and describes the methodology for identifying them on the basis of the existing tools. Within the SUPWIND consortium there has been an interest in using the Joint Market Model to model smaller parts of a power system but with a more detailed representation of the transmission and distribution grid. Chapter 5 documents this work. (author)
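
    The reserve-demand calculation described above sizes reserves against forecast errors and forced outages. A heavily simplified sketch of the idea (hypothetical; the Scenario Tree Tool's actual algorithm is stochastic and scenario-based): take a high quantile of the combined load and wind forecast-error distribution for the horizon of interest, and add the largest credible unit outage.

```python
def reserve_requirement(load_errors_mw, wind_errors_mw, largest_unit_mw, quantile=0.99):
    """Hypothetical reserve sizing: cover a high quantile of the combined
    load + wind forecast error, plus the largest credible unit outage.

    load_errors_mw / wind_errors_mw: historical forecast-error samples (MW)
    for the same forecast horizon, paired by time step.
    """
    combined = sorted(l + w for l, w in zip(load_errors_mw, wind_errors_mw))
    idx = min(int(quantile * len(combined)), len(combined) - 1)
    return max(combined[idx], 0.0) + largest_unit_mw
```

    A scenario-tree approach replaces the single quantile with explicit probability-weighted scenarios, but the sizing intuition, larger errors at longer horizons imply larger reserves, is the same.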

  9. An Analytic Approach to Developing Transport Threshold Models of Neoclassical Tearing Modes in Tokamaks

    International Nuclear Information System (INIS)

    Transport threshold models of neoclassical tearing modes in tokamaks are investigated analytically. An analysis is made of the competition between strong transverse heat transport, on the one hand, and longitudinal heat transport, longitudinal heat convection, longitudinal inertial transport, and rotational transport, on the other hand, which leads to the establishment of the perturbed temperature profile in magnetic islands. It is shown that, in all these cases, the temperature profile can be found analytically by using rigorous solutions to the heat conduction equation in the near and far regions of a chain of magnetic islands and then by matching these solutions. Analytic expressions for the temperature profile are used to calculate the contribution of the bootstrap current to the generalized Rutherford equation for the island width evolution with the aim of constructing particular transport threshold models of neoclassical tearing modes. Four transport threshold models, differing in the underlying competing mechanisms, are analyzed: collisional, convective, inertial, and rotational models. The collisional model constructed analytically is shown to coincide exactly with that calculated numerically; the reason is that the analytical temperature profile turns out to be the same as the numerical profile. The results obtained can be useful in developing the next generation of general threshold models. The first steps toward such models have already been made
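
    The island-width evolution discussed above is commonly written in a generalized Rutherford form. As a hedged illustration (schematic only; the coefficients and the exact shape of the threshold term differ between the four models), the bootstrap-modified equation is often quoted as:

```latex
% Schematic modified Rutherford equation (illustrative form, not the
% authors' exact expression): classical tearing drive plus a bootstrap
% term regularized by a transport-threshold width W_d.
\frac{\tau_R}{r_s^2}\,\frac{dW}{dt}
  = \Delta'(W)
  + a_{bs}\,\sqrt{\varepsilon}\,\beta_p\,\frac{L_q}{L_p}\,
    \frac{W}{W^2 + W_d^2}
```

    Here W is the island width, tau_R the resistive time, Delta' the classical stability index, beta_p the poloidal beta, and W_d the threshold width set by the competing transport mechanism; the collisional, convective, inertial and rotational models differ essentially in how W_d scales.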

  10. Determination of glycols in air: development of sampling and analytical methodology and application to theatrical smokes.

    Science.gov (United States)

    Pendergrass, S M

    1999-01-01

    Glycol-based fluids are used in the production of theatrical smokes in theaters, concerts, and other stage productions. The fluids are heated and dispersed in aerosol form to create the effect of a smoke, mist, or fog. There have been reports of adverse health effects such as respiratory irritation, chest tightness, shortness of breath, asthma, and skin rashes. Previous attempts to collect and quantify the aerosolized glycols used in fogging agents have been plagued by inconsistent results, both in the efficiency of collection and in the chromatographic analysis of the glycol components. The development of improved sampling and analytical methodology for aerosolized glycols was required to assess workplace exposures more effectively. An Occupational Safety and Health Administration versatile sampler tube was selected for the collection of ethylene glycol, propylene glycol, 1,3-butylene glycol, diethylene glycol, triethylene glycol, and tetraethylene glycol aerosols. Analytical methodology for the separation, identification, and quantitation of the six glycols using gas chromatography/flame ionization detection is described. Limits of detection of the glycol analytes ranged from 7 to 16 micrograms/sample. Desorption efficiencies for all glycol compounds were determined over the range of study and averaged greater than 90%. Storage stability results were acceptable after 28 days for all analytes except ethylene glycol, which was stable at ambient temperature for 14 days. Based on the results of this study, the new glycol method was published in the NIOSH Manual of Analytical Methods. PMID:10462779

  11. Development and Evaluation of a Riparian Buffer Mapping Tool

    Science.gov (United States)

    Milheim, Lesley E.; Claggett, Peter R.

    2008-01-01

    Land use and land cover within riparian areas greatly affect the conditions of adjacent water features. In particular, riparian forests provide many environmental benefits, including nutrient uptake, bank stabilization, stream shading, sediment trapping, aquatic and terrestrial habitat, and stream organic matter. In contrast, residential and commercial development and the associated transportation infrastructure increase pollutant and nutrient loading and change the hydrologic characteristics of the landscape, thereby affecting both water quality and habitat. Restoring riparian areas is a popular and cost-effective restoration technique to improve and protect water quality. Recognizing this, the Chesapeake Executive Council committed to restoring 10,000 miles of riparian forest buffers throughout the Chesapeake Bay watershed by the year 2010. In 2006, the Chesapeake Executive Council further committed to 'using the best available...tools to identify areas where retention and expansion of forests is most needed to protect water quality'. The Chesapeake Bay watershed encompasses 64,000 square miles, including portions of six States and Washington, D.C. Therefore, the interpretation of remotely sensed imagery provides the only effective technique for comprehensively evaluating riparian forest protection and restoration opportunities throughout the watershed. Although 30-meter-resolution land use and land cover data have proved useful on a regional scale, they have not been equally successful at providing the detail required for local-scale assessment of riparian area characteristics. Use of high-resolution imagery (HRI) provides sufficient detail for local-scale assessments, although at greater cost owing to the cost of the imagery and the skill and time required to process the data. To facilitate the use of HRI for monitoring the extent of riparian forest buffers, the U.S. Forest Service and the U.S. Geological Survey Eastern Geographic Science Center funded the

  12. Effective Management Tools in Implementing Operational Programme Administrative Capacity Development

    Directory of Open Access Journals (Sweden)

    Carmen – Elena DOBROTĂ

    2015-12-01

    Full Text Available Public administration in Romania and the administrative capacity of central and local government have undergone significant progress since 2007. The development of administrative capacity involves a set of structural and process changes that allow governments to improve the formulation and implementation of policies in order to achieve enhanced results. Identifying, developing and using management tools for the proper implementation of an operational programme dedicated to consolidating a performing public administration was a challenging task, taking into account the types of interventions within Operational Programme Administrative Capacity Development 2007-2013 and the continuous changes in the economic and social environment in Romania and Europe. The aim of this article is to provide a short description of the approach used by the Managing Authority for OPACD in the performance management of the structural funds in Romania between 2008 and 2014. The paper offers a broad picture of the way in which evaluations (ad hoc, intermediate and performance) were used in different stages of OP implementation as a management tool.

  13. Development of remote control Java tool for PLC

    International Nuclear Information System (INIS)

    In the J-PARC control system, machinery having a network controller (NTC) is operated through EPICS. An NTC can also connect directly to an upper-level computer, whereas in EPICS an IOC mediates the communication between the upper-level computer and the NTC. In case of a communication error, one must investigate its source, including whether it is due to the NTC. It is therefore very useful to be able to switch between the EPICS connection and the direct connection on an operation screen. We report on the development of a tool realizing such functionality. (author)

  14. Solar Energy Harvesting in WSN with Development Tool

    OpenAIRE

    Guo, Jiabing

    2011-01-01

    The aim of this thesis was to introduce solar energy and WSNs, and to show how a solar energy harvesting development tool helps create a perpetually powered wireless sensor network and provides enough power to run a wireless sensor application with no additional batteries. Solar energy is radiant light and heat from the sun. WSNs are widely used in modern society. In this project these two areas were combined, and the use of solar energy harvesting in a wireless...

  15. Development of a biogas planning tool for project owners

    DEFF Research Database (Denmark)

    Fredenslund, Anders Michael; Kjær, Tyge

    A spreadsheet model was developed, which can be used as a tool in the initial phases of planning a centralized biogas plant in Denmark. The model assesses energy production, total plant costs, operational costs and revenues, and the effect on greenhouse gas emissions. Two energy utilization alternatives are considered: combined heat and power, and natural gas grid injection. The main input to the model is the amount and types of substrates available for anaerobic digestion. By substituting the model's default values with more project-specific information, the model can be used in biogas projects...
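
    A first-pass energy estimate of the kind such a planning tool performs can be sketched as follows. All numbers here are illustrative placeholders, not values from the model described above (substrate-specific methane yields in particular vary widely):

```python
# Hypothetical first-pass energy estimate for a centralized biogas plant.
# Methane yields (m^3 CH4 per tonne of substrate) are illustrative defaults.
METHANE_YIELD_M3_PER_T = {
    "manure": 14.0,
    "maize_silage": 100.0,
    "industrial_waste": 60.0,
}
LHV_CH4_KWH_PER_M3 = 9.97  # lower heating value of methane

def annual_energy_mwh(substrates_tonnes_per_year):
    """Gross annual energy (MWh) from a dict of {substrate: tonnes/year}."""
    ch4_m3 = sum(METHANE_YIELD_M3_PER_T[s] * t
                 for s, t in substrates_tonnes_per_year.items())
    return ch4_m3 * LHV_CH4_KWH_PER_M3 / 1000.0
```

    Costs, revenues and greenhouse-gas effects would be layered on top of this energy balance in the same way the spreadsheet's default values can be replaced by project-specific figures.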

  16. Recent developments in analytical techniques for characterization of ultra pure materials—An overview

    Indian Academy of Sciences (India)

    V Balaram

    2005-07-01

    With the continual decrease of geometries used in modern IC devices, the trace metal impurities of the process materials and chemicals used in their manufacture are moving to increasingly lower levels, i.e. ng/g and pg/g levels. An attempt is made to give a brief overview of the use of different analytical techniques in the analysis of trace metal impurities in ultrapure materials, such as high-purity tellurium (7N), high-purity quartz, high-purity copper (6N), and high-purity water and mineral acids. In recent times, mass spectrometric techniques such as ICP–MS, GD–MS and HR–ICP–MS, with their characteristic high sensitivity and low interference effects, have proved extremely useful in this field. A few examples of application studies using these techniques are outlined. The usefulness of other analytical techniques such as F–AAS, GF–AAS, XRF, ICP–AES and INAA is also described. Specific advantages of ICP–MS and HR–ICP–MS, such as high sensitivity, limited interference effects, element coverage and speed, should make them powerful analytical tools for the characterization of ultrapure materials in the future.

  17. Electrospray ionization and matrix assisted laser desorption/ionization mass spectrometry: powerful analytical tools in recombinant protein chemistry

    DEFF Research Database (Denmark)

    Andersen, Jens S.; Svensson, B; Roepstorff, P

    1996-01-01

    Electrospray ionization and matrix assisted laser desorption/ionization are effective ionization methods for mass spectrometry of biomolecules. Here we describe the capabilities of these methods for peptide and protein characterization in biotechnology. An integrated analytical strategy is...

  18. Evaluation of Four Different Analytical Tools to Determine the Regional Origin of Gastrodia elata and Rehmannia glutinosa on the Basis of Metabolomics Study

    OpenAIRE

    Dong-Kyu Lee; Dong Kyu Lim; Jung A. Um; Chang Ju Lim; Ji Yeon Hong; Young A Yoon; Yeonsuk Ryu; Hyo Jin Kim; Hi Jae Cho; Jeong Hill Park; Young Bae Seo; Kyunga Kim; Johan Lim; Sung Won Kwon; Jeongmi Lee

    2014-01-01

    Chemical profiles of medicinal plants could be dissimilar depending on the cultivation environments, which may influence their therapeutic efficacy. Accordingly, the regional origin of the medicinal plants should be authenticated for correct evaluation of their medicinal and market values. Metabolomics has been found very useful for discriminating the origin of many plants. Choosing the adequate analytical tool can be an essential procedure because different chemical profiles with different d...

  19. Development of analytical methods for multiplex bio-assay with inductively coupled plasma mass spectrometry

    OpenAIRE

    Ornatsky, Olga I.; Kinach, Robert; Bandura, Dmitry R.; Lou, Xudong; Tanner, Scott D; Baranov, Vladimir I.; Nitz, Mark; Mitchell A. Winnik

    2008-01-01

    Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. Use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for...

  20. Simulation of spin dynamics: a tool in MRI system development

    International Nuclear Information System (INIS)

    Magnetic Resonance Imaging (MRI) is a routine diagnostic tool in the clinic and the method of choice in soft-tissue-contrast medical imaging. It is an important tool in neuroscience for investigating the structure and function of the living brain on a systemic level. The latter is one of the driving forces behind further development of MRI technology, as neuroscience especially demands higher spatiotemporal resolution, which is to be achieved by increasing the static main magnetic field, B0. Although standard MRI is a mature technology, ultra-high-field (UHF) systems, at B0 ≥ 7 T, offer space for new technical inventions, as the physical conditions change dramatically. This work shows that such development benefits strongly from computer simulations of the measurement process on the basis of a semi-classical, nuclear spin-1/2 treatment given by the Bloch equations. Possible applications of such simulations are outlined, suggesting new solutions to the UHF-specific inhomogeneity problems of the static main field as well as of the high-frequency transmit field.
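
    The core of such a spin-dynamics simulator is small. The sketch below (illustrative only, not the authors' code) integrates the Bloch equations dM/dt = gamma (M x B) - relaxation with an explicit Euler step; gamma, T1, T2 and the field are toy values chosen for the example, not MRI constants.

```python
def bloch_step(m, b, gamma, t1, t2, m0, dt):
    """One explicit-Euler step of the Bloch equations:
    dM/dt = gamma * (M x B) - transverse/longitudinal relaxation.
    m, b are (x, y, z) tuples; m0 is the equilibrium z-magnetization."""
    mx, my, mz = m
    bx, by, bz = b
    cx = my * bz - mz * by  # (M x B) components
    cy = mz * bx - mx * bz
    cz = mx * by - my * bx
    return (mx + dt * (gamma * cx - mx / t2),
            my + dt * (gamma * cy - my / t2),
            mz + dt * (gamma * cz + (m0 - mz) / t1))
```

    With relaxation switched off (very large T1, T2), a transverse magnetization precesses about a z-directed field at angular frequency gamma*B0, which gives an easy analytic check of the integrator; production simulators use better integrators (e.g. rotation operators) for exactly this reason.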

  1. Development of the Central Dogma Concept Inventory (CDCI) Assessment Tool.

    Science.gov (United States)

    Newman, Dina L; Snyder, Christopher W; Fisk, J Nick; Wright, L Kate

    2016-01-01

    Scientific teaching requires scientifically constructed, field-tested instruments to accurately evaluate student thinking and gauge teacher effectiveness. We have developed a 23-question, multiple-select-format assessment of student understanding of the essential concepts of the central dogma of molecular biology that is appropriate for all levels of undergraduate biology. Questions for the Central Dogma Concept Inventory (CDCI) tool were developed and iteratively revised based on student language and review by experts. The ability of the CDCI to discriminate between levels of understanding of the central dogma is supported by field testing (N = 54) and large-scale beta testing (N = 1733). Performance on the assessment increased with experience in biology; scores covered a broad range and showed no ceiling effect, even with senior biology majors, and pre/post-testing of a single class focused on the central dogma showed significant improvement. The multiple-select format reduces the chance of correct answers by random guessing, allows students at different levels to exhibit the extent of their knowledge, and provides deeper insight into the complexity of student thinking on each theme. To date, the CDCI is the first tool dedicated to measuring student thinking about the central dogma of molecular biology, and version 5 is ready to use. PMID:27055775

  2. Analytical approaches for assaying metallodrugs in biological samples: recent methodological developments and future trends.

    Science.gov (United States)

    Timerbaev, Andrei; Sturup, Stefan

    2012-03-01

    Contemporary medicine increasingly relies on metal-based drugs, and monitoring of these drugs and their metabolites in biological samples is correspondingly growing in importance. Over the last decade, a range of analytical techniques has been developed to improve administration strategies for clinically approved compounds and to understand the pharmacokinetics, pharmacodynamics, and metabolism of new drugs, ultimately making their clinical development more effective. This paper gives an overview of the various separation and detection methods, as well as common sample preparation strategies, currently in use to achieve these goals. The critical discussion of existing analytical technologies covers notably their detection capability, ability to handle biological matrices with minimal pretreatment, sample throughput, and cost efficiency. The main attention is devoted to applications that have progressed to real-world biosamples, and selected examples are given to illustrate the overall performance and applicability of advanced analytical systems. Also emphasized is the emerging role of inductively coupled plasma mass spectrometry (ICP-MS), both as a standalone instrument (for determination of metals originating from drug compounds) and as an element-specific detector in combination with liquid chromatography or capillary electrophoresis (for drug metabolism studies). An increasing number of academic laboratories are using ICP-MS technology today, and this review focuses on the analytical possibilities of ICP-MS, which may before long give the method the greatest impact on the clinical laboratory. PMID:21838702

  3. Development of tools and models for computational fracture assessment

    International Nuclear Information System (INIS)

    The aim of the work presented in this paper has been to develop and test new computational tools and theoretically sounder methods for fracture mechanics analysis. The applicability of the engineering integrity assessment system MASI for the evaluation of piping components has been extended. The most important motivation for the theoretical development has been the well-known fundamental limitations in the validity of the J-integral, which restrict its applicability in many important practical safety assessment cases. Examples are extensive plastic deformation, multimaterial structures and ascending loading paths (especially warm prestress, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to the prediction of the ductile failure behaviour of cracked structures. (author)
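
    As context for the micromechanical model mentioned above, the Gurson-Tvergaard-Needleman yield surface in its standard textbook form (not necessarily the exact variant used in the study) couples the macroscopic stress state to the void volume fraction:

```latex
% Gurson-Tvergaard-Needleman yield function (standard form):
% sigma_eq = von Mises equivalent stress, sigma_m = mean (hydrostatic) stress,
% sigma_y  = matrix yield stress, f* = effective void volume fraction,
% q_1, q_2, q_3 = Tvergaard fitting parameters.
\Phi = \left(\frac{\sigma_{eq}}{\sigma_y}\right)^{2}
     + 2\,q_1 f^{*} \cosh\!\left(\frac{3\,q_2\,\sigma_m}{2\,\sigma_y}\right)
     - \left(1 + q_3\,{f^{*}}^{2}\right) = 0
```

    Calibrating q_1, q_2, q_3 and the void nucleation and growth parameters against smooth tensile tests, and checking whether the same parameter set predicts failure of cracked geometries, is precisely the transferability question the abstract raises.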

  4. Development of a clinically applicable tool for bone density assessment

    International Nuclear Information System (INIS)

    To assess the accuracy and reliability of new software for radiodensitometric evaluations. A densitometric tool developed with MevisLab® was used in conjunction with intraoral radiographs of the premolar region in both in vivo and laboratory settings. An aluminum step wedge was utilized for comparison of grey values. After computer-aided segmentation, the interproximal bone between the premolars was assessed in order to determine the mean grey-value intensity of this region and convert it to a thickness in aluminum. Evaluation of the tool was based on bone mineral density (BMD) values derived from decalcified human bone specimens as a reference standard. In vivo BMD data were collected from 35 patients as determined with dual X-ray absorptiometry (DXA). The intra- and interobserver reliability of the method was assessed with Bland-Altman plots to determine the precision of the tool. In the laboratory study, the threshold value for detection of bone loss was 6.5%. The densitometric data (mm Al eq.) were highly correlated with the jaw bone BMD, as determined using DXA (r=0.96). For the in vivo study, the correlations of the mm Al equivalent of the average upper and lower jaw with the lumbar spine BMD, total hip BMD and femoral neck BMD were 0.489, 0.537 and 0.467, respectively (P<0.05). For the intraobserver reliability, a Bland-Altman plot showed that the mean difference ±1.96 SD was within ±0.15 mm Al eq., with a mean difference smaller than 0.003 mm Al eq. For the interobserver reliability, the mean difference ±1.96 SD was within ±0.11 mm Al eq., with a mean difference of 0.008 mm Al eq. A densitometric software tool has been developed that is reliable for bone density assessment. It now requires further investigation to evaluate its accuracy and clinical applicability in large-scale studies. (orig.)
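
    The Bland-Altman limits of agreement used above (mean difference ± 1.96 SD of the paired differences) are straightforward to compute; a minimal sketch, with illustrative function and variable names:

```python
import math

def bland_altman_limits(method_a, method_b):
    """Mean difference and 95% limits of agreement (mean +/- 1.96 SD)
    between paired measurements from two observers or methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))  # sample SD
    return mean, mean - 1.96 * sd, mean + 1.96 * sd
```

    Plotting the per-pair differences against the per-pair means, with horizontal lines at these three values, gives the Bland-Altman plot referred to in the abstract.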

  5. Tools for tracking progress. Indicators for sustainable energy development

    International Nuclear Information System (INIS)

    A project on 'Indicators for Sustainable Energy Development (ISED)' was introduced by the IAEA as a part of its work programme on Comparative Assessment of Energy Sources for the biennium 1999-2000. It is being pursued by the Planning and Economic Studies Section of the Department of Nuclear Energy. The envisaged tasks are to: (1) identify the main components of sustainable energy development and derive a consistent set of appropriate indicators, keeping in view the indicators for Agenda 21, (2) establish the relationship of the ISED to those of Agenda 21, and (3) review the Agency's databases and tools to determine the modifications required to apply the ISED. The first two tasks are being pursued with the help of experts from various international organizations and Member States. In this connection two expert group meetings were held, one in May 1999 and the other in November 1999. The following nine topics were identified as the key issues: social development; economic development; environmental congeniality and waste management; resource depletion; adequate provision of energy and disparities; energy efficiency; energy security; energy supply options; and energy pricing. A new conceptual framework model specifically tuned to the energy sector was developed, drawing upon work by other organizations in the environmental area. Within the framework of this conceptual model, two provisional lists of ISED, a full list and a core list, have been prepared. They cover indicators for the following energy-related themes and sub-themes under the economic, social and environmental dimensions of sustainable energy development: Economic dimension: economic activity levels; end-use energy intensities of selected sectors and different manufacturing industries; energy supply efficiency; energy security; and energy pricing. Social dimension: energy accessibility and disparities. Environmental dimension: air pollution (urban air quality; global climate change concern); water

  6. ANALYTICAL SUPPORT OF REQUIREMENTS DEVELOPMENT FOR INTELLIGENT E-LEARNING SYSTEMS

    OpenAIRE

    Pishchukhina, O.

    2013-01-01

    The problems of analyzing and modeling requirements during the development of e-learning systems with feedback are considered. The subject area of requirements development is determined, and the choice of CASE tools for creating models of this subject area is explained. Visual models of functional requirements and user requirements are developed in the form of diagrams showing the behavior and the logical structure of the system. Simulation models explain the events and the correspondi...

  7. Development of software tools for exposure assessment in urban areas

    International Nuclear Information System (INIS)

    The main goal of this work is to present a software tool for radio frequency exposure assessment which integrates propagation models applied in different environments, dosimetric data sets recorded by personal exposure meters, geographic information systems and web technologies. This tool aims to complement the deployment of a network of electromagnetic field monitoring stations in the city of Valladolid for the control of E-field levels. Data sets of measurements were gathered in the city of Valladolid with a personal exposure meter in December 2006 and January 2007. A diary of measurements was maintained in which position, time, relatively close base stations and potential sources of interference were noted down. In addition, empirical propagation models were implemented by means of a cadastral map that was used to create 2D maps and 3D models of the city of Valladolid. A Java interface was developed to add simulation parameters and to manage the different layers of information. We have contrasted the simulated E-field with the dosimetric data, in both indoor and outdoor environments. Frequency-selective spot measurements were made with a triaxial isotropic probe and a portable spectrum analyzer. These measurements showed good agreement with the personal exposure meter and the electromagnetic field monitoring station in the 900 MHz band. In general, electromagnetic field exposure from base stations is low; the dosimeter's lowest threshold level (0.05 V/m) was generally reached only in regions of line-of-sight, near-line-of-sight and street canyons. Indoors, the main contributions were obtained in rooms with LOS to a base station. In conclusion, a better understanding of exposure to radio frequency can be reached by integrating different sources: electromagnetic field monitoring stations installed on flat rooftops, PEM data gathered at street level and propagation modelling tools. (author)

  8. On the development of the METAR family of inspection tools

    International Nuclear Information System (INIS)

    Since 1998, the Hydro-Québec Research Centre (IREQ), in collaboration with Gentilly-2, has been working on the development of inspection devices for the feeder tubes of CANDU power plants. The first tool to come out of this work was the METAR bracelet, now used throughout the CANDU utilities, consisting of 14 ultrasonic probes held in place in a rigid bracelet to measure the thickness of the pipes, moved manually along the pipe. Following the success of METAR, a motorized version, the Crawler, was developed to inspect beyond the operator's arm's reach and access hard-to-reach places further down the pipes in the reactor. This new system has been tested at 3 different stations and will be commercially available soon. Finally, the same technology was used to develop a motorized 2-axis crack detection device to answer new concerns about the feeders. Other configurations could also be developed for specific inspection needs as the industry demands, for example inspection of the Grayloc welds, 360° inspection of feeders, or multitasking inspection on a single frame. Most of the designs shown in this article have been or will be patented and are, or will be, licensed to a partner company to make them commercially available to the industry. This paper gives a brief history of the project and a description of the technologies developed in the last 5 years concerning feeder inspection. (author)

  9. Crosscutting Development - EVA Tools and Geology Sample Acquisition

    Science.gov (United States)

    2011-01-01

    Exploration to all destinations has at one time or another involved the acquisition and return of samples and context data. Gathered at the summit of the highest mountain, the floor of the deepest sea, or the ice of a polar surface, samples and their value (both scientific and symbolic) have been a mainstay of Earthly exploration. In manned spaceflight exploration, the gathering of samples and their contextual information has continued. With the extension of collecting activities to spaceflight destinations comes the need for geology tools and equipment uniquely designed for use by suited crew members in radically different environments from conventional field geology. Beginning with the first Apollo Lunar Surface Extravehicular Activity (EVA), EVA Geology Tools were successfully used to enable the exploration and scientific sample gathering objectives of the lunar crew members. These early designs were a step in the evolution of Field Geology equipment, and the evolution continues today. Contemporary efforts seek to build upon and extend the knowledge gained in not only the Apollo program but a wealth of terrestrial field geology methods and hardware that have continued to evolve since the last lunar surface EVA. This paper is presented with intentional focus on documenting the continuing evolution and growing body of knowledge for both engineering and science team members seeking to further the development of EVA Geology. Recent engineering development and field testing efforts of EVA Geology equipment for surface EVA applications are presented, including the 2010 Desert Research and Technology Studies (Desert RATs) field trial. An executive summary of findings will also be presented, detailing efforts recommended for exotic sample acquisition and pre-return curation development regardless of planetary or microgravity destination.

  10. Cleaner production - a tool for sustainable environmental development

    International Nuclear Information System (INIS)

    Industrial development and production with no regard for environmental impacts create water and air pollution, soil degradation, and large-scale global impacts such as acid rain, global warming and ozone depletion. To create more sustainable methods of industrial production, there needs to be a shift in attitudes away from control towards pollution prevention and management. Cleaner Production (CP) refers to a management process that seeks out and eliminates the causes of pollution, waste generation and resource consumption at their source through input reductions or substitutions, pollution prevention, internal recycling and more efficient production technology and processes for sustainable environmental development. The objective of cleaner production is to avoid generating pollution in the first place, which frequently cuts costs, reduces risks associated with liability, and identifies new market opportunities. Introducing cleaner production has become a goal to improve competitiveness through increased eco-efficiency. CP is a business strategy for enhancing productivity and environmental performance for overall socio-economic development. The environmental and economic benefits can only be achieved by implementing cleaner production tools. The CP assessment methodology is used to systematically identify and evaluate waste minimization opportunities and facilitate their implementation in industries. It refers to how goods and services are produced with the minimum environmental impact under present technological and economic limits. CP shares characteristics with many environmental management tools such as Environmental Assessment or Design for Environment by including them among the technological options for reducing material and energy intensiveness in production, as well as facilitating reuse through remanufacturing and recycling. It is thus an extension of the total quality management process. The CP program has been successfully implemented in

  11. Development of reactor design aid tool using virtual reality technology

    International Nuclear Information System (INIS)

    A new type of aid system for fusion reactor design, to which virtual reality (VR) visualization and sonification techniques are applied, is developed. This system provides an intuitive interaction environment in the VR space between the observer and the designed objects constructed by a conventional 3D computer-aided design (CAD) system. We have applied the design aid tool to the heliotron-type fusion reactor design activity FFHR2m [A. Sagara, S. Imagawa, O. Mitarai, T. Dolan, T. Tanaka, Y. Kubota, et al., Improved structure and long-life blanket concepts for heliotron reactors, Nucl. Fusion 45 (2005) 258-263] on the virtual reality system CompleXcope [Y. Tamura, A. Kageyama, T. Sato, S. Fujiwara, H. Nakamura, Virtual reality system to visualize and auralize numerical simulation data, Comp. Phys. Comm. 142 (2001) 227-230] of the National Institute for Fusion Science, Japan, and have evaluated its performance. The tool includes functions for transfer of the observer, translation and scaling of the objects, recording of operations, and interference checking

  12. Development of environmental tools for anopheline larval control

    Directory of Open Access Journals (Sweden)

    Mweresa Collins K

    2011-07-01

    Full Text Available Abstract Background Malaria mosquitoes spend a considerable part of their life in the aquatic stage, rendering them vulnerable to interventions directed at aquatic habitats. Recent successes of mosquito larval control have been reported using environmental and biological tools. Here, we report the effects of shading by plants and of biological control agents on the development and survival of anopheline and culicine mosquito larvae in man-made natural habitats in western Kenya. Trials consisted of environmental manipulation using locally available plants, the introduction of predatory fish and/or the use of Bacillus thuringiensis var. israelensis (Bti) in various combinations. Results Man-made habitats provided with shade from different crop species produced significantly fewer larvae than those without shade, especially for the malaria vector Anopheles gambiae. Larval control of the African malaria mosquito An. gambiae and other mosquito species was more effective in habitats where both predatory fish and Bti were applied than where the two biological control agents were administered independently. Conclusion We conclude that integrating environmental management techniques using shade-providing plants with predatory fish and/or Bti is an effective and sustainable tool for the control of malaria and other mosquito-borne disease vectors.

  13. Analytical Ultracentrifugation and Its Role in Development and Research of Therapeutical Proteins.

    Science.gov (United States)

    Liu, Jun; Yadav, Sandeep; Andya, James; Demeule, Barthélemy; Shire, Steven J

    2015-01-01

    The historical contributions of analytical ultracentrifugation (AUC) to modern biology and biotechnology drug development and research are discussed. AUC, developed by Svedberg, was used to show that proteins are actually large, defined molecular entities and also provided the first experimental verification of the semiconservative replication model for DNA initially proposed by Watson and Crick. This chapter reviews the use of AUC to investigate the molecular weight of recombinant-DNA-produced proteins, complex formation of antibodies, intermolecular interactions in dilute and high-concentration protein solutions, and their impact on physical properties such as solution viscosity. Recent studies using a "competitive binding" analysis by AUC have been useful in critically evaluating the design and interpretation of surface plasmon resonance measurements and are discussed. The future of this technology is also discussed, including prospects for a new higher-precision analytical ultracentrifuge. PMID:26412663
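    The molecular weight determinations this chapter traces back to Svedberg rest on a compact relation. As an illustrative aside (not drawn from the chapter itself), the Svedberg equation M = sRT / (D(1 - v̄ρ)) can be evaluated directly; every numerical value below is a textbook-style placeholder, not data from the work discussed:

    ```python
    # Hedged sketch of the Svedberg equation underlying molar mass
    # determination by AUC. All input values are illustrative placeholders.

    R = 8.314   # gas constant, J/(mol*K)
    T = 293.0   # assumed temperature, K

    def svedberg_molar_mass(s, D, vbar, rho):
        """Molar mass (kg/mol) from the sedimentation coefficient s (seconds),
        diffusion coefficient D (m^2/s), partial specific volume vbar (m^3/kg)
        and solvent density rho (kg/m^3)."""
        return s * R * T / (D * (1.0 - vbar * rho))

    # Typical mid-size protein: s = 4.4 S (4.4e-13 s), D = 5.9e-11 m^2/s,
    # vbar = 0.73e-3 m^3/kg, rho = 1000 kg/m^3 -> roughly 67 kg/mol (~67 kDa).
    M = svedberg_molar_mass(4.4e-13, 5.9e-11, 0.73e-3, 1000.0)
    ```

    The buoyancy factor (1 - v̄ρ) is what makes the measured sedimentation sensitive to solvent density, which is why careful density and v̄ values matter in practice.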

  14. Near infra red spectroscopy as a multivariate process analytical tool for predicting pharmaceutical co-crystal concentration.

    Science.gov (United States)

    Wood, Clive; Alwati, Abdolati; Halsey, Sheelagh; Gough, Tim; Brown, Elaine; Kelly, Adrian; Paradkar, Anant

    2016-09-10

    The use of near infra red spectroscopy to predict the concentration of two pharmaceutical co-crystals, 1:1 ibuprofen-nicotinamide (IBU-NIC) and 1:1 carbamazepine-nicotinamide (CBZ-NIC), has been evaluated. A partial least squares (PLS) regression model was developed for both co-crystal pairs using sets of standard samples to create calibration and validation data sets with which to build and validate the models. Parameters such as the root mean square error of calibration (RMSEC), root mean square error of prediction (RMSEP) and correlation coefficient were used to assess the accuracy and linearity of the models. Accurate PLS regression models were created for both co-crystal pairs which can be used to predict the co-crystal concentration in a powder mixture of the co-crystal and the active pharmaceutical ingredient (API). The IBU-NIC model had smaller errors than the CBZ-NIC model, possibly due to the complex CBZ-NIC spectra, which could reflect the different arrangement of hydrogen bonding associated with the co-crystal compared to the IBU-NIC co-crystal. These results suggest that NIR spectroscopy can be used as a PAT tool during a variety of pharmaceutical co-crystal manufacturing methods and the presented data will facilitate future offline and in-line NIR studies involving pharmaceutical co-crystals. PMID:27429366
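    As a minimal illustration of the figures of merit cited in this abstract, RMSEC and RMSEP are the same root-mean-square error computed on the calibration set and on an independent prediction (validation) set respectively. The concentration values below are invented for the sketch, not data from the study:

    ```python
    # Sketch of RMSEC/RMSEP computation for a calibration model.
    # Reference vs. predicted concentrations are hypothetical.
    import math

    def rmse(reference, predicted):
        """Root mean square error between reference and predicted values."""
        n = len(reference)
        return math.sqrt(sum((r - p) ** 2 for r, p in zip(reference, predicted)) / n)

    # Hypothetical co-crystal concentrations (% w/w): reference vs. model prediction.
    cal_ref  = [10.0, 20.0, 30.0, 40.0, 50.0]   # calibration set
    cal_pred = [10.4, 19.5, 30.2, 39.8, 50.6]
    val_ref  = [15.0, 25.0, 35.0, 45.0]          # independent prediction set
    val_pred = [15.9, 24.1, 36.0, 44.0]

    rmsec = rmse(cal_ref, cal_pred)   # error on the calibration set
    rmsep = rmse(val_ref, val_pred)   # error on the prediction set
    ```

    In a well-behaved model RMSEP is somewhat larger than RMSEC, as in this toy data; a large gap between the two usually signals overfitting of the calibration set.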

  15. Requirements for Product Development Self-Assessment Tools

    OpenAIRE

    Knoblinger, Christoph; Oehmen, Josef; Rebentisch, Eric; Seering, Warren; Helten, Katharina

    2011-01-01

    The successful execution of complex PD projects still poses major challenges for companies. One approach companies can use to improve their performance is self-assessment tools to optimize their organization and processes. This paper investigates the requirements regarding self-assessment tools for PD organizations. It summarizes the current literature on PD-related self-assessment tools and derives tool requirements from an industry focus group (US aerospace and defense industry) as well as ...

  16. Proceedings of the national seminar on recent developments in electro analytical techniques: souvenir and abstracts

    International Nuclear Information System (INIS)

    The deliberations on 'Recent Developments in Electro Analytical Techniques', with special emphasis on batteries, fuel cells, biosensors, chemical sensors, modified electrodes, nano electrodes, electrode synthesis and co-ordination compounds, are expected to go a long way in creating the necessary awareness and enthusiasm amongst students, young scholars and industrialists and to draw their attention to the subject. Papers relevant to INIS are indexed separately

  17. Role of roentgenospectral analysis in development of analytical control and technology of quantitative metallurgy

    International Nuclear Information System (INIS)

    The application of the RK-5975 X-ray spectrometer to determine the amounts of Ti, V, Cr, Co, Ni, Cu, Nb, Mo and W in steels is considered. The contents of the listed elements lie in the range 0.1-100 %. The duration of a single sample analysis does not exceed 5 min. The role of the X-ray spectral method of analysing the chemical composition of metal in developing electric steelmaking technology and in raising the level of analytical control is discussed

  18. Development of analytical techniques for the characterization of natural and anthropogenic compounds in fine particulate matter

    OpenAIRE

    Piazzalunga,

    2007-01-01

    Aerosol is of central importance for atmospheric chemistry and physics, for the biosphere, the climate and public health. The primary parameters that determine the environmental and health effects of aerosol particles are their concentration and chemical composition. In this work we developed analytical techniques to study particulate matter composition. Knowledge of PM composition can be useful to identify the main PM sources, the health risk and the formation or depositio...

  19. Development of Air Sampling and Analytical Methods for Acetoin, Diacetyl, and 2,3-Pentanedione

    OpenAIRE

    Takaku-Pugh, Sayaka

    2012-01-01

    Acetoin, diacetyl, and 2,3-pentanedione are artificial butter flavoring ingredients. Occupational exposures to diacetyl are associated with severe respiratory disease including bronchiolitis obliterans. Dynamic and passive air sampling and analytical methods were developed for simultaneous sampling of these ketones using 10 % (w/w) O-(2,3,4,5,6-pentafluorobenzyl)hydroxylamine hydrochloride (PFBHA) on Tenax TA (80/100 mesh). PFBHA O-oximes of the ketones were synthesized with above 95 % pur...

  20. Analytical model development of an eddy-current-based non-contacting steel plate conveyance system

    International Nuclear Information System (INIS)

    A concise model for analyzing and predicting the quasi-static electromagnetic characteristics of an eddy-current-based non-contacting steel plate conveyance system has been developed. The adequacy of the analytical model is confirmed by three-dimensional (3-D) finite element analysis (FEA). Such an effective approach, which potential industrial users can conveniently apply for preliminary evaluations of system operational performance, will be essential for designers and on-site engineers

  1. DEVELOPMENT OF ANALYTICAL METHODS IN METABOLOMICS FOR THE STUDY OF HEREDITARY AND ACQUIRED GENETIC DISEASE

    OpenAIRE

    Arvonio, Raffaele

    2011-01-01

    METABOLOMICS AND MASS SPECTROMETRY. The research project takes place in the field of metabolomics, the systematic study of the metabolites present in a cell. In this area MS, thanks to its ability to carry out controlled fragmentation experiments, plays a key role in the identification of various metabolites. The thesis work focuses on the development of analytical methods for the diagnosis of metabolic diseases and is divided as follows: ...

  2. The Development of Bio-Analytical Techniques for the Treatment of Psoriasis and Related Skin Disorders.

    OpenAIRE

    Hollywood, Katherine

    2010-01-01

    Abstract. The University of Manchester. Katherine Anne Hollywood, June 2010. Degree of Doctor of Philosophy in the Faculty of Engineering and Physical Sciences. The Development of Bio-Analytical Techniques for the Treatment of Psoriasis and Related Skin Disorders. In this investigation a number of post-genomic technologies have been applied to study the dermatological disorders of psoriasis and keloid disease. In spite of considerable research focus on these diseases the pathogenesis remains unclear an...

  3. Programming of development of commune with utilization of analytic hierarchic process

    Directory of Open Access Journals (Sweden)

    Aleksandra Łuczak

    2010-01-01

    Full Text Available The paper presents a trial application of the analytic hierarchy process to work out development scenarios for a rural commune of Wielkopolska province. The proposed method consists in building a hierarchical scheme. The scheme covers the general goal, which is best ensured by multi-functional development of the administrative district; specific (basic) goals; and, within each goal, a package of activities, which formed the basis for working out scenarios of the development of the commune. This allowed the best scenario for the administrative district to be chosen.
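    The core step of the analytic hierarchy process used in this paper - deriving priority weights for goals from a matrix of pairwise comparisons - can be sketched as follows. The three-goal setup and the comparison values are hypothetical, not the commune data from the study:

    ```python
    # Sketch of AHP priority derivation: the principal eigenvector of a
    # pairwise comparison matrix gives the goal weights. Matrix values
    # below are illustrative Saaty 1-9 scale judgments, not from the paper.

    def ahp_priorities(matrix, iterations=100):
        """Approximate the principal eigenvector by power iteration and
        normalize it so the weights sum to 1."""
        n = len(matrix)
        w = [1.0 / n] * n
        for _ in range(iterations):
            w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
            total = sum(w_new)
            w = [x / total for x in w_new]
        return w

    def consistency_ratio(matrix, w):
        """Saaty's CR = CI / RI with CI = (lambda_max - n) / (n - 1);
        judgments are usually accepted when CR < 0.1."""
        n = len(matrix)
        aw = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        lambda_max = sum(aw[i] / w[i] for i in range(n)) / n
        ci = (lambda_max - n) / (n - 1)
        ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random indices
        return ci / ri

    # Hypothetical comparison of three goals (economic, social, environmental).
    A = [[1.0,   3.0, 5.0],
         [1/3.0, 1.0, 2.0],
         [1/5.0, 0.5, 1.0]]
    w = ahp_priorities(A)          # priority weights, highest first here
    cr = consistency_ratio(A, w)   # should be well below 0.1 for this matrix
    ```

    A full AHP study repeats this weighting at each level of the hierarchy (goals, then activity packages within each goal) and aggregates the weights down the tree to rank the scenarios.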

  4. Development of in-situ visualization tool for PIC simulation

    International Nuclear Information System (INIS)

    As the capability of supercomputers improves, the sizes of simulations and their output data become larger and larger. Visualization is usually carried out on a researcher's PC with interactive visualization software after the computer simulation has been performed. However, the data size is becoming too large for this approach. A promising answer is in-situ visualization: the simulation code is coupled with the visualization code and visualization is performed alongside the simulation on the same supercomputer. We developed an in-situ visualization tool for particle-in-cell (PIC) simulation, provided as a Fortran module. We coupled it with a PIC simulation code, tested the coupled code on the Plasma Simulator supercomputer, and ensured that it works. (author)

  5. Developing a Grid-based search and categorization tool

    CERN Document Server

    Haya, Glenn; Vigen, Jens

    2003-01-01

    Grid technology has the potential to improve the accessibility of digital libraries. The participants in Project GRACE (Grid Search And Categorization Engine) are in the process of developing a search engine that will allow users to search through heterogeneous resources stored in geographically distributed digital collections. What differentiates this project from current search tools is that GRACE will be run on the European Data Grid, a large distributed network, and will not have a single centralized index as current web search engines do. In some cases, the distributed approach offers advantages over the centralized approach since it is more scalable, can be used on otherwise inaccessible material, and can provide advanced search options customized for each data source.

  6. Developing an integration tool for soil contamination assessment

    Science.gov (United States)

    Anaya-Romero, Maria; Zingg, Felix; Pérez-Álvarez, José Miguel; Madejón, Paula; Kotb Abd-Elmabod, Sameh

    2015-04-01

    In the last decades, huge soil areas have been negatively influenced or altered in multiple forms. Soils and, consequently, groundwater have been contaminated by the accumulation of contaminants from agricultural activities (fertilizers and pesticides), industrial activities (harmful material dumping, sludge, fly ash) and urban activities (hydrocarbons, metals from vehicle traffic, urban waste dumping). In the framework of the RECARE project, local partners across Europe are focusing on a wide range of soil threats, such as soil contamination, and aiming to develop effective prevention, remediation and restoration measures by designing and applying targeted land management strategies (van Lynden et al., 2013). In this context, the Guadiamar Green Corridor (Southern Spain) was used as a case study, aiming to obtain soil data and new information in order to assess soil contamination. The main threat in the Guadiamar valley is soil contamination after a mine spill that occurred in April 1998. About four hm3 of acid water and two hm3 of mud, rich in heavy metals, were released into the Agrio and Guadiamar rivers, affecting more than 4,600 ha of agricultural and pasture land. The main trace elements contaminating soil and water were As, Cd, Cu, Pb, Tl and Zn. The objective of the present research is to develop informatics tools that integrate a soil database, models and interactive platforms for soil contamination assessment. Preliminary results concern the compilation of harmonized databases including geographical, hydro-meteorological, soil and socio-economic variables based on spatial analysis and stakeholder consultation. Further research will address modelling and upscaling at the European level, in order to obtain a scientific-technical predictive tool for the assessment of soil contamination.

  7. Development of Wet-Etching Tools for Precision Optical Figuring

    International Nuclear Information System (INIS)

    This FY03 final report on Wet Etch Figuring describes a 2D thermal tool whose purpose is to flatten thin (0.3 to 1 mm thick) sheets of glass faster, and thus more cheaply, than conventional sub-aperture tools. An array of resistors on a circuit board was used to heat acid over the glass Optical Path Difference (OPD) thick spots, and at times this heating extended over most of the glass aperture. Where the acid is heated on the glass it dissolves faster. A self-referencing interferometer measured the glass thickness, its design taking advantage of the parallel nature and thinness of these glass sheets. This measurement is used in closed-loop control of the heating patterns of the circuit board and thus of the glass and acid. Only the glass and acid were to be moved, to make the tool logistically simple to use in mass production. A set of four circuit boards covering an 80 x 80-cm aperture was ordered, but only one 40 x 40-cm board was assembled and tested for this report. The interferometer measurement of glass OPD was slower than needed on some glass profiles. Sometimes the interference fringes were too fine to resolve, which would alias the sign of the glass thickness profile. This also caused the phase unwrapping code (FLYNN) to struggle and run slowly, at times taking hours for a 10-inch-square area. We did extensive work to improve the speed of this code, tried many different phase unwrapping codes, and eventually ran FLYNN on a farm of networked computers. Most of the work reported here is therefore limited to a 10-inch-square aperture. We researched fabricating a better interferometer lens from Plexiglas so as to reduce the scattered-light issues from the near-field scattering patterns of Fresnel lens grooves, which set the Nyquist limit. There was also a problem with the initial concept of wetting the 1737 glass on its bottom side with acid: the wetted 1737 glass developed an achromatic AR coating, spoiling the reflection needed to see glass thickness interference fringes.
In response

  8. Development of Wet-Etching Tools for Precision Optical Figuring

    Energy Technology Data Exchange (ETDEWEB)

    Rushford, M C; Dixit, S N; Hyde, R; Britten, J A; Nissen, J; Aasen, M; Toeppen, J; Hoaglan, C; Nelson, C; Summers, L; Thomas, I

    2004-01-27

    This FY03 final report on Wet Etch Figuring describes a 2D thermal tool whose purpose is to flatten thin (0.3 to 1 mm thick) sheets of glass faster, and thus more cheaply, than conventional sub-aperture tools. An array of resistors on a circuit board was used to heat acid over the glass Optical Path Difference (OPD) thick spots, and at times this heating extended over most of the glass aperture. Where the acid is heated on the glass it dissolves faster. A self-referencing interferometer measured the glass thickness, its design taking advantage of the parallel nature and thinness of these glass sheets. This measurement is used in closed-loop control of the heating patterns of the circuit board and thus of the glass and acid. Only the glass and acid were to be moved, to make the tool logistically simple to use in mass production. A set of four circuit boards covering an 80 x 80-cm aperture was ordered, but only one 40 x 40-cm board was assembled and tested for this report. The interferometer measurement of glass OPD was slower than needed on some glass profiles. Sometimes the interference fringes were too fine to resolve, which would alias the sign of the glass thickness profile. This also caused the phase unwrapping code (FLYNN) to struggle and run slowly, at times taking hours for a 10-inch-square area. We did extensive work to improve the speed of this code, tried many different phase unwrapping codes, and eventually ran FLYNN on a farm of networked computers. Most of the work reported here is therefore limited to a 10-inch-square aperture. We researched fabricating a better interferometer lens from Plexiglas so as to reduce the scattered-light issues from the near-field scattering patterns of Fresnel lens grooves, which set the Nyquist limit. There was also a problem with the initial concept of wetting the 1737 glass on its bottom side with acid: the wetted 1737 glass developed an achromatic AR coating, spoiling the reflection needed to see glass thickness interference fringes.
In response

  9. Development of AN All-Purpose Free Photogrammetric Tool

    Science.gov (United States)

    González-Aguilera, D.; López-Fernández, L.; Rodriguez-Gonzalvez, P.; Guerrero, D.; Hernandez-Lopez, D.; Remondino, F.; Menna, F.; Nocerino, E.; Toschi, I.; Ballabeni, A.; Gaiani, M.

    2016-06-01

    Photogrammetry is currently facing challenges and changes mainly related to automation, ubiquitous processing and the variety of applications. Within an ISPRS Scientific Initiative, a team of researchers from USAL, UCLM, FBK and UNIBO has developed an open photogrammetric tool, called GRAPHOS (inteGRAted PHOtogrammetric Suite). GRAPHOS allows dense and metric 3D point clouds to be obtained from terrestrial and UAV images. It encloses robust photogrammetric and computer vision algorithms with the following aims: (i) increase automation, providing dense 3D point clouds through a friendly and easy-to-use interface; (ii) increase flexibility, working with any type of images, scenarios and cameras; (iii) improve quality, guaranteeing high accuracy and resolution; (iv) preserve photogrammetric reliability and repeatability. Last but not least, GRAPHOS also has an educational component, reinforced with didactical explanations about the algorithms and their performance. The developments were carried out at different levels: GUI realization, image pre-processing, photogrammetric processing with weight parameters, dataset creation and system evaluation. The paper will present in detail the developments of GRAPHOS with all its photogrammetric components and the evaluation analyses based on various image datasets. GRAPHOS is distributed free of charge for research and educational needs.

  10. Facilities as teaching tools: A transformative participatory professional development experience

    Science.gov (United States)

    Wilson, Eric A.

    Resource consumption continues to increase as the population grows. In order to secure a sustainable future, society must educate the next generation to become "sustainability natives." Schools play a pivotal role in educating a sustainability-literate society. However, a disconnect exists between the hidden curriculum of the built environment and the enacted curriculum. This study employs a transformative participatory professional development model to instruct teachers on how to use their school grounds as teaching tools for the purpose of helping students make explicit choices in energy consumption, materials use, and sustainable living. Incorporating a phenomenological perspective, this study considers the lived experience of two sustainability coordinators. Grounded theory provides an interpretational context for the participants' interactions with each other and the professional development process. Through a year-long professional development experience - commencing with an intense, participatory two-day workshop - the participants discussed challenges they faced with integrating facilities into school curriculum and institutionalizing a culture of sustainability. Two major needs were identified in this study. For successful sustainability initiatives, a hybrid model that melds top-down and bottom-up approaches offers the requisite mix of administrative support, ground-level buy-in, and excitement vis-à-vis sustainability. Second, related to this hybrid approach, K-12 sustainability coordinators ideally need administrative capabilities with access to decision making, while remaining connected to students in a meaningful way, either directly in the classroom, as a mentor, or through work with student groups and projects.

  11. The development of a two-component force dynamometer and tool control system for dynamic machine tool research

    Science.gov (United States)

    Sutherland, I. A.

    1973-01-01

    The development of a tooling system is presented that produces a controlled sinusoidal oscillation simulating a dynamic chip-removal condition. The system also measures the machining forces in two mutually perpendicular directions without any cross-sensitivity.

  12. Medicaid Analytic eXtract (MAX) Chartbooks

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Analytic eXtract Chartbooks are research tools and reference guides on Medicaid enrollees and their Medicaid experience in 2002 and 2004. Developed for...

  13. Development and implementation of information systems for the DOE's National Analytical Management Program (NAMP)

    International Nuclear Information System (INIS)

    The Department of Energy (DOE) faces a challenging environmental management effort, including environmental protection, environmental restoration, waste management, and decommissioning. This effort requires extensive sampling and analysis to determine the type and level of contamination and the appropriate technology for cleanup, and to verify compliance with environmental regulations. Data obtained from these sampling and analysis activities are used to support environmental management decisions. Confidence in the data is critical, having legal, regulatory, and therefore, economic impact. To promote quality in the planning, management, and performance of these sampling and analysis operations, DOE's Office of Environmental Management (EM) has established the National Analytical Management Program (NAMP). With a focus on reducing the estimated costs of over $200M per year for EM's analytical services, NAMP has been charged with developing products that will decrease the costs for DOE complex-wide environmental management while maintaining quality in all aspects of the analytical data generation. As part of this thrust to streamline operations, NAMP is developing centralized information systems that will allow DOE complex personnel to share information about EM contacts at the various sites, pertinent methodologies for environmental restoration and waste management, costs of analyses, and performance of contracted laboratories

  14. Laser metrology — a diagnostic tool in automotive development processes

    Science.gov (United States)

    Beeck, Manfred-Andreas; Hentschel, Werner

    2000-08-01

    Laser measurement techniques are widely used in automotive development processes. Applications at Volkswagen are presented where laser metrology works as a diagnostic tool for analysing and optimising complex coupled processes inside and between automotive components and structures, such as the reduction of a vehicle's interior or exterior acoustic noise, including brake noise, and combustion analysis for diesel and gasoline engines to further reduce fuel consumption and pollution. Pulsed electronic speckle pattern interferometry (ESPI) and holographic interferometry are used for analysing the knocking behaviour of modern engines and for correct positioning of knocking sensors. Holographic interferometry shows up the vibrational behaviour of brake components and their interaction during braking, and allows optimisation for noise-free brake systems. Scanning laser vibrometry analyses structure-borne noise of a whole car body for the optimisation of its interior acoustical behaviour. Modern engine combustion concepts, such as in direct-injection (DI) gasoline and diesel engines, benefit from laser diagnostic tools which permit deeper insight into in-cylinder processes such as flow generation, fuel injection and spray formation, atomisation and mixing, ignition and combustion, and formation and reduction of pollutants. The necessary optical access inside a cylinder is realised by so-called 'transparent engines' allowing measurements during nearly the whole engine cycle. Measurement techniques and results are presented on double-pulse particle image velocimetry (PIV) with a frequency-doubled YAG laser for in-cylinder flow analysis, as well as Mie scattering on droplets using a copper vapour laser combined with high-speed filming, and laser-induced fluorescence (LIF) with an excimer laser for spray and fuel vapour analysis.

  15. Developing Anticipatory Life Cycle Assessment Tools to Support Responsible Innovation

    Science.gov (United States)

    Wender, Benjamin

    Several prominent research strategy organizations recommend applying life cycle assessment (LCA) early in the development of emerging technologies. For example, the US Environmental Protection Agency, the National Research Council, the Department of Energy, and the National Nanotechnology Initiative identify the potential for LCA to inform research and development (R&D) of photovoltaics and products containing engineered nanomaterials (ENMs). In this capacity, application of LCA to emerging technologies may contribute to the growing movement for responsible research and innovation (RRI). However, existing LCA practices are largely retrospective and ill-suited to supporting the objectives of RRI. For example, barriers related to data availability, rapid technology change, and the isolation of environmental from technical research inhibit application of LCA to developing technologies. This dissertation focuses on the development of anticipatory LCA tools that incorporate elements of technology forecasting, provide robust explorations of uncertainty, and engage diverse innovation actors in overcoming retrospective approaches to environmental assessment and improvement of emerging technologies. Chapter one contextualizes current LCA practices within the growing literature articulating RRI and identifies the optimal place in the stage-gate innovation model to apply LCA. Chapter one concludes with a call to develop anticipatory LCA, building on the theory of anticipatory governance, as a series of methodological improvements that seek to align LCA practices with the objectives of RRI. Chapter two provides a framework for anticipatory LCA, identifies where research from multiple disciplines informs LCA practice, and builds off the recommendations presented in the preceding chapter. 
Chapter two focuses on crystalline and thin film photovoltaics (PV) to illustrate the novel framework, in part because PV is an environmentally motivated technology undergoing extensive R&D efforts and

  16. Development and assessment of the Alberta Context Tool

    Directory of Open Access Journals (Sweden)

    Birdsell Judy M

    2009-12-01

    Full Text Available Abstract Background The context of healthcare organizations such as hospitals is increasingly accepted as having the potential to influence the use of new knowledge. However, the mechanisms by which the organizational context influences evidence-based practices are not well understood. Current measures of organizational context lack a theory-informed approach, lack construct clarity, and generally have modest psychometric properties. This paper presents the development and initial psychometric validation of the Alberta Context Tool (ACT), an eight-dimension measure of organizational context for healthcare settings. Methods Three principles guided the development of the ACT: substantive theory, brevity, and modifiability. The Promoting Action on Research Implementation in Health Services (PARiHS) framework and related literature were used to guide selection of items in the ACT. The ACT was required to be brief enough to be tolerated in busy and resource-stretched work settings and to assess concepts of organizational context that were potentially modifiable. The English version of the ACT was completed by 764 nurses (752 valid responses) working in seven Canadian pediatric care hospitals as part of its initial validation. Cronbach's alpha, exploratory factor analysis, analysis of variance, and tests of association were used to assess instrument reliability and validity. Results Factor analysis indicated a 13-factor solution (accounting for 59.26% of the variance in 'organizational context'). The composition of the factors was similar to those originally conceptualized. Cronbach's alpha for the 13 factors ranged from .54 to .91, with 4 factors performing below the commonly accepted alpha cut-off of .70. Bivariate associations between instrumental research utilization levels (which the ACT was developed to predict) and the ACT's 13 factors were statistically significant at the 5% level for 12 of the 13 factors. Each factor also showed a trend of
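The reliability statistic reported here, Cronbach's alpha, can be sketched in a few lines of NumPy; the response matrix below is invented for illustration and has no connection to the ACT data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 6 respondents x 4 Likert-type items
scores = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 2, 3, 3],
])
alpha = cronbach_alpha(scores)
```

Values below the conventional .70 cut-off, as four of the ACT factors showed, indicate weak internal consistency of the item set.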

  17. Software Development Of XML Parser Based On Algebraic Tools

    Science.gov (United States)

    Georgiev, Bozhidar; Georgieva, Adriana

    2011-12-01

    This paper presents the development and implementation of an algebraic method for XML data processing that accelerates the parsing process. The nontraditional approach proposed here for fast XML navigation with algebraic tools contributes to ongoing efforts toward an easier, user-friendly API for XML transformations. The proposed software for processing XML documents (a parser) is easy to use and can manage files with a strictly defined data structure. The purpose of the presented algorithm is to offer a new approach for searching and restructuring hierarchical XML data. This approach permits fast processing of XML documents, using an algebraic model developed in detail in previous works by the same authors. The proposed parsing mechanism is thus easily accessible to the web consumer, who can control XML file processing: search for different elements (tags), delete them, and add new XML content. The various tests presented show higher speed and lower resource consumption in comparison with some existing commercial parsers.
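The search, delete, and add operations the parser is said to expose can be illustrated generically with Python's standard-library ElementTree; this is a plain sketch of those operations on a toy document, not the authors' algebraic implementation.

```python
import xml.etree.ElementTree as ET

xml_doc = ("<catalog><book id='b1'><title>XML Basics</title></book>"
           "<book id='b2'><title>Algebra</title></book></catalog>")
root = ET.fromstring(xml_doc)

# Search: collect the text of every <title> element in the tree
titles = [t.text for t in root.iter("title")]

# Delete: remove the book whose id attribute is 'b2'
for book in root.findall("book"):
    if book.get("id") == "b2":
        root.remove(book)

# Add: append a new <book> element with a nested <title>
new_book = ET.SubElement(root, "book", id="b3")
ET.SubElement(new_book, "title").text = "Parsing"

result = ET.tostring(root, encoding="unicode")
```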

  18. Analytical Method Development and Validation of Related Substance Method for Bortezomib for Injection 3.5 mg/Vial by RP-HPLC Method

    Directory of Open Access Journals (Sweden)

    Utage M

    2013-04-01

    Full Text Available An accurate, precise, simple, and economical high-performance liquid chromatographic method for the related-substance determination of Bortezomib in its lyophilized dosage form has been developed. The method is a reverse-phase HPLC method using a Hypersil BDS C18 column (length: 150 mm, diameter: 4.6 mm, particle size: 5 μm) with a gradient programme, using acetonitrile, water, and formic acid in the ratio 30:70:0.1 (v/v/v) as mobile phase A and acetonitrile, water, and formic acid in the ratio 80:20:0.1 (v/v/v) as mobile phase B. The method was validated in compliance with regulatory guidelines, covering the analytical method validation parameters linearity, accuracy, method precision, specificity with forced degradation, system suitability, robustness, LOD, LOQ, and ruggedness. The results obtained were well within the acceptance criteria.
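Among the validation parameters listed, LOD and LOQ are commonly estimated from calibration-curve statistics as 3.3σ/S and 10σ/S (the ICH Q2 convention, with σ the residual standard deviation and S the slope); a sketch with invented calibration data, not the paper's measurements:

```python
import numpy as np

# Hypothetical calibration data: concentration (ug/mL) vs. peak area
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
area = np.array([52.0, 101.0, 205.0, 398.0, 810.0])

slope, intercept = np.polyfit(conc, area, 1)   # linear calibration fit
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                  # residual standard deviation

lod = 3.3 * sigma / slope    # limit of detection
loq = 10.0 * sigma / slope   # limit of quantitation
```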

  19. Development of Social Networks and Tools in TeliaSonera

    OpenAIRE

    Chandra, Shweta

    2010-01-01

    As global enterprises stretch beyond geographical boundaries and organization chart limits, collaborative glue is needed to stick everything together. With many powerful collaborative tools such as video and web conferencing, wikis, blogs and various other web 2.0 tools, the need to collaborate is clear and many companies are now investing in enterprise collaboration solutions. This study aims to find ways for the case team at TeliaSonera to use these tools and networks more effectively....

  20. Knowledge based process development of bobbin tool friction stir welding

    OpenAIRE

    Hilgert, Jakob

    2012-01-01

    Over the last twenty years Friction Stir Welding (FSW) has proven to be a very promising new joining technique. Especially high strength aluminium alloys can be welded with large advantages as compared to conventional fusion welding processes. For some joint configurations and desired applications bobbin tool welding is a process variant that can circumvent limitations arising from the high process forces in conventional tool FSW. As bobbin tools are highly mechanically loaded, in-depth under...

  1. RCC-MRx appendix A16 methodology for the analytical J calculation under thermal and combined thermal plus mechanical loadings for pipes and elbows and related assessment tool MJSAM

    International Nuclear Information System (INIS)

    The RCC-MRx code provides flaw assessment methodologies and related tools for cracked Nuclear Power Plant components. In particular, an important effort has been made to develop a large set of compendia for the calculation of the parameter J for various components (plates, pipes, elbows, ...) and various defect geometries. In the frame of collaborations with IRSN, CEA also developed a methodology for analytical J calculation for cracked pipes and elbows submitted to thermal and combined mechanical and thermal loadings. This paper first presents the development of this methodology and an overview of the validation strategy, based on reference 2D and 3D F.E. calculations. The second part of the paper presents the latest version of the MJSAM tool, which is based on the 2010 version of appendix A16 of the RCC-MRx code. All compendia (for KI, J, and C* calculation) and all defect assessment procedures have been implemented in the tool: it covers crack initiation and propagation under fatigue, creep, creep-fatigue, and ductile tearing situations. Sensitivity and probabilistic analyses can also be performed with this tool, which is directly linked to Microsoft Excel software for the exploitation of results. (authors)

  2. Analytical Quality by Design Approach in RP-HPLC Method Development for the Assay of Etofenamate in Dosage Forms

    OpenAIRE

    Peraman, R.; Bhadraya, K.; Reddy, Y. Padmanabha; Reddy, C. Surayaprakash; Lokesh, T.

    2015-01-01

    By considering the current regulatory requirement for an analytical method development, a reversed phase high performance liquid chromatographic method for routine analysis of etofenamate in dosage form has been optimized using analytical quality by design approach. Unlike routine approach, the present study was initiated with understanding of quality target product profile, analytical target profile and risk assessment for method variables that affect the method response. A liquid chromatogr...

  3. The Bristol Radiology Report Assessment Tool (BRRAT): Developing a workplace-based assessment tool for radiology reporting skills

    International Nuclear Information System (INIS)

    Aim: To review the development of a workplace-based assessment tool for assessing the quality of written radiology reports, and to assess its reliability, feasibility, and validity. Materials and methods: A comprehensive literature review and a rigorous Delphi study enabled the development of the Bristol Radiology Report Assessment Tool (BRRAT), which consists of 19 questions and a global assessment score. Three assessors applied the assessment tool to 240 radiology reports provided by 24 radiology trainees. Results: The reliability coefficient for the 19 questions was 0.79, and the equivalent coefficient for the global assessment scores was 0.67. Generalizability coefficients demonstrate that higher numbers of assessors and assessments are needed to reach acceptable levels of reliability for summative assessments, owing to assessor subjectivity. Conclusion: The study methodology gives good validity and a strong foundation in best practice. The assessment tool developed for radiology reporting is reliable and most suited to formative assessments

  4. A Participatory Approach to Develop the Power Mobility Screening Tool and the Power Mobility Clinical Driving Assessment Tool

    Directory of Open Access Journals (Sweden)

    Deepan C. Kamaraj

    2014-01-01

    Full Text Available The electric powered wheelchair (EPW) is an indispensable assistive device that increases participation among individuals with disabilities. However, due to the lack of standardized assessment tools, developing evidence-based training protocols for EPW users to improve driving skills has been a challenge. In this study, we adopt the principles of participatory research and employ qualitative methods to develop the Power Mobility Screening Tool (PMST) and the Power Mobility Clinical Driving Assessment (PMCDA). Qualitative data from professional experts and expert EPW users who participated in a focus group and a discussion forum were used to establish content validity of the PMCDA and the PMST. These tools collectively could assess a user's current level of bodily function and their current EPW driving capacity. Further multicenter studies are necessary to evaluate the psychometric properties of these tests and develop EPW driving training protocols based on these assessment tools.

  5. Evaluation of Four Different Analytical Tools to Determine the Regional Origin of Gastrodia elata and Rehmannia glutinosa on the Basis of Metabolomics Study

    Directory of Open Access Journals (Sweden)

    Dong-Kyu Lee

    2014-05-01

    Full Text Available Chemical profiles of medicinal plants can differ depending on the cultivation environment, which may influence their therapeutic efficacy. Accordingly, the regional origin of medicinal plants should be authenticated for correct evaluation of their medicinal and market values. Metabolomics has been found very useful for discriminating the origin of many plants. Choosing an adequate analytical tool is an essential step, because different chemical profiles with different detection ranges will be produced depending on the choice. In this study, four analytical tools, Fourier transform near-infrared spectroscopy (FT-NIR), 1H-nuclear magnetic resonance spectroscopy (1H-NMR), liquid chromatography-mass spectrometry (LC-MS), and gas chromatography-mass spectrometry (GC-MS), were applied in parallel to the same samples of two popular medicinal plants (Gastrodia elata and Rehmannia glutinosa) cultivated either in Korea or China. The classification abilities of the four discriminant models for each plant were evaluated based on the misclassification rate and Q2 obtained from principal component analysis (PCA) and orthogonal projection to latent structures-discriminant analysis (OPLS-DA), respectively. 1H-NMR and LC-MS, which were the best techniques for G. elata and R. glutinosa, respectively, were generally preferable for origin discrimination over the others. Integrating all the results, 1H-NMR is the most prominent technique for discriminating the origins of the two plants. Nonetheless, this study suggests that preliminary screening is essential to determine the most suitable analytical tool and statistical method, which will ensure the dependability of metabolomics-based discrimination.
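A misclassification rate over PCA scores, one of the two evaluation metrics used above, can be sketched with NumPy; the data here are synthetic two-origin "profiles," and a nearest-centroid classifier stands in for the paper's OPLS-DA models.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic profiles: two origins, 10 samples each, 50 features, mean-shifted
korea = rng.normal(0.0, 1.0, (10, 50))
china = rng.normal(1.5, 1.0, (10, 50))
X = np.vstack([korea, china])
y = np.array([0] * 10 + [1] * 10)

# PCA via SVD of the mean-centred matrix; keep the first 2 components
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T

# Nearest-centroid classification in PCA score space
centroids = np.array([scores[y == g].mean(axis=0) for g in (0, 1)])
pred = np.array([np.argmin(((s - centroids) ** 2).sum(axis=1)) for s in scores])
misclassification_rate = (pred != y).mean()
```

With a group separation this large the rate is essentially zero; real metabolomic profiles overlap far more, which is why the choice of analytical platform matters.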

  6. CRMS vegetation analytical team framework: Methods for collection, development, and use of vegetation response variables

    Science.gov (United States)

    Cretini, Kari F.; Visser, Jenneke M.; Krauss, Ken W.; Steyer, Gregory D.

    2011-01-01

    This document identifies the main objectives of the Coastwide Reference Monitoring System (CRMS) vegetation analytical team, which are to provide (1) collection and development methods for vegetation response variables and (2) the ways in which these response variables will be used to evaluate restoration project effectiveness. The vegetation parameters (that is, response variables) collected in CRMS and other coastal restoration projects funded under the Coastal Wetlands Planning, Protection and Restoration Act (CWPPRA) are identified, and the field collection methods for these parameters are summarized. Existing knowledge on community and plant responses to changes in environmental drivers (for example, flooding and salinity) from published literature and from the CRMS and CWPPRA monitoring dataset are used to develop a suite of indices to assess wetland condition in coastal Louisiana. Two indices, the floristic quality index (FQI) and a productivity index, are described for herbaceous and forested vegetation. The FQI for herbaceous vegetation is tested with a long-term dataset from a CWPPRA marsh creation project. Example graphics for this index are provided and discussed. The other indices, an FQI for forest vegetation (that is, trees and shrubs) and productivity indices for herbaceous and forest vegetation, are proposed but not tested. New response variables may be added or current response variables removed as data become available and as our understanding of restoration success indicators develops. Once the indices are fully developed, each will be used by the vegetation analytical team to assess and evaluate CRMS/CWPPRA project and program effectiveness. The vegetation analytical team plans to summarize its results in the form of written reports and/or graphics and present these items to CRMS Federal and State sponsors, restoration project managers, landowners, and other data users for their input.
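For the floristic quality index mentioned above, one widely used formulation (which may differ in detail from the CRMS variant) multiplies the mean coefficient of conservatism of the species present by the square root of species richness; the species coefficients below are invented:

```python
import math

# Hypothetical coefficients of conservatism (0-10 scale) for species in one plot
c_values = [2, 5, 7, 4, 6, 3]

mean_c = sum(c_values) / len(c_values)   # mean conservatism, C-bar
fqi = mean_c * math.sqrt(len(c_values))  # FQI = C-bar * sqrt(species richness N)
```

Higher FQI values indicate a community weighted toward conservative species, i.e., those intolerant of disturbance.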

  7. Analytical development and optimization of a graphene-solution interface capacitance model.

    Science.gov (United States)

    Karimi, Hediyeh; Rahmani, Rasoul; Mashayekhi, Reza; Ranjbari, Leyla; Shirdel, Amir H; Haghighian, Niloofar; Movahedi, Parisa; Hadiyan, Moein; Ismail, Razali

    2014-01-01

    Graphene, a new carbon material that shows great potential for a range of applications because of its exceptional electronic and mechanical properties, has attracted considerable attention in recent years. The use of graphene in nanoscale devices plays an important role in achieving more accurate and faster devices. Although there are many experimental studies in this area, analytical models are lacking. Our focus is quantum capacitance, one of the important properties of field-effect transistors (FETs). The quantum capacitance of electrolyte-gated transistors (EGFETs), along with a relevant equivalent circuit, is expressed in terms of Fermi velocity, carrier density, and fundamental physical quantities. The analytical model is compared with experimental data, and the mean absolute percentage error (MAPE) is calculated to be 11.82. To decrease the error, a new function of E composed of α and β parameters is suggested. In another attempt, the ant colony optimization (ACO) algorithm is implemented to optimize and develop the analytical model into a more accurate capacitance model. Based on the given results, the accuracy of the optimized model is more than 97%, which is within an acceptable range. PMID:24991496
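The error metric reported here, MAPE, is the mean of the absolute relative errors between model and experiment, expressed as a percentage; a minimal sketch with invented capacitance values, not the paper's data:

```python
import numpy as np

def mape(actual: np.ndarray, predicted: np.ndarray) -> float:
    """Mean absolute percentage error, in percent."""
    return float(np.mean(np.abs((actual - predicted) / actual)) * 100.0)

# Hypothetical measured vs. modelled quantum capacitance values (uF/cm^2)
measured = np.array([2.0, 4.0, 5.0, 8.0])
modelled = np.array([2.2, 3.8, 5.5, 7.2])
err = mape(measured, modelled)
```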

  8. Geochemical fingerprinting: 40 years of analytical development and real world applications

    International Nuclear Information System (INIS)

    Geochemical fingerprinting is a rapidly expanding discipline in the earth and environmental sciences. It is anchored in the recognition that geological processes leave behind chemical and isotopic patterns in the rock record. Many of these patterns, informally referred to as geochemical fingerprints, differ only in fine detail from each other. For this reason, the approach of fingerprinting requires analytical data of very high precision and accuracy. It is not surprising that the advancement of geochemical fingerprinting occurred alongside progress in geochemical analysis techniques. In this brief treatment, a subjective selection of drivers behind the analytical progress and its implications for geochemical fingerprinting are discussed. These include the impact of the Apollo lunar sample return program on quality of geochemical data and its push towards minimizing required sample volumes. The advancement of in situ analytical techniques is also identified as a major factor that has enabled geochemical fingerprinting to expand into a larger variety of fields. For real world applications of geochemical fingerprinting, in which large sample throughput, reasonable cost, and fast turnaround are key requirements, the improvements to inductively-coupled-plasma quadrupole mass spectrometry were paramount. The past 40 years have witnessed how geochemical fingerprinting has found its way into everyday applications. This development is cause for celebrating the 40 years of existence of the IAGC.

  9. Analytical development and optimization of a graphene–solution interface capacitance model

    Directory of Open Access Journals (Sweden)

    Hediyeh Karimi

    2014-05-01

    Full Text Available Graphene, a new carbon material that shows great potential for a range of applications because of its exceptional electronic and mechanical properties, has attracted considerable attention in recent years. The use of graphene in nanoscale devices plays an important role in achieving more accurate and faster devices. Although there are many experimental studies in this area, analytical models are lacking. Our focus is quantum capacitance, one of the important properties of field-effect transistors (FETs). The quantum capacitance of electrolyte-gated transistors (EGFETs), along with a relevant equivalent circuit, is expressed in terms of Fermi velocity, carrier density, and fundamental physical quantities. The analytical model is compared with experimental data, and the mean absolute percentage error (MAPE) is calculated to be 11.82. To decrease the error, a new function of E composed of α and β parameters is suggested. In another attempt, the ant colony optimization (ACO) algorithm is implemented to optimize and develop the analytical model into a more accurate capacitance model. Based on the given results, the accuracy of the optimized model is more than 97%, which is within an acceptable range of accuracy.

  10. Integrating Fourth-Generation Tools Into the Applications Development Environment.

    Science.gov (United States)

    Litaker, R. G.; And Others

    1985-01-01

    Much of the power of the "information center" comes from its ability to effectively use fourth-generation productivity tools to provide information processing services. A case study of the use of these tools at Western Michigan University is presented. (Author/MLW)

  11. Research into cutting tool development and process of wear

    Czech Academy of Sciences Publication Activity Database

    Vašek, Jaroslav

    Gliwice: KOMAG, 2008, s. 169-183. ISBN 83-919228-5-5. [KOMTECH. International Scientific and Technical Conference/4./. Szczyrk (PL), 17.11.2003-19.11.2003] R&D Projects: GA AV ČR IAA3086201 Institutional research plan: CEZ:AV0Z3086906 Keywords : cutting tools * wear process Subject RIV: JQ - Machines ; Tools

  12. Development of micro machining tools for finishing weld joint

    International Nuclear Information System (INIS)

    GE, Hitachi, and Toshiba are jointly constructing advanced boiling water reactor (ABWR) Units 6 and 7 at Tokyo Electric Power Co.'s Kashiwazaki-Kariwa Nuclear Power Station. The ABWR features enhanced operability and safety as a whole plant through simplicity and improved performance. One of the key technical innovations adopted in the ABWR design to achieve these improvements is the use of ten reactor internal pumps (RIPs) as the reactor recirculation system. The RIP casing, which holds the RIP and, together with the reactor pressure vessel (RPV), constitutes the primary pressure boundary, is welded to a nozzle on the RPV lower shell with gas tungsten arc welding (GTAW). The weld is made on a V-groove using an automatic GTAW technique from the inside of the casing. The penetration bead (the back side of the weld) therefore needs to be finished with machining tools so that the weld can be inspected for qualification. This paper summarizes the development of the special-purpose micro machines that are installed inside the narrow gap between the RIP casing and the RPV (skirt) to finish the penetration bead. (author)

  13. Developing a Standardised Tool for Impact Assessment of INFLIBNET

    Directory of Open Access Journals (Sweden)

    S. Ramesh,

    2013-11-01

    Full Text Available This paper aims to develop a standardised tool to measure the impact of INFLIBNET services. The methodology involves validation of the impact assessment scale by establishing reliability and validity. Construct validity was established first, based on previous research findings on impact assessment. This was followed by content validity, obtained by asking the opinion of three experts, who rated the scale as highly valid subject to certain modifications. Test-retest reliability was then established using data from 21 respondents with an interval of 75 days; 'r' values ranged from 0.46 to 0.49 for the different sub-scales. Alpha coefficients were also calculated, with values ranging between 0.54 and 0.80. Thus, the scale was found to be stable and valid and ready for identifying the impact of INFLIBNET services. The paper involves an application of psychometrics in the field of library science, emphasising the importance of standardisation in the assessment of any new resource or service.
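The test-retest 'r' values quoted here are Pearson correlations between two administrations of the same scale; a minimal sketch with invented sub-scale scores, not the study's data:

```python
import numpy as np

# Hypothetical sub-scale scores from 8 respondents, at test and 75-day retest
test = np.array([12, 15, 9, 20, 14, 11, 17, 13], dtype=float)
retest = np.array([13, 14, 10, 18, 15, 10, 16, 14], dtype=float)

r = float(np.corrcoef(test, retest)[0, 1])  # Pearson test-retest reliability
```

Values near 1 indicate stable scores over the interval; the modest r of 0.46 to 0.49 reported above suggests only moderate temporal stability.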

  14. Visual analytics for aviation safety: A collaborative approach to sensemaking

    Science.gov (United States)

    Wade, Andrew

    Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TEs) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, Boeing, collaborative analysis.

  15. Rutherford backscattering spectrometry: a possible high precision analytical tool for quantifying material transfer rates between machine components in contact

    International Nuclear Information System (INIS)

    Rutherford backscattering (RBS) is an ion beam-based analytical technique commonly used for materials analysis. It offers several capabilities, among which are elemental identification, thickness measurement of layered structures, and stoichiometric determination. The principle and the specific features of the method are presented in this paper. The main aim of our experiments was to test the possibilities of RBS for material transfer determination. As an illustration, experimental spectra obtained on both Falex (steel-brass) and Timken (steel-aluminium) samples are presented. Advantages as well as limitations of RBS for this particular purpose are discussed. (author) 4 Figs., 7 Refs

  16. Development of Genetic Tools for the Manipulation of the Planctomycetes.

  17. Development of Genetic Tools for the Manipulation of the Planctomycetes

    Science.gov (United States)

    Rivas-Marín, Elena; Canosa, Inés; Santero, Eduardo; Devos, Damien P.

    2016-01-01

    Bacteria belonging to the Planctomycetes, Verrucomicrobia, Chlamydiae (PVC) superphylum are of interest for biotechnology, evolutionary cell biology, ecology, and human health. Some PVC species lack a number of typical bacterial features while others possess characteristics more usually associated with eukaryotes or archaea. For example, the Planctomycetes phylum is atypical for the absence of the FtsZ protein and for the presence of a developed endomembrane system. Studies of the cellular and molecular biology of these infrequent characteristics are currently limited due to the lack of genetic tools for most of the species. So far, genetic manipulation in Planctomycetes has been described in Planctopirus limnophila only. Here, we show a simple approach that allows mutagenesis by homologous recombination in three different planctomycetes species (i.e., Gemmata obscuriglobus, Gimesia maris, and Blastopirellula marina), in addition to P. limnophila, thus extending the repertoire of genetically modifiable organisms in this superphylum. Although the Planctomycetes show high resistance to most antibiotics, we have used kanamycin resistance genes in G. obscuriglobus, P. limnophila, and G. maris, and tetracycline resistance genes in B. marina, as markers for mutant selection. In all cases, plasmids were introduced in the strains by mating or electroporation, and the genetic modification was verified by Southern blotting analysis. In addition, we show that the green fluorescent protein (gfp) is expressed in all four backgrounds from an Escherichia coli promoter. The achievement of genetic manipulation in four phylogenetically diverse planctomycetes will enable molecular studies in these strains, and opens the door to developing genetic approaches not only in other planctomycetes but also in other species of the superphylum, such as the Lentisphaerae. PMID:27379046

  18. Big Data Analytics in Healthcare

    OpenAIRE

    Ashwin Belle; Raghuram Thiagarajan; S. M. Reza Soroushmehr; Fatemeh Navidi; Daniel A Beard; Kayvan Najarian

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is sti...

  19. An Approach to Building a Traceability Tool for Software Development

    Science.gov (United States)

    Delgado, Nelly; Watson, Tom

    1997-01-01

    specifications, design reports, and system code. Tracing helps 1) validate system features against the requirement specification, 2) identify error sources, and, most importantly, 3) manage change. With so many people involved in the development of the system, it becomes necessary to identify the reasons behind the design requirements or the implementation decisions. This paper is concerned with an approach that maps documents to constraints that capture properties of and relationships between the objects being modeled by the program. Section 2 provides the reader with a background on traceability tools. Section 3 gives a brief description of the context monitoring system on which the approach suggested in this paper is based. Section 4 presents an overview of our approach to providing traceability. The last section presents our future direction of research.

  20. Forming and actualization of cognitive motives as means for development of students' analytical thinking.

    Directory of Open Access Journals (Sweden)

    Shevchenko Svetlana Nikolaevna

    2011-10-01

    Full Text Available Different approaches to understanding the concepts of motivation and motive are considered, and the types of motives in educational activity are analyzed. It is established that cognitive motives are the most effective for developing students' analytical thinking. The study used test data from students in grades 1-4. An interconnection was found between students' level of academic achievement and their level of learning motivation. Directions for forming and maintaining students' cognitive motives in the learning process are identified. The formation and activation of students' cognitive motivation is affected by the content of the educational material, the organization of learning activities, and the style of teaching; each component contributes to students' motivation to study.

  1. Development of an analytical model to assess fuel property effects on combustor performance

    Science.gov (United States)

    Sutton, R. D.; Troth, D. L.; Miles, G. A.; Riddlebaugh, S. M.

    1987-01-01

    A generalized first-order computer model has been developed to analytically evaluate the potential effects of alternative fuels on gas turbine combustors. The model assesses the size, configuration, combustion reliability, and durability of the combustors required to meet performance and emission standards while operating on a broad range of fuels. Predictions predicated on combustor flow-field determinations by the model indicate that fuel chemistry, as defined by hydrogen content, exerts a significant influence on flame radiation, liner wall temperature, and smoke emission.

  2. Wind flow characteristics in the wakes of large wind turbines. Volume 1: Analytical model development

    Science.gov (United States)

    Eberle, W. R.

    1981-01-01

    A computer program to calculate the wake downwind of a wind turbine was developed. Turbine wake characteristics are useful for determining optimum arrays for wind turbine farms. The analytical model is based on the characteristics of a turbulent coflowing jet with modification for the effects of atmospheric turbulence. The program calculates overall wake characteristics, wind profiles, and power recovery for a wind turbine directly in the wake of another turbine, as functions of distance downwind of the turbine. The calculation procedure is described in detail, and sample results are presented to illustrate the general behavior of the wake and the effects of principal input parameters.
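    The abstract describes a coflowing-jet wake model; the report's actual formulation is not reproduced here. As an illustration of the kind of calculation involved, the sketch below uses the much simpler Jensen (top-hat) wake model, with an assumed wake decay constant and thrust coefficient.

    ```python
    import math

    def jensen_wake_deficit(ct, rotor_d, x, k=0.075):
        """Fractional wind-speed deficit at distance x downwind of a turbine.

        Jensen (top-hat) model: the wake radius grows linearly with decay
        constant k, and the deficit scales with the thrust coefficient ct.
        """
        if x <= 0:
            return 0.0
        r0 = rotor_d / 2.0          # rotor radius
        rw = r0 + k * x             # wake radius at distance x downwind
        return (1.0 - math.sqrt(1.0 - ct)) * (r0 / rw) ** 2

    # The deficit decays with downwind distance (80 m rotor, ct = 0.8):
    for x in (200.0, 400.0, 800.0):
        u_ratio = 1.0 - jensen_wake_deficit(0.8, 80.0, x)
        print(f"x = {x:5.0f} m  ->  u/u_inf = {u_ratio:.3f}")
    ```

    A wake-farm layout study would evaluate such a deficit at each downwind turbine position; the report's model additionally accounts for atmospheric turbulence, which this sketch omits.
    
    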

  3. Selection, Development and Results for The RESOLVE Regolith Volatiles Characterization Analytical System

    Science.gov (United States)

    Lueck, Dale E.; Captain, Janine E.; Gibson, Tracy L.; Peterson, Barbara V.; Berger, Cristina M.; Levine, Lanfang

    2008-01-01

    The RESOLVE project requires an analytical system to identify and quantitate the volatiles released from a lunar drill core sample as it is crushed and heated to 150 C. The expected gases and their range of concentrations were used to assess Gas Chromatography (GC) and Mass Spectrometry (MS), along with specific analyzers for use on this potential lunar lander. The ability of these systems to accurately quantitate water and hydrogen in an unknown matrix led to the selection of a small MEMS commercial process GC for use in this project. The modification, development, and testing of this instrument for the specific needs of the project are covered.

  4. Development of an analytical method for estimating the composition of NOx gas using ion chromatography

    International Nuclear Information System (INIS)

    Nitric acid is the solvent of choice for reprocessing spent nuclear fuel by the aqueous route. Hence the presence of NOx gas in all the off-gas streams is inevitable. Estimation of the composition of these gases is very important to evaluate the reaction mechanism of the dissolution step. This article briefly explains an analytical method, developed using ion chromatography, for estimating the composition of NOx gas during the reaction between sodium nitrate and nitric acid, which can be extended with the necessary changes to reprocessing applications in the PUREX dissolver system. (author)

  5. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper;

    2003-01-01

    Experimental modelling is an important tool for the study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models, and experiments are important for documentation of the validity of numerical tools. In other cases numerical tools can be applied for improvement of the reliability of physical model results. This paper demonstrates by examples that numerical modelling benefits in various ways from experimental studies (in large and small laboratory facilities). The examples range from very general hydrodynamic descriptions of wave phenomena to specific...

  6. Progress report on the development of remotely operated tools

    International Nuclear Information System (INIS)

    This report contains a number of individual trials reports based upon work conducted in aid of a programme of feasibility studies into the size reduction of radioactive contaminated solid waste. The work was directed towards the identification of acceptable remotely operated tools and the means of deploying them for dismantling operations in a radioactive environment. Reliability, ease of maintenance, change of tool bits and common power sources have been major considerations in the trials assessments. Alternative end effector drive systems have also been considered when defining suitable manipulative capabilities and attention has also been directed towards a remotely controlled tool changing capability. (author)

  7. Determination of thin noble metal layers using laser ablation ICP-MS: An analytical tool for NobleChem technology

    International Nuclear Information System (INIS)

    Intergranular stress corrosion cracking (SCC) of reactor internals and recirculation piping is a matter of concern in boiling water reactors (BWR). SCC is basically an anodic dissolution of the metal grain boundaries if these are susceptible, either because of the failure to stress relieve welds in un-stabilized steel, where the grain boundaries become depleted in chromium, or under irradiation, where migration of chromium and other impurities away from or to the grain boundaries renders them sensitive to dissolution. To mitigate SCC, the electrochemical corrosion potential (ECP) of the structural materials in the BWR environment needs to be lowered by reducing the oxidants O2 and H2O2 through the injection of a sufficiently large amount of H2 into the feedwater (hydrogen water chemistry, HWC). This technique can be very effective, but it has the undesirable side effect of increasing the radiation level in the main steam by a factor of 4 to 5. NobleChem, developed and patented by General Electric Company, is a more effective method of achieving a low ECP value at lower hydrogen injection rates without the negative side effects of HWC. In this process noble metals (Pt, Rh) are injected into the feedwater (typically during the reactor shut-down), which then deposit on the structural component surfaces and on fuel. Noble metals are electrocatalysts that efficiently recombine O2 and H2O2 with H2 on the metal surface. With NobleChem/Low HWC, the component surface oxidant concentration becomes zero as soon as the bulk reactor water reaches a stoichiometric excess hydrogen condition. The SCC mitigation effectiveness of NobleChem is crucially dependent on achieving a sufficiently high noble metal concentration of ca. 0.1 μg/cm2 on the critical component and crack flank surfaces. In order to study and understand the transport, (re-)distribution and deposition behaviour of the noble metals in the reactor coolant circuit and to control the SCC mitigation effectiveness of NobleChem, analytical methods determining the local Pt and Rh

  8. Development of Time Management Tools for Project Oriented Engineering Education

    OpenAIRE

    Fabrés i Anglí, Josep

    2008-01-01

    This thesis adapts project time management tools and techniques to project-oriented engineering education. First, the importance of project time management is studied, starting from the book "A Guide to the Project Management Body of Knowledge" (PMBOK® Guide). The increasing implementation of project-oriented engineering education in universities is also investigated. To adapt the project time management tools and techniques, there are defined the s...

  9. Towards the Development of Web-based Business intelligence Tools

    DEFF Research Database (Denmark)

    Georgiev, Lachezar; Tanev, Stoyan

    2011-01-01

    This paper focuses on using web search techniques in examining the co-creation strategies of technology-driven firms. It does not focus on the co-creation results but describes the implementation of a software tool using data mining techniques to analyze the content on firms' websites. The tool was used in other research projects to identify the ways firms engage in value co-creation with customers.

  10. [Splitting into two lines: The historical development of the analytical and the gas ultracentrifuge].

    Science.gov (United States)

    Helmbold, Bernd; Forstner, Christian

    2015-12-01

    In a historical perspective the ultracentrifuge is often taken as a perfect example of a research technology according to Shinn and Joerges (Shinn and Joerges 2000, 2002). Research technologies are defined by a generic device, its own metrology and the interstitiality of the historical actors connected with the device. In our paper we give a detailed analysis of the development of the ultracentrifuge and thereby reveal two different lines of development: analytical ultracentrifuges and gas ultracentrifuges used for isotope separation. Surprisingly, we could not find any interstitial and transversal connections for these two lines. The lines end up with two different devices based on two different technical concepts. Moreover, the great majority of the actors stick to one line. These results are in accordance with other authors, who developed the concept of research technologies further and tried to sharpen their definition. PMID:26572680

  11. Implementation of a log file analysis tool for ATM dispenser module actions (ATM取款机机芯动作LOG文件解析工具的实现)

    Institute of Scientific and Technical Information of China (English)

    何惠英; 李纪红; 俞妍; 沈虹

    2013-01-01

    To reduce the skill demands placed on general maintenance personnel for bank self-service teller machines, a Chinese-language analysis tool for ATM dispenser-module run-log files was developed in Python. Analysis of roughly 500 GB of production-environment run data from nearly 20,000 machines shows that a parser built with the method described here can report machine operating performance quickly, accurately, and in detail, and can be used to diagnose and resolve machine faults.
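    The record above describes a Python log-file parser for dispenser-module actions. A minimal sketch of that kind of tool is shown below; the log line format, field names, and error codes are invented for illustration, since the actual ATM log format is proprietary.

    ```python
    import re
    from collections import Counter

    # Hypothetical log line: "2013-01-15 10:33:12 DISPENSE E013 cassette2"
    LINE_RE = re.compile(
        r"(?P<date>\d{4}-\d{2}-\d{2}) (?P<time>\d{2}:\d{2}:\d{2}) "
        r"(?P<action>\w+) (?P<code>E\d{3}|OK)(?: (?P<detail>.*))?"
    )

    def summarize(lines):
        """Count module actions and error codes to flag failing dispensers."""
        actions, errors = Counter(), Counter()
        for line in lines:
            m = LINE_RE.match(line)
            if not m:
                continue  # skip malformed lines
            actions[m["action"]] += 1
            if m["code"] != "OK":
                errors[m["code"]] += 1
        return actions, errors

    sample = [
        "2013-01-15 10:32:01 DISPENSE OK",
        "2013-01-15 10:33:12 DISPENSE E013 cassette2",
        "2013-01-15 10:35:40 RETRACT OK",
    ]
    acts, errs = summarize(sample)
    print(acts["DISPENSE"], dict(errs))   # 2 {'E013': 1}
    ```

    Aggregating such counters over a fleet of machines is what lets a tool like the one described surface per-machine performance and recurring fault codes.
    
    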

  12. Bayesian meta-analytical methods to incorporate multiple surrogate endpoints in drug development process.

    Science.gov (United States)

    Bujkiewicz, Sylwia; Thompson, John R; Riley, Richard D; Abrams, Keith R

    2016-03-30

    A number of meta-analytical methods have been proposed that aim to evaluate surrogate endpoints. Bivariate meta-analytical methods can be used to predict the treatment effect for the final outcome from the treatment effect estimate measured on the surrogate endpoint while taking into account the uncertainty around the effect estimate for the surrogate endpoint. In this paper, extensions to multivariate models are developed aiming to include multiple surrogate endpoints with the potential benefit of reducing the uncertainty when making predictions. In this Bayesian multivariate meta-analytic framework, the between-study variability is modelled in a formulation of a product of normal univariate distributions. This formulation is particularly convenient for including multiple surrogate endpoints and flexible for modelling the outcomes, which can be surrogate endpoints to the final outcome and potentially to one another. Two models are proposed: first, using an unstructured between-study covariance matrix by assuming the treatment effects on all outcomes are correlated and, second, using a structured between-study covariance matrix by assuming treatment effects on some of the outcomes are conditionally independent. While the two models are developed for summary data at the study level, the individual-level association is taken into account by the use of Prentice's criteria (obtained from individual patient data) to inform the within-study correlations in the models. The modelling techniques are investigated using an example in relapsing-remitting multiple sclerosis where the disability worsening is the final outcome, while relapse rate and MRI lesions are potential surrogates to the disability progression. PMID:26530518
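    The core of surrogate-endpoint prediction described above is conditioning on the observed surrogate effect. The sketch below is not the paper's full Bayesian multivariate model; it shows only the bivariate-normal conditional that underlies such predictions, with all numbers invented for illustration.

    ```python
    import math

    def predict_final_effect(d1, mu1, mu2, s1, s2, rho):
        """Predict the treatment effect on the final outcome given the
        observed surrogate effect d1, using the bivariate-normal conditional:
            d2 | d1 ~ N(mu2 + rho*(s2/s1)*(d1 - mu1), s2^2*(1 - rho^2))
        """
        mean = mu2 + rho * (s2 / s1) * (d1 - mu1)
        sd = s2 * math.sqrt(1.0 - rho ** 2)
        return mean, sd

    # A strong surrogate (rho = 0.9) shrinks predictive uncertainty well
    # below the marginal sd (0.3) of the final outcome:
    mean, sd = predict_final_effect(d1=-0.5, mu1=-0.3, mu2=-0.2,
                                    s1=0.2, s2=0.3, rho=0.9)
    print(round(mean, 3), round(sd, 3))   # -0.47 0.131
    ```

    Adding further surrogates, as the paper does, tightens this conditional distribution again, which is exactly the claimed benefit of the multivariate extension.
    
    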

  13. Development of an analytical scheme for the determination of pyrethroid pesticides in composite diet samples.

    Science.gov (United States)

    Vonderheide, Anne P; Kauffman, Peter E; Hieber, Thomas E; Brisbin, Judith A; Melnyk, Lisa Jo; Morgan, Jeffrey N

    2009-03-25

    Analysis of an individual's total daily food intake may be used to determine aggregate dietary ingestion of given compounds. However, the resulting composite sample represents a complex mixture, and measurement of such can often prove to be difficult. In this work, an analytical scheme was developed for the determination of 12 select pyrethroid pesticides in dietary samples. In the first phase of the study, several cleanup steps were investigated for their effectiveness in removing interferences in samples with a range of fat content (1-10%). Food samples were homogenized in the laboratory, and preparatory techniques were evaluated through recoveries from fortified samples. The selected final procedure consisted of a lyophilization step prior to sample extraction. A sequential 2-fold cleanup procedure of the extract included diatomaceous earth for removal of lipid components followed with a combination of deactivated alumina and C(18) for the simultaneous removal of polar and nonpolar interferences. Recoveries from fortified composite diet samples (10 microg kg(-1)) ranged from 50.2 to 147%. In the second phase of this work, three instrumental techniques [gas chromatography-microelectron capture detection (GC-microECD), GC-quadrupole mass spectrometry (GC-quadrupole-MS), and GC-ion trap-MS/MS] were compared for greatest sensitivity. GC-quadrupole-MS operated in selective ion monitoring (SIM) mode proved to be most sensitive, yielding method detection limits of approximately 1 microg kg(-1). The developed extraction/instrumental scheme was applied to samples collected in an exposure measurement field study. The samples were fortified and analyte recoveries were acceptable (75.9-125%); however, compounds coextracted from the food matrix prevented quantitation of four of the pyrethroid analytes in two of the samples considered. PMID:19292459
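    The fortification recoveries quoted above follow the standard spike-recovery calculation. The sketch below shows that arithmetic; the measured and background values are invented, with only the 10 µg/kg fortification level and the 50.2-147% acceptance range taken from the abstract.

    ```python
    def percent_recovery(measured, background, fortified):
        """Spike recovery (%): analyte found above background, over the
        amount added to the sample."""
        return 100.0 * (measured - background) / fortified

    # Hypothetical fortification at 10 µg/kg over a 0.4 µg/kg matrix
    # background; the abstract reports recoveries of 50.2-147% at this level.
    rec = percent_recovery(measured=8.1, background=0.4, fortified=10.0)
    print(round(rec, 1))   # 77.0 -> inside the reported range
    ```
    
    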

  14. Modeling decision making as a support tool for policy making on renewable energy development

    International Nuclear Information System (INIS)

    This paper presents the findings of a study on decision making models for the analysis of capital-risk investors' preferences on biomass power plant projects. The aim of the work is to improve the support tools for policy makers in the field of renewable energy development. Analytic Network Process (ANP) helps to better understand capital-risk investors' preferences towards different kinds of biomass-fueled power plants. The results of the research allow public administration to better foresee the investors' reaction to the incentive system, or to modify the incentive system to better drive investors' decisions. Changing the incentive system is seen as a major risk by investors. Therefore, public administration must design better and longer-term incentive systems, forecasting market reactions. For that, two scenarios have been designed, one showing a typical decision making process and another proposing an improved decision making scenario. A case study conducted in Italy has revealed that ANP allows understanding how capital-risk investors interpret the situation and make decisions when investing on biomass power plants; the differences between the public administration's and promoters' interests; how decision making could be influenced by adding new decision criteria; and which case would be ranked best according to the decision models. - Highlights: • We applied ANP to the investors' preferences on biomass power plant projects. • The aim is to improve the advising tools for renewable energy policy making. • A case study has been carried out with the help of two experts. • We designed two scenarios: decision making as it is and how it could be improved. • Results prove ANP is a fruitful tool enhancing participation and transparency
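    ANP, mentioned above, builds on pairwise comparison matrices whose principal eigenvector gives the priority weights of the alternatives. The sketch below shows only that building block (the full ANP supermatrix is beyond a short example), with hypothetical comparison judgements for three biomass plant projects.

    ```python
    def priorities(pairwise):
        """Priority vector of a pairwise-comparison matrix (the AHP/ANP
        building block), via power iteration toward the principal
        eigenvector."""
        n = len(pairwise)
        w = [1.0 / n] * n
        for _ in range(100):
            v = [sum(pairwise[i][j] * w[j] for j in range(n)) for i in range(n)]
            total = sum(v)
            w = [x / total for x in v]   # renormalize each iteration
        return w

    # Hypothetical judgements on one criterion: project A is preferred 3:1
    # over B and 5:1 over C; B is preferred 2:1 over C.
    m = [[1.0,   3.0, 5.0],
         [1/3.0, 1.0, 2.0],
         [1/5.0, 1/2.0, 1.0]]
    w = priorities(m)
    print([round(x, 3) for x in w])   # A gets the largest weight
    ```

    In a full ANP model these local priority vectors are assembled into a supermatrix that also captures interdependence between criteria, which is what distinguishes ANP from plain AHP.
    
    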

  15. Development of a multi-residue analytical method for TBBP-A and PBDEs in various biological matrices using unique reduced size sample

    Energy Technology Data Exchange (ETDEWEB)

    Andre, F.; Cariou, R.; Antignac, J.P.; Le Bizec, B. [Ecole Nationale Veterinaire de Nantes (FR). Laboratoire d' Etudes des Residus et Contaminants dans les Aliments (LABERCA); Debrauwer, L.; Zalko, D. [Institut National de Recherches Agronomiques (INRA), 31-Toulouse (France). UMR 1089 Xenobiotiques

    2004-09-15

    The impact of brominated flame retardants on the environment and their potential risk for animal and human health is a present concern for the scientific community. Numerous studies related to the detection of tetrabromobisphenol A (TBBP-A) and polybrominated diphenylethers (PBDEs) have been developed over the last few years; they were mainly based on GC-ECD, GC-NCI-MS or GC-EI-HRMS, and recently GC-EI-MS/MS. The sample treatment is usually derived from the analytical methods used for dioxins, but recently some authors proposed the utilisation of solid phase extraction (SPE) cartridges. In this study, a new analytical strategy is presented for the multi-residue analysis of TBBP-A and PBDEs from a unique reduced-size sample. The main objective of this analytical development is its application to the assessment of the background exposure of French population groups to brominated flame retardants, for which, to our knowledge, no data exist. A second objective is to provide an efficient analytical tool to study the transfer of these contaminants through the environment to living organisms, including degradation reactions and metabolic biotransformations.

  16. Development and validation of analytical methodology with focus on the qualification of powder mixers

    Directory of Open Access Journals (Sweden)

    Pedro de Freitas Fiorante

    2015-06-01

    Full Text Available This study aims at developing an analytical procedure capable of quantifying the ferric oxide present in the mixture of ferric oxide/lactose monohydrate (0.4% w/w). The analytical procedure was checked for specificity, linearity, precision (system repeatability, procedure repeatability and intermediate precision), accuracy, stability of solutions and robustness of the procedure. The concentration of Fe(III) was determined by spectrophotometry at 480 nm based on calibration curves. The specificity was verified. The linearity was obtained in the range of 11.2 to 16.8 µg of ferric oxide/mL. The relative standard deviation (RSD of the system repeatability, procedure repeatability and intermediate precision were not more than 2%. The RSD of the accuracy values was less than 0.75%. The stability of the samples was checked over a 24-hour assay. In the robustness evaluation, the wavelength and the concentration of hydrochloric acid were varied. The maximum absorbance deviation due to wavelength variation was 0.14%, and the maximum deviation due to the hydrochloric acid concentration variation was 2.4%, indicating that the concentration of hydrochloric acid is critical to the analysis of ferric oxide. The procedure developed was validated and is suitable to the performance qualification of powder mixers.
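    Two of the acceptance checks described above (linearity of the 480 nm calibration curve and precision as RSD ≤ 2%) reduce to short calculations. The sketch below illustrates them; the absorbance readings are invented, with only the 11.2-16.8 µg/mL range and the 2% criterion taken from the abstract.

    ```python
    import statistics

    def rsd_percent(values):
        """Relative standard deviation (%), the precision metric used above."""
        return 100.0 * statistics.stdev(values) / statistics.mean(values)

    def linear_r(xs, ys):
        """Pearson correlation coefficient of a calibration curve."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sxx = sum((x - mx) ** 2 for x in xs)
        syy = sum((y - my) ** 2 for y in ys)
        return sxy / (sxx * syy) ** 0.5

    # Hypothetical calibration (µg/mL vs. absorbance at 480 nm) and six
    # replicate readings of one standard:
    conc = [11.2, 12.6, 14.0, 15.4, 16.8]
    absb = [0.281, 0.316, 0.348, 0.385, 0.419]
    reps = [0.349, 0.347, 0.350, 0.346, 0.348, 0.351]

    print(round(linear_r(conc, absb), 4))   # close to 1: linear response
    print(rsd_percent(reps) <= 2.0)         # precision criterion met: True
    ```
    
    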

  17. Conceptual air sparging decision tool in support of the development of an air sparging optimization decision tool

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    The enclosed document describes a conceptual decision tool (hereinafter, Tool) for determining applicability of and for optimizing air sparging systems. The Tool was developed by a multi-disciplinary team of internationally recognized experts in air sparging technology, led by a group of project and task managers at Parsons Engineering Science, Inc. (Parsons ES). The team included Mr. Douglas Downey and Dr. Robert Hinchee of Parsons ES, Dr. Paul Johnson of Arizona State University, Dr. Richard Johnson of Oregon Graduate Institute, and Mr. Michael Marley of Envirogen, Inc. User Community Panel Review was coordinated by Dr. Robert Siegrist of Colorado School of Mines (also of Oak Ridge National Laboratory) and Dr. Thomas Brouns of Battelle/Pacific Northwest Laboratory. The Tool is intended to provide guidance to field practitioners and environmental managers for evaluating the applicability and optimization of air sparging as a remedial action technique.

  18. Development Of The Flight Crew Human Factors Integration Tool

    OpenAIRE

    Gosling, Geoffrey D.; Roberts, Karlene H.

    1998-01-01

    In May 1996, the FAA announced a new and innovative approach to reach a goal of "zero accidents," known as the Global Analysis and Information Network (GAIN). This would be a privately owned and operated international information infrastructure for the collection, analysis, and dissemination of aviation safety information. It would involve the use of a broad variety of worldwide aviation data sources, coupled with comprehensive analytical techniques, to facilitate the identification of existi...

  19. Development of analytical technique of alteration minerals formed in bentonite by the reaction with alkaline solution

    International Nuclear Information System (INIS)

    Bentonite will be used as a buffer material, according to the TRU waste disposal concept in Japan, to retard radionuclide migration, to restrict seepage of ground water and to filtrate colloids. One of the concerns about the buffer material is the long-term alteration of bentonite with cementitious material. Long-term alteration of bentonite-based materials with alkaline solution has been studied by means of analytical approaches, coupling mass transport and chemical reactions, which suggest changes in various properties of buffer materials. Long-term performance assessment of engineered barriers under disposal conditions is important to achieve a reasonable design, eliminating excessive conservatism in the safety assessment. Therefore it is essential for improving the reliability of the performance assessment to verify the analytical results through alteration tests and/or natural analogues. The geochemical analyses indicate that major alteration reactions involve dissolution of portlandite, chalcedony and montmorillonite and formation of C-S-H gel and analcime at the interface between cement and bentonite. However, in the alteration tests assuming interaction between bentonite and cement, secondary minerals due to alteration under the expected condition for geological disposal (equilibrated water with cement at low liquid/solid ratio) had not been observed, though the alteration was observed under accelerated hyper-alkaline and high-temperature conditions. The reason is considered to be that it is difficult to analyze C-S-H gel formed at the interface because of its small quantity. One example is Kunigel V1, a potential buffer material in Japan, which consists of montmorillonite, chalcedony, plagioclase, and calcite. In the XRD analysis of Kunigel V1, the locations of the primary peak of the calcite and that of the C-S-H gel overlap, which makes identification of small quantities of C-S-H gel formed as a secondary mineral difficult. Thus development of

  20. USGS ShakeMap Developments, Implementation, and Derivative Tools

    Science.gov (United States)

    Wald, D. J.; Lin, K.; Quitoriano, V.; Worden, B.

    2007-12-01

    ) system. These global ShakeMaps are constrained by rapidly gathered intensity data via the Internet and by finite fault and aftershock analyses for portraying fault rupture dimensions. As part of the PAGER loss calibration process we have produced an Atlas of ShakeMaps for significant earthquakes around the globe since 1973 (Allen and others, this Session); these Atlas events have additional constraints provided by archival strong motion, faulting dimensions, and macroseismic intensity data. We also describe derivative tools for further utilizing ShakeMap, including ShakeCast, a fully automated system for delivering specific ShakeMap products to critical users and triggering established post-earthquake response protocols. We have released ShakeCast Version 2.0 (Lin and others, this Session), which allows RSS feeds for automatically receiving ShakeMap files, auto-launching of post-download processing scripts, and delivering notifications based on users' likely facility damage states derived from ShakeMap shaking parameters. As part of our efforts to produce estimated ShakeMaps globally, we have developed a procedure for deriving Vs30 estimates from correlations with topographic slope, and we have now implemented a global Vs30 Server, allowing users to generate Vs30 maps for custom user-selected regions around the globe (Allen and Wald, this Session). Finally, as a further derivative product of the ShakeMap Atlas project, we will present a shaking hazard map for the past 30 years based on approximately 3,900 ShakeMaps of historic earthquakes.
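    The topographic-slope proxy for Vs30 mentioned above amounts to a binned lookup from terrain gradient to site class. The sketch below illustrates the idea only; the slope breakpoints and Vs30 values are placeholders, not the published correlation coefficients.

    ```python
    # Illustrative slope-to-Vs30 lookup in the spirit of the topographic-slope
    # proxy described above. All breakpoints and Vs30 values here are
    # placeholders for illustration, NOT the published correlations.
    SLOPE_BINS = [              # (max slope in m/m, representative Vs30 in m/s)
        (1.0e-4, 180.0),        # flat basins: soft soil
        (2.0e-3, 270.0),        # gentle slopes: stiff soil
        (2.0e-2, 410.0),        # moderate slopes: very dense soil / soft rock
        (float("inf"), 760.0),  # steep terrain: rock
    ]

    def vs30_from_slope(slope):
        """Map a topographic gradient (m/m) to an approximate Vs30 class."""
        for max_slope, vs30 in SLOPE_BINS:
            if slope <= max_slope:
                return vs30
        raise ValueError("unreachable: last bin is open-ended")

    # Flat terrain maps to soft soil; steep terrain maps to rock:
    print(vs30_from_slope(5.0e-5), vs30_from_slope(0.05))   # 180.0 760.0
    ```

    A Vs30 server like the one described would apply such a mapping (with region-appropriate coefficients) to a digital elevation model over the user-selected area.
    
    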