WorldWideScience

Sample records for analytical tool development

  1. Cryogenic Propellant Feed System Analytical Tool Development

    Science.gov (United States)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

    The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its numerical performance and its ability to directly call the real fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front end was implemented to make PFSAT conveniently portable among a wide variety of potential users and to provide a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control systems and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.
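
    As a rough illustration of the kind of calculation PFSAT automates, the sketch below (hypothetical function name and property values, not PFSAT code) estimates steady conductive heat leak through a cylindrical insulation layer on a feed line:

        # Minimal sketch, not PFSAT: steady radial conduction through cylindrical
        # insulation; all names and values below are hypothetical.
        import math

        def radial_heat_leak(k_ins, length_m, r_in_m, r_out_m, t_hot_k, t_cold_k):
            """Conductive heat leak (W) through a cylindrical insulation layer."""
            return (2.0 * math.pi * k_ins * length_m * (t_hot_k - t_cold_k)
                    / math.log(r_out_m / r_in_m))

        # Example: 5 m of insulated liquid-hydrogen feed line.
        q = radial_heat_leak(k_ins=1e-4, length_m=5.0, r_in_m=0.02,
                             r_out_m=0.03, t_hot_k=300.0, t_cold_k=20.0)
        print(f"heat leak: {q:.3f} W")

    A real tool such as PFSAT layers real-fluid properties (REFPROP), line supports, penetrations, and instrumentation on top of this basic balance.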

  2. Social Data Analytics Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    This paper presents the design, development and demonstrative case studies of the Social Data Analytics Tool, SODATO. Adopting Action Design Framework [1], the objective of SODATO [2] is to collect, store, analyze, and report big social data emanating from the social media engagement of and social...

  3. Social Capital: An Analytical Tool for Exploring Lifelong Learning and Community Development. CRLRA Discussion Paper.

    Science.gov (United States)

    Kilpatrick, Sue; Field, John; Falk, Ian

    The possibility of using the concept of social capital as an analytical tool for exploring lifelong learning and community development was examined. The following were among the topics considered: (1) differences between definitions of the concept of social capital that are based on collective benefit and those that define social capital as a…

  4. Development of computer-based analytical tool for assessing physical protection system

    Energy Technology Data Exchange (ETDEWEB)

    Mardhi, Alim, E-mail: alim-m@batan.go.id [National Nuclear Energy Agency Indonesia, (BATAN), PUSPIPTEK area, Building 80, Serpong, Tangerang Selatan, Banten (Indonesia); Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand); Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com [Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand)

    2016-01-22

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to approach likely threat scenarios. Several tools, such as EASI and SAPE, are readily available; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodology to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the system's probability of effectiveness as a performance measure.
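
    The network idea described above can be sketched briefly (illustrative only, not the authors' code; the graph layout and detection probabilities are hypothetical). Representing protection elements as weighted edges lets a shortest-path search find the adversary path with the lowest cumulative detection probability:

        # Minimal sketch of adversary-path analysis on a facility graph.
        import math
        import networkx as nx

        G = nx.DiGraph()
        # (from, to, detection probability of the element on that segment)
        for u, v, p in [("outside", "fence", 0.5), ("fence", "door", 0.7),
                        ("outside", "gate", 0.9), ("gate", "door", 0.3),
                        ("door", "vault", 0.8)]:
            # Shortest path on -log(1 - p) minimizes cumulative detection.
            G.add_edge(u, v, p=p, w=-math.log(1.0 - p))

        path = nx.shortest_path(G, "outside", "vault", weight="w")
        p_miss = math.prod(1.0 - G[u][v]["p"] for u, v in zip(path, path[1:]))
        print("most critical path:", " -> ".join(path))
        print(f"detection probability along it: {1.0 - p_miss:.2f}")

    A full assessment would also weigh delay times against response-force timing, as in EASI-style probability-of-interruption calculations.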

  5. Development of computer-based analytical tool for assessing physical protection system

    Science.gov (United States)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to approach likely threat scenarios. Several tools, such as EASI and SAPE, are readily available; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodology to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the system's probability of effectiveness as a performance measure.

  6. Development of computer-based analytical tool for assessing physical protection system

    International Nuclear Information System (INIS)

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to approach likely threat scenarios. Several tools, such as EASI and SAPE, are readily available; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodology to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the system's probability of effectiveness as a performance measure.

  7. Analytical Hierarchy Process for Developing a Building Performance-Risk Rating Tool

    Directory of Open Access Journals (Sweden)

    Khalil Natasha

    2016-01-01

    The need to optimize the performance of buildings has increased with the expanding supply of facilities in higher education buildings (HEB). Proper performance assessment, as a proactive measure, may help university buildings achieve performance optimization. However, current maintenance programs and performance evaluation in HEB are systemic and cyclic processes in which maintenance is treated as an operational issue rather than a strategic one. Hence, this paper proposes a Building Performance Risk Rating Tool (BPRT) as an improved measure for building performance evaluation that addresses users' risk in health and safety aspects. The BPRT is developed from the result of a rating index using the Analytical Hierarchy Process (AHP) method. Twelve facilities management (FM) experts and practitioners were involved in the rating process. The subjective weightings were analysed using the AHP software Expert Choice 11. The BPRT is introduced as an aid to improving the current performance assessment of HEB by merging the concepts of building performance and risk into a numerical strategic approach.
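
    The AHP weighting step behind such a rating tool can be illustrated in a few lines (a minimal sketch with a hypothetical 3x3 judgment matrix, not the Expert Choice computation):

        # Minimal AHP sketch: priority weights from the principal eigenvector,
        # plus Saaty's consistency ratio. The judgment matrix is hypothetical.
        import numpy as np

        A = np.array([[1.0, 3.0, 5.0],       # pairwise judgments, Saaty 1-9 scale
                      [1/3., 1.0, 2.0],
                      [1/5., 1/2., 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)           # principal eigenvalue
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                          # priority weights, sum to 1

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)  # consistency index
        cr = ci / 0.58                        # random index RI = 0.58 for n = 3
        print("weights:", w.round(3), "CR:", round(cr, 3))  # CR < 0.1 acceptable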

  8. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    Science.gov (United States)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure that the uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for application to a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining connection to, and communicating, the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization of the spatial data analyses and a quantification of the underlying uncertainties, which can be calculated from sample density, sample variance, interpolation error, or uncertainty derived from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate the team's probabilistic VGM approach with ESRI ArcMap. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation
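
    The core VGM idea can be sketched in plain NumPy, leaving out the Hadoop/Spark machinery (illustrative only; the data, grid size, and standard-error uncertainty proxy are assumptions, not the team's implementation):

        # Bin sparse point samples into grid cells; report per-cell estimate and
        # an uncertainty proxy tied to sample density and variance.
        import numpy as np

        rng = np.random.default_rng(0)
        x, y = rng.uniform(0, 10, 500), rng.uniform(0, 10, 500)
        z = np.sin(x) + 0.1 * rng.standard_normal(500)   # point observations

        ncell = 5                                        # coarse analysis grid
        ix = np.minimum((x / 10 * ncell).astype(int), ncell - 1)
        iy = np.minimum((y / 10 * ncell).astype(int), ncell - 1)

        for i in range(ncell):
            for j in range(ncell):
                cell = z[(ix == i) & (iy == j)]
                if cell.size > 1:
                    sem = cell.std(ddof=1) / np.sqrt(cell.size)  # std. error
                    print(i, j, round(cell.mean(), 2), round(sem, 3))

    In the actual VGM, cell resolution varies with data support, so well-sampled regions get fine cells and sparse regions get coarse ones.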

  9. Use of analytic hierarchy process (AHP) as an instrument to develop a solid waste management assessment tool

    OpenAIRE

    Batagarawa, Rabia; Williams, John Barry; Potts, Jonathan Stephenson; Brown, Julia Catherine

    2015-01-01

    The aim of this paper is to evaluate the feasibility of the Analytic Hierarchy Process (AHP) as a data collection instrument in developing a solid waste management assessment tool. AHP is a quantitative tool that provides an effective and precise means of choosing among options; it is evident in many disciplines, such as waste management, where priority scales measure elements in relative terms. The procedure is performed using Expert Choice software. A structured questionnaire survey was employed to obtain data...

  10. Development of integrated analytical tools for level-2 PSA of LMFBR

    International Nuclear Information System (INIS)

    As with light water reactors, JNES (Japan Nuclear Energy Safety Organization) has worked to prepare PSA analysis tools for liquid-metal cooled fast breeder reactors (LMFBR) to support safety evaluation from the regulatory side. The developed tools consist of a group of safety analysis computer codes and an analysis method called PRD (Phenomenological Relationship Diagram) used to determine logically the probability distributions at the branching points in event trees. So far the tools have been used to evaluate the effectiveness of the accident management measures of Monju proposed by the owner, and the tools are under further development to describe event progression more realistically. One objective of this improvement is to construct databases for the Emergency Response Support System (ERSS) for Monju by conducting many analyses of conceivable scenarios following initiating events. The present paper introduces the function of each tool in the synthetic analysis system, coupled with the accident scenario, and presents points for future improvement. The phase transitions of LMFBR severe accidents and the role of each analysis tool are shown. In (i) the plant response phase, the temperature of sodium in the primary cooling system begins to rise due to the power-to-flow mismatch. In cases of gradual temperature increase, such as PLOHS (protected loss-of-heat sink), the sodium boundary will fail by high-temperature creep. If boundary failure does not occur, the sodium will eventually boil. The temperature and pressure changes during the plant response phase are analyzed by the NALAP-II code. NALAP-II also calculates the SCDF (structural cumulative defect factor), an index of high-temperature creep, at key locations in the plant; however, the application is limited to parts whose geometry can be modeled as a cylindrical wall. Hence, for the analysis of components with complicated shape that require the consideration of buckling, structure

  11. Development of Integrated Analytical Tools for Level-2 PSA of LMFBR

    International Nuclear Information System (INIS)

    JNES has developed its own safety analysis methods for LMFBR so that safety analyses can be made independently of the applicant, in support of the regulatory body. These computer codes cover the plant response phase, the core disruption phase, and the containment vessel response phase of severe accidents. In addition to the codes, the PRD (Phenomenological Relationship Diagram) method was devised as a logical method to identify the probability distributions at branching points in event trees for level-2 PSA. After validation of these codes using various experimental data and many trial calculations for actual reactor systems, the prepared tools were applied to the level-2 PSA of Monju to evaluate the effectiveness of the accident management measures of Monju. (author)

  12. On the Design, Development and Use of the Social Data Analytics Tool (SODATO)

    DEFF Research Database (Denmark)

    Hussain, Abid

    Science, Computational Social Science and Information Systems, the PhD project addressed two general research questions about the technological architectures and design principles for big social data analytics in an organisational context. The PhD project is grounded in the theory of socio-technical interactions for better understanding perception of, and action on, the screen when individuals use social media platforms such as Facebook. Based on the theory of socio-technical interactions, a conceptual model of social data was generated. This conceptual model of social data consists of two... consists of the communicative and linguistic aspects of the social media interaction, such as the topics discussed, keywords mentioned, pronouns used and sentiments expressed. The conceptual model of social data is then used to specify the formal model of social data using the mathematics of set theory...

  13. An analytical model for resistivity tools

    Energy Technology Data Exchange (ETDEWEB)

    Hovgaard, J.

    1991-04-01

    An analytical model for resistivity tools is developed. It takes into account the effect of the borehole and the actual shape of the electrodes. The model is two-dimensional, i.e. it does not deal with eccentricity. The electrical potential around a current source satisfies Poisson's equation. The method used here to solve Poisson's equation is the expansion of the potential function in terms of a complete set of functions involving one of the coordinates, with coefficients which are undetermined functions of the other coordinate. Numerical examples of the use of the model are presented. The results are compared with results given in the literature. (au).
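
    The expansion technique the abstract refers to can be sketched as follows (a generic illustration in cylindrical borehole coordinates, not the paper's exact notation). Away from the source, expanding the potential in a complete set of functions of z reduces the field equation to ordinary differential equations in r:

        \[
        \Phi(r,z) = \sum_{n} f_n(r)\cos(k_n z), \qquad
        \frac{1}{r}\frac{d}{dr}\!\left(r\,\frac{df_n}{dr}\right) - k_n^2 f_n = 0,
        \]

    whose solutions are the modified Bessel functions I_0(k_n r) and K_0(k_n r); the undetermined coefficients are then fixed by the boundary conditions at the borehole wall and on the electrode surfaces.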

  14. Development of reliable analytical tools for evaluating the influence of reductive winemaking on the quality of Lugana wines.

    Science.gov (United States)

    Mattivi, Fulvio; Fedrizzi, Bruno; Zenato, Alberto; Tiefenthaler, Paolo; Tempesta, Silvano; Perenzoni, Daniele; Cantarella, Paolo; Simeoni, Federico; Vrhovsek, Urska

    2012-06-30

    This paper presents the development of important analytical tools, namely sensitive and rapid methods for analysing reduced and oxidised glutathione (GSH and GSSG), hydroxycinnamic acids (HCA), bound thiols (GSH-3MH and Cys-3MH) and free thiols (3MH and 3MHA), and their first application to evaluate the effect of reductive winemaking on the composition of Lugana juices and wines. Lugana is a traditional white wine from the Lake Garda region (Italy), produced using a local grape variety, Trebbiano di Lugana. An innovative winemaking procedure based on preliminary cooling of grape berries followed by crushing in an inert environment was implemented and explored on a winery scale. The effects of these procedures on hydroxycinnamic acids, GSH, GSSG, free and bound thiols and flavanol content were investigated. The juices and wines produced using different protocols were examined, as were wines aged in tanks for 1, 2 and 3 months. Particularly interesting was the high level of GSH found in Lugana grapes, which can act as a natural antioxidant and be preserved in must and young wines, thus reducing the need for exogenous antioxidants. Moreover, it was clear that polyphenol concentrations (hydroxycinnamic acids and catechins) were strongly influenced by winemaking and pressing conditions, which required fine tuning of pressing. Above-threshold levels of 3-mercaptohexan-1-ol (3MH) and 3-mercaptohexyl acetate (3MHA) were found in the wines and changed according to the winemaking procedure applied. Interestingly, their evolution during the first three months also varied depending on the procedure adopted. Organic synthesis of the cysteine and glutathione conjugates was carried out, and juices and wines were subjected to LC-MS/MS analysis. These two molecules appeared to be strongly affected by the winemaking procedure, but did not show any significant change during the first 3 months of post-bottling ageing. This supports the theory

  15. Analytic tools for information warfare

    Energy Technology Data Exchange (ETDEWEB)

    Vandewart, R.L.; Craft, R.L.

    1996-05-01

    Information warfare and system surety (tradeoffs between system functionality, security, safety, reliability, cost, usability) have many mechanisms in common. Sandia's experience has shown that an information system must be assessed from a system perspective in order to adequately identify and mitigate the risks present in the system. While some tools are available to help in this work, the process is largely manual. An integrated, extensible set of assessment tools would help the surety analyst. This paper describes one approach to surety assessment used at Sandia, identifies the difficulties in this process, and proposes a set of features desirable in an automated environment to support this process.

  16. Internet promotion tools and techniques: analytical review

    Directory of Open Access Journals (Sweden)

    S.M. Illiashenko

    2015-09-01

    The aim of the article. The aim of the article is to analyse and systematize modern Internet marketing communication tools and to develop recommendations for managing them to promote products in a virtual environment while maintaining a high level of communication with economic partners and contact groups. The results of the analysis. A systematic analysis and systematization of the known Internet marketing tools was carried out. The authors divide them into eight categories by functionality: Search Engine Marketing, Internet advertising, Social Relationship Marketing, Viral Marketing, Video Marketing, E-mail Marketing, Innovative Marketing and Analytical Marketing. Recommendations for the use of these tools by companies of various sizes are proposed, and the most popular Internet instruments for product promotion are noted. By the results of the analysis, the communication instruments of Internet marketing are divided into 4 groups, which are closely interrelated. Their combined use leads to a synergistic effect that appears as profit growth, consumer interest and the creation of a positive company image. Today the once-forgotten method of communication, E-mail Marketing, along with interactive infographics, communications in the form of stories, marketing in social networks and Analytical Marketing, has acquired unexpected development. These instruments satisfy the needs of companies (the possibility of solid presentation, an active communication link and its precise measurement) and consumers (interesting content, supported by visual images and information on request). Conclusions and directions for future research. The results can be used as methodological assistance in choosing rational sets of Internet marketing instruments that take into account the specificity of a production company (seller), its products, market and target audience. Future research should be directed at the detection of inexpensive but effective Internet communication tools, detection

  17. Aptamers: molecular tools for analytical applications.

    Science.gov (United States)

    Mairal, Teresa; Ozalp, Veli Cengiz; Lozano Sánchez, Pablo; Mir, Mònica; Katakis, Ioanis; O'Sullivan, Ciara K

    2008-02-01

    Aptamers are artificial nucleic acid ligands, specifically generated against certain targets, such as amino acids, drugs, proteins or other molecules. In nature they exist as a nucleic acid based genetic regulatory element called a riboswitch. For generation of artificial ligands, they are isolated from combinatorial libraries of synthetic nucleic acid by exponential enrichment, via an in vitro iterative process of adsorption, recovery and reamplification known as systematic evolution of ligands by exponential enrichment (SELEX). Thanks to their unique characteristics and chemical structure, aptamers offer themselves as ideal candidates for use in analytical devices and techniques. Recent progress in the aptamer selection and incorporation of aptamers into molecular beacon structures will ensure the application of aptamers for functional and quantitative proteomics and high-throughput screening for drug discovery, as well as in various analytical applications. The properties of aptamers as well as recent developments in improved, time-efficient methods for their selection and stabilization are outlined. The use of these powerful molecular tools for analysis and the advantages they offer over existing affinity biocomponents are discussed. Finally the evolving use of aptamers in specific analytical applications such as chromatography, ELISA-type assays, biosensors and affinity PCR as well as current avenues of research and future perspectives conclude this review. PMID:17581746

  18. Analytical Web Tool for CERES Products

    Science.gov (United States)

    Mitrescu, C.; Chu, C.; Doelling, D.

    2012-12-01

    The CERES project provides the community with climate-quality observed TOA fluxes, consistent cloud properties, and computed profile and surface fluxes. The 11-year data set has proven invaluable to the remote sensing and climate modeling communities for studies of annual global mean energy, meridional heat transport, consistent clouds and fluxes, and climate trends. Moreover, a broader audience interested in Earth's radiative properties, such as green energy, health and environmental companies, has shown interest in CERES derived products. A few years ago, the CERES team started developing a new web-based Ordering Tool tailored to this wide diversity of users. Recognizing the potential that web-2.0 technologies can offer to both Quality Control (QC) and scientific data visualization and manipulation, the CERES team began introducing a series of specialized functions that address these needs. Presented in an attractive, easy-to-use, modern web-based format, the Ordering Tool added the following analytical functions: i) 1-D histograms that display the distribution of a data field and identify outliers, useful for QC purposes; ii) an "Anomaly" map that shows the regional differences between the current month and the climatological monthly mean; iii) a 2-D histogram that can either identify potential problems with the data (i.e., a QC function) or provide a global view of trends and/or correlations between various CERES flux, cloud, aerosol, and atmospheric properties. The large volume and diversity of data, together with the on-the-fly execution, were the main challenges to be tackled. Depending on the application, the execution was done on either the browser side or the server side with the help of auxiliary files. Additional challenges came from the use of various open source applications, the multitude of CERES products and the seamless transition from previous development. For the future, we plan on expanding the analytical capabilities of the
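
    The "Anomaly" map function (item ii) reduces to a simple composite difference; here is a minimal sketch with synthetic data (not the CERES implementation):

        # Current month minus the climatological mean of that month.
        import numpy as np

        # flux[year, month, lat, lon]: 11 years of gridded monthly-mean TOA flux
        flux = np.random.default_rng(1).normal(240.0, 15.0, size=(11, 12, 180, 360))

        month = 6                                  # July, 0-based
        climatology = flux[:, month].mean(axis=0)  # multi-year mean for that month
        anomaly = flux[-1, month] - climatology    # regional differences, W/m^2
        print("max |anomaly|:", np.abs(anomaly).max().round(1), "W/m^2")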

  19. Analytical tools in accelerator physics

    Energy Technology Data Exchange (ETDEWEB)

    Litvinenko, V.N.

    2010-09-01

    This paper is a sub-set of my lectures presented in the Accelerator Physics course (USPAS, Santa Rosa, California, January 14-25, 2008). It is based on notes I wrote during the period from 1976 to 1979 in Novosibirsk. Only a few copies (in Russian) were distributed to my colleagues at the Novosibirsk Institute of Nuclear Physics. The goal of these notes is a complete description, starting from an arbitrary reference orbit, through explicit expressions for the 4-potential and the accelerator Hamiltonian, and finishing with the parameterization in action and angle variables. To a large degree the notes follow the logic developed in Theory of Cyclic Particle Accelerators by A. A. Kolomensky and A. N. Lebedev [Kolomensky], but go beyond the book in a number of directions. One unusual feature of these notes is the use of matrix functions and the Sylvester formula for calculating the matrices of arbitrary elements. Teaching the USPAS course motivated me to translate a significant part of my notes into English. I also included some introductory material following Classical Theory of Fields by L. D. Landau and E. M. Lifshitz [Landau]. A large number of short notes covering various techniques are placed in the Appendices.
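
    For reference, the Sylvester formula mentioned above expresses a function of a matrix with distinct eigenvalues through its spectral projectors (quoted here in its standard form as context, not from the paper):

        \[
        f(\mathbf{A}) \;=\; \sum_{i=1}^{n} f(\lambda_i)\,
        \prod_{j \neq i} \frac{\mathbf{A} - \lambda_j \mathbf{I}}{\lambda_i - \lambda_j},
        \]

    Applied to f(A) = exp(A s), it yields the transfer matrix of an accelerator element directly from the eigenvalues of the linearized equations of motion, which is how matrices of arbitrary elements can be computed.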

  20. Leningrad NPP full scope and analytical simulators as tools for MMI improvement and operator support systems development and testing

    International Nuclear Information System (INIS)

    The Training Support Center (TSC) created at the Leningrad NPP (LNPP), Sosnovy Bor, Russia, incorporates full-scope and analytical simulators working in parallel with prototypes of expert and interactive systems, providing a new scope of R and D MMI improvement work for the developer as well as for the user. The possibilities for developing, adjusting and testing any new or upgraded Operators' Support System before its installation in the reference unit's Control Room are described in the paper. These simulators ensure the modeling of a wide range of accidents and transients and provide, through special software and ETHERNET data links, communication with the prototypes of the Operators' Support Systems. As an example, the paper describes the development and adjustment of two state-of-the-art Operators' Support Systems using the simulators. These systems have been developed jointly by an RRC KI and LNPP team. (author)

  1. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy processes: cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with the options differentiated by the level of process or plant detail: (1) plant level; (2) process-group level; and (3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases established by reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011 and has been made available for free download from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as the companion documentation for use with the BEST-Dairy tool. In addition, we carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to the U.S. have downloaded BEST-Dairy from the LBNL website. It is expected that the use of BEST-Dairy tool will advance understanding of energy and
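
    At the plant level, the benchmarking arithmetic is simple; here is a hypothetical sketch (numbers and names invented, not BEST-Dairy outputs):

        # Compare a plant's energy intensity to a best-practice reference case.
        plant_energy_gj = 52_000       # annual site energy use, GJ
        production_t = 18_000          # annual fluid-milk production, tonnes

        intensity = plant_energy_gj / production_t   # GJ per tonne of product
        reference = 2.1                              # hypothetical reference case

        savings_gj = max(0.0, (intensity - reference) * production_t)
        print(f"intensity {intensity:.2f} GJ/t, "
              f"potential savings {savings_gj:,.0f} GJ/yr")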

  2. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

  3. Guidance for the Design and Adoption of Analytic Tools.

    Energy Technology Data Exchange (ETDEWEB)

    Bandlow, Alisa [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.

  4. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  5. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM 2.5) RESEARCH

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Charles G. Crawford

    2006-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-second month of development activities.

  6. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase

  7. ‘Slag_Fun’ – A New Tool for Archaeometallurgy: Development of an Analytical (P)ED-XRF Method for Iron-Rich Materials

    Directory of Open Access Journals (Sweden)

    Harald Alexander Veldhuijzen

    2003-11-01

    This paper describes the development of a new analytical tool for bulk chemical analysis of iron-rich archaeometallurgical remains by Polarising Energy Dispersive X-ray Fluorescence ((P)ED-XRF). Prompted by the ongoing archaeological and archaeometric analyses of early first millennium BC iron smelting and smithing finds from Tell Hammeh (az-Zarqa), Jordan, the creation of this tool has already benefited several studies on iron-rich slag of widely varying provenance as well as age (Anguilano 2002; Chirikure 2002; Ige and Rehren 2003; Stanway 2003). Following an explanation of the archaeological background and importance of the Hammeh finds, the paper describes the technical foundations of XRF analysis and the design, development and application of the "slag_fun" calibration method.

  8. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    Energy Technology Data Exchange (ETDEWEB)

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  9. DSAT: Data Storage and Analytics Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The aim of this project is the development of a large data warehousing and analysis tool for air traffic management (ATM) research that can be accessed by users...

  10. Analytical and Decision Support Tools for Genomics-Assisted Breeding

    OpenAIRE

    Varshney, Rajeev K.; Singh, Vikas K; Hickey, John M.; Xun, Xu; Marshall, David F; Wang, Jun; Edwards, David; Ribaut, Jean-Marcel

    2016-01-01

    To successfully implement genomics-assisted breeding (GAB) in crop improvement programs, efficient and effective analytical and decision support tools (ADSTs) are 'must haves' to evaluate and select plants for developing next-generation crops. Here we review the applications and deployment of appropriate ADSTs for GAB, in the context of next-generation sequencing (NGS), an emerging source of massive genomic information. We discuss suitable software tools and pipelines for marker-based approac...

  11. Internet promotion tools and techniques: analytical review

    OpenAIRE

    S.M. Illiashenko; T.Ye. Ivanova

    2015-01-01

    The aim of the article. The aim of the article is to analyse and systematize modern Internet marketing communication tools and to develop recommendations for managing them to promote products in a virtual environment while maintaining a high level of communication with economic partners and contact groups. The results of the analysis. A systematic analysis and systematization of the known Internet marketing tools was carried out. The authors divide them into 8 categories of th...

  12. Analytical Tools for Space Suit Design

    Science.gov (United States)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  13. Analytical tool development for coarse break-up of a molten jet in a deep water pool

    Energy Technology Data Exchange (ETDEWEB)

    Moriyama, Kiyofumi [Thermohydraulic Safety Research Group, Japan Atomic Energy Agency (JAEA), 2-4 Shirakata-shirane, Tokai-mura, Naka-gun, Ibaraki-ken 319-1195 (Japan)]. E-mail: moriyama.kiyofumi@jaea.go.jp; Nakamura, Hideo [Thermohydraulic Safety Research Group, Japan Atomic Energy Agency (JAEA), 2-4 Shirakata-shirane, Tokai-mura, Naka-gun, Ibaraki-ken 319-1195 (Japan); Maruyama, Yu [Thermohydraulic Safety Research Group, Japan Atomic Energy Agency (JAEA), 2-4 Shirakata-shirane, Tokai-mura, Naka-gun, Ibaraki-ken 319-1195 (Japan)

    2006-10-15

    A computer code, JASMINE-pre, was developed for the prediction of the premixing conditions of fuel-coolant interactions and of debris bed formation behavior relevant to severe accidents of light water reactors. In the JASMINE-pre code, a melt model consisting of three sub-models, for the melt jet, melt particles and melt pool, is coupled with a two-phase flow model derived from the ACE-3D code developed at JAERI. The melt jet and melt pool models are one-dimensional representations of a molten core stream falling into a water pool and of a continuous melt body agglomerated on the bottom, respectively. The melt particles generated by melt jet break-up are modeled with a Lagrangian grouped-particle concept. Additionally, a simplified model, pmjet, was developed which considers only steady-state break-up of the melt jet and the cooling and settlement of particles in a stationary water pool. The FARO corium quenching experiments, with a water pool at saturation temperature and with a subcooled water pool, were simulated with JASMINE-pre and pmjet. JASMINE-pre reproduced the pressurization and fragmentation behavior observed in the experiments with reasonable accuracy. The influences of model parameters on the pressurization and fragmentation were also examined. The calculation results showed a quasi-steady phase of melt jet break-up during which the amount of molten mass contained in the premixture remained almost constant, and the steady-state molten premixed masses evaluated by JASMINE-pre and pmjet agreed well.

  14. Medical text analytics tools for search and classification.

    Science.gov (United States)

    Huang, Jimmy; An, Aijun; Hu, Vivian; Tu, Karen

    2009-01-01

    A text-analytic tool has been developed that accepts clinical medical data as input in order to produce patient details. The integrated tool has the following four characteristics. 1) It has a graphical user interface. 2) It has a free-text search tool that is designed to retrieve records using keywords such as "MI" for myocardial infarction; the result set is a display of those sentences in the medical records that contain the keywords. 3) It has three tools to classify patients based on the likelihood of being diagnosed with myocardial infarction or hypertension, or on their smoking status. 4) A summary is generated for each patient selected. Large medical data sets provided by the Institute for Clinical Evaluative Sciences were used during the project.
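
    The free-text search behavior in point 2 can be sketched in a few lines (illustrative only, not the tool's implementation; the sample record is invented):

        # Return the sentences of a record that contain a query keyword.
        import re

        record = ("Patient admitted with chest pain. History of MI in 2005. "
                  "Blood pressure stable. Smoker, advised cessation.")

        def matching_sentences(text, keyword):
            sentences = re.split(r"(?<=[.!?])\s+", text)
            # Word-boundary match so "MI" does not match inside other words.
            pattern = re.compile(rf"\b{re.escape(keyword)}\b", re.IGNORECASE)
            return [s for s in sentences if pattern.search(s)]

        print(matching_sentences(record, "MI"))  # -> ['History of MI in 2005.']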

  15. Evaluation of Linked Data tools for Learning Analytics

    NARCIS (Netherlands)

    Hendrik, Drachsler; Herder, Eelco; d'Aquin, Mathieu; Dietze, Stefan

    2013-01-01

    Drachsler, H., Herder, E., d'Aquin, M., & Dietze, S. (2013, 8-12 April). Evaluation of Linked Data tools for Learning Analytics. Presentation given in the tutorial on 'Using Linked Data for Learning Analytics' at LAK2013, the Third Conference on Learning Analytics and Knowledge, Leuven, Belgium.

  16. Using Business Intelligence Tools for Predictive Analytics in Healthcare System

    OpenAIRE

    Mihaela-Laura IVAN; Mircea Raducu TRIFU; Manole VELICANU; Cristian CIUREA

    2016-01-01

    The scope of this article is to highlight how healthcare analytics can be improved using Business Intelligence tools. The healthcare system has learned from previous lessons the necessity of using healthcare analytics to improve patient care, hospital administration, population growth and many other aspects. The Business Intelligence solutions applied in the current analysis demonstrate the benefits brought by new tools such as SAP HANA, SAP Lumira, and SAP Predictive Analytics. In deta...

  17. Game development tool essentials

    CERN Document Server

    Berinstein, Paula; Ardolino, Alessandro; Franco, Simon; Herubel, Adrien; McCutchan, John; Nedelcu, Nicusor; Nitschke, Benjamin; Olmstead, Don; Robinet, Fabrice; Ronchi, Christian; Turkowski, Rita; Walter, Robert; Samour, Gustavo

    2014-01-01

    Offers game developers new techniques for streamlining the critical game tools pipeline. Inspires game developers to share their secrets and improve the productivity of the entire industry. Helps game industry practitioners compete in a hyper-competitive environment.

  18. Analytical Aerodynamic Simulation Tools for Vertical Axis Wind Turbines

    International Nuclear Information System (INIS)

    Wind power is a renewable energy source that is today the fastest growing solution to reduce CO2 emissions in the electric energy mix. The upwind, three-bladed horizontal axis wind turbine has been the preferred technical choice for more than two decades, and this horizontal axis concept today leads the market. The current PhD thesis covers an alternative type of wind turbine, with straight blades rotating about the vertical axis. A brief overview of the main differences between the horizontal and vertical axis concepts is given. However, the main focus of this thesis is the aerodynamics of the wind turbine blades. Making aerodynamically efficient turbines starts with efficient blades, and making efficient blades requires a good understanding of the physical phenomena and effective simulation tools to model them. The specific aerodynamics of straight-bladed vertical axis turbine flow are reviewed, together with the standard aerodynamic simulation tools that have been used in the past by blade and rotor designers. A reasonably fast (regarding computer power) and accurate (regarding comparison with experimental results) simulation method was still lacking in the field prior to the current work; this thesis aims at designing such a method. Analytical methods can be used to model complex flow if the geometry is simple. Therefore, a conformal mapping method is derived to transform any set of sections into a set of standard circles. Analytical procedures are then generalized to simulate moving multi-body sections in the complex vertical axis flows and the forces experienced by the blades. Finally, the fast semi-analytical aerodynamic algorithm, boosted by fast multipole methods to handle a high number of vortices, is coupled with a simple structural model of the rotor to investigate potential aeroelastic instabilities. Together with these advanced simulation tools, a standard double multiple streamtube model has been developed and used to design several straight bladed
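
    As a pointer to how the double multiple streamtube (DMST) model works (standard textbook form, not necessarily the thesis' exact formulation): the rotor is split into streamtubes, each crossed twice, by an upwind and a downwind half-disc. In each half, an induction factor a is found by iterating the actuator-disc momentum balance against the revolution-averaged blade element loading:

        \[
        4a(1-a) = C_T^{\text{blades}}(a), \qquad
        V_{u} = V_\infty (1 - a_u), \qquad
        V_{d} = V_\infty (1 - 2a_u)(1 - a_d),
        \]

    so the downwind half sees an inflow already decelerated by the upwind passage.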

  19. Web analytics tools and web metrics tools: An overview and comparative analysis

    OpenAIRE

    Ivan Bekavac; Daniela Garbin Praničević

    2015-01-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided in two sections. First, a qualitative focus is placed on reviewing web analytic...

  20. Using Business Intelligence Tools for Predictive Analytics in Healthcare System

    Directory of Open Access Journals (Sweden)

    Mihaela-Laura IVAN

    2016-05-01

    The scope of this article is to highlight how healthcare analytics can be improved using Business Intelligence tools. The healthcare system has learned from previous lessons the necessity of using healthcare analytics to improve patient care, hospital administration, population growth and many other aspects. The Business Intelligence solutions applied in the current analysis demonstrate the benefits brought by new tools such as SAP HANA, SAP Lumira, and SAP Predictive Analytics. Finally, the birth rate and the contribution of different factors worldwide are analyzed in detail.

  1. Electronic tongue: An analytical gustatory tool

    Directory of Open Access Journals (Sweden)

    Rewanthwar Swathi Latha

    2012-01-01

    Taste is an important organoleptic property governing the acceptance of products administered through the mouth, yet the majority of available drugs are bitter in taste. For patient acceptability and compliance, the bitter taste of drugs is masked by adding several flavoring agents; taste assessment is therefore an important quality control parameter for evaluating taste-masked formulations. The primary method for taste measurement of drug substances and formulations is by human panelists. The use of sensory panelists in industry is difficult and problematic due to the potential toxicity of drugs and the subjectivity of taste panelists; recruiting panelists is hard, and motivation and panel maintenance become significantly difficult when working with unpleasant products. Furthermore, Food and Drug Administration (FDA)-unapproved molecules cannot be tested. Therefore, an analytical taste-sensing multichannel sensory system, called the electronic tongue (e-tongue or artificial tongue), which can assess taste, has been replacing sensory panelists. The e-tongue thus offers benefits such as reduced reliance on a human panel. The present review focuses on the electrochemical concepts in instrumentation, performance qualification of the e-tongue, and applications in various fields.

  2. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    Science.gov (United States)

    Kempler, S. J.

    2015-12-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, even those sometimes applicable to Earth science, have targeted the business industry; in fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and which require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of the ESDA use cases, as well as provide early analysis of the data analytics tools/techniques requirements that would support specific ESDA goal types. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  3. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    Energy Technology Data Exchange (ETDEWEB)

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both integration of multiple data sources and development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and incorporated data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome as well as integrated security and role-based access-control for communicating
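
    The abstract names, but does not detail, a search for space-time data clusters. As a rough illustration of that idea only (the regions, counts, and mean-plus-three-sigma baseline rule below are invented, not taken from the project), one can flag region-day case counts that stand out against each region's trailing baseline:

        # Flag region-day counts exceeding a trailing baseline of mean + 3*sd.
        import numpy as np

        counts = np.array([              # rows: regions, columns: days (synthetic data)
            [2, 3, 2, 4, 3, 2, 3, 12],   # region 0 spikes on the last day
            [5, 4, 6, 5, 4, 5, 6, 5],
        ])
        window = 5                       # trailing baseline length in days
        for r, series in enumerate(counts):
            for t in range(window, len(series)):
                base = series[t - window:t]
                if series[t] > base.mean() + 3 * base.std():
                    print(f"possible outbreak: region {r}, day {t}, count {series[t]}")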

  4. FUMAC-84. A hybrid PCI analytical tool

    International Nuclear Information System (INIS)

    ''FUMAC-84'', a new computer code currently under development at Babcock and Wilcox, will be used to analyze PCMI in light water reactor fuel rods. This is a hybrid code in the sense that the pellet behaviour is predicted from deterministic models which incorporate the large data base being generated by the international fuel performance programs (OVERRAMP, SUPER-RAMP, NFIR, etc.), while the cladding is modelled using finite elements. The fuel cracking and relocation model developed for FUMAC is semi-empirical and includes data up to 35 GWd/mtU and linear heat rates ranging from 100 to 700 W/Cm. With this model the onset of cladding ridging has been accurately predicted for steady-state operation. Transient behaviour of the pellet is still under investigation and the model is being enhanced to include these effects. The cladding model integrates the mechanical damage over a power history by solving the finite element assumed displacement problem in a quasistatic manner. Early work on FUMAC-84 has been directed at the development and benchmarking of the interim code. The purpose of the interim code is to provide a vehicle to proof out the deterministic pellet models which have been developed. To date the cracking model and the relocation model have been benchmarked. The thermal model for the pellet was developed by fitting data from several Halden experiments. The ability to accurately predict cladding ridging behaviour has been used to test how well the pellet swelling, densification and compliance models work in conjunction with fuel cladding material models. Reasonable results have been achieved for the steady-state cases while difficulty has been encountered in trying to reproduce transient results. Current work includes an effort to improve the ability of the models to handle transients well. (author)

  5. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    The aim of the paper is to compare and analyze the impact of web analytics tools on measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. The first takes a qualitative focus, reviewing web analytics tools to explore their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst’s efforts in obtaining useful and relevant insights into market dynamics; generally speaking, selecting a web analytics and web metrics tool should therefore be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and presents output data resulting from a study of perceived user satisfaction with web analytics tools. The empirical study was carried out on employees of 200 Croatian firms, from either the IT or the marketing branch. The paper contributes by highlighting the support that the web analytics and web metrics tools available on the market can offer management, based on the growing need to understand and predict global market trends.

  6. Ultrafast 2D NMR: An Emerging Tool in Analytical Spectroscopy

    Science.gov (United States)

    Giraudeau, Patrick; Frydman, Lucio

    2014-06-01

    Two-dimensional nuclear magnetic resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan nature. A decade ago, a so-called ultrafast (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or heteronuclear correlation, in a single scan. During the intervening years, the performance of this subsecond 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool experiencing an expanded scope of applications. This review summarizes the principles and main developments that have contributed to the success of this approach and focuses on applications that have been recently demonstrated in various areas of analytical chemistry—from the real-time monitoring of chemical and biochemical processes, to extensions in hyphenated techniques and in quantitative applications.

  7. Network analytical tool for monitoring global food safety highlights China.

    Directory of Open Access Journals (Sweden)

    Tamás Nepusz

    BACKGROUND: The Beijing Declaration on food safety and security was signed by over fifty countries with the aim of developing comprehensive programs for monitoring food safety and security on behalf of their citizens. Currently, comprehensive systems for food safety and security are absent in many countries, and the systems that are in place have been developed on different principles, allowing poor opportunities for integration. METHODOLOGY/PRINCIPAL FINDINGS: We have developed a user-friendly analytical tool based on network approaches for instant, customized analysis of food alert patterns in the European dataset from the Rapid Alert System for Food and Feed. Data taken from alert logs between January 2003 and August 2008 were processed using network analysis to (i) capture complexity, (ii) analyze trends, and (iii) predict possible effects of interventions by identifying patterns of reporting activities between countries. The detector and transgressor relationships between countries are readily identifiable, and countries are ranked using (i) Google's PageRank algorithm and (ii) the HITS algorithm of Kleinberg. The program identifies Iran, China and Turkey as the transgressors with the largest number of alerts. However, when characterized by impact, counting the transgressor index and the number of countries involved, China predominates as a transgressor country. CONCLUSIONS/SIGNIFICANCE: This study reports the first development of a network analysis approach to inform countries of their transgressor and detector profiles as a user-friendly aid for the adoption of the Beijing Declaration. The ability to instantly access the country-specific components of the several thousand annual reports will enable each country to identify the major transgressors and detectors within its trading network. Moreover, the tool can be used to monitor trading countries for improved detector/transgressor ratios.
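
    The ranking step can be reproduced in outline with a standard graph library. In this sketch the alert edges are invented rather than RASFF data; each edge points from a detecting country to a transgressor, so HITS authority scores correspond to frequently reported transgressors:

        # Rank countries in a toy detector -> transgressor alert network.
        import networkx as nx

        alerts = [("DE", "CN"), ("UK", "CN"), ("NL", "IR"),
                  ("DE", "TR"), ("FR", "CN"), ("UK", "IR")]  # (detector, transgressor)
        G = nx.DiGraph(alerts)

        ranks = nx.pagerank(G)             # overall importance in the reporting network
        hubs, auths = nx.hits(G)           # hubs ~ detectors, authorities ~ transgressors
        print(sorted(auths, key=auths.get, reverse=True)[:3])  # most-reported transgressors first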

  8. An analytical framework and tool ('InteRa') for integrating the informal recycling sector in waste and resource management systems in developing countries.

    Science.gov (United States)

    Velis, Costas A; Wilson, David C; Rocca, Ondina; Smith, Stephen R; Mavropoulos, Antonis; Cheeseman, Chris R

    2012-09-01

    In low- and middle-income developing countries, the informal (collection and) recycling sector (here abbreviated IRS) is an important, but often unrecognised, part of a city's solid waste and resources management system. Recent evidence shows recycling rates of 20-30% achieved by IRS systems, reducing collection and disposal costs. They play a vital role in the value chain by reprocessing waste into secondary raw materials, providing a livelihood to around 0.5% of urban populations. However, persisting factual and perceived problems are associated with IRS (waste-picking): occupational and public health and safety (H&S), child labour, uncontrolled pollution, untaxed activities, crime and political collusion. Increasingly, incorporating IRS as a legitimate stakeholder and functional part of solid waste management (SWM) is attempted, further building recycling rates in an affordable way while also addressing the negatives. Based on a literature review and a practitioner's workshop, here we develop a systematic framework--or typology--for classifying and analysing possible interventions to promote the integration of IRS in a city's SWM system. Three primary interfaces are identified: between the IRS and the SWM system, the materials and value chain, and society as a whole; underlain by a fourth, which is focused on organisation and empowerment. To maximise the potential for success, IRS integration/inclusion/formalisation initiatives should consider all four categories in a balanced way and pay increased attention to their interdependencies, which are central to success, including specific actions, such as the IRS having access to source-separated waste. A novel rapid evaluation and visualisation tool is presented--integration radar (diagram) or InteRa--aimed at illustrating the degree to which a planned or existing intervention considers each of the four categories. The tool is further demonstrated by application to 10 cases around the world, including a step

  9. Learning analytics: drivers, developments and challenges

    OpenAIRE

    Ferguson, Rebecca

    2012-01-01

    Learning analytics is a significant area of technology-enhanced learning that has emerged during the last decade. This review of the field begins with an examination of the technological, educational and political factors that have driven the development of analytics in educational settings. It goes on to chart the emergence of learning analytics, including their origins in the 20th century, the development of data-driven analytics, the rise of learning-focused perspectives and the influence ...

  10. Big Data Analytics Tools as Applied to ATLAS Event Data

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration

    2016-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data and database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of big data, statistical and machine learning tools...

  11. Applications of Spatial Data Using Business Analytics Tools

    Directory of Open Access Journals (Sweden)

    Anca Ioana ANDREESCU

    2011-12-01

    Full Text Available This paper addresses the possibilities of using spatial data in business analytics tools, with emphasis on SAS software. Various kinds of map data sets containing spatial data are presented and discussed. Examples of map charts illustrating macroeconomic parameters demonstrate the application of spatial data for the creation of map charts in SAS Enterprise Guise. Extended features of map charts are being exemplified by producing charts via SAS programming procedures.

  12. Process analytical technology (PAT) tools for the cultivation step in biopharmaceutical production

    NARCIS (Netherlands)

    Streefland, M.; Martens, D.E.; Beuvery, E.C.; Wijffels, R.H.

    2013-01-01

    The process analytical technology (PAT) initiative is now 10 years old. This has resulted in the development of many tools and software packages dedicated to PAT application on pharmaceutical processes. However, most applications are restricted to small molecule drugs, mainly for the relatively simp

  13. 33 CFR 385.33 - Revisions to models and analytical tools.

    Science.gov (United States)

    2010-07-01

    Section 385.33, Navigation and Navigable Waters; Corps of Engineers, Department of the Army; subpart "Incorporating New Information Into the Plan". § 385.33, Revisions to models and analytical tools: "(a) In carrying ... and other analytical tools for conducting analyses for the planning, design, construction, ..."

  14. TNO monitoring plan development tool

    NARCIS (Netherlands)

    Sijacic, D.; Wildenborg, T.; Steeghs, P.

    2014-01-01

    TNO has developed a software tool that supports the design of a risk-based monitoring plan for a CO2 storage site. The purpose of the tool is to aid storage site operators by facilitating a structured process for selecting or evaluating monitoring technologies. The tool makes a selection of recommended monitoring technologies.

  15. Development and Verification of Behavior of Tritium Analytic Code (BOTANIC)

    Energy Technology Data Exchange (ETDEWEB)

    Park, Min Young; Kim, Eung Soo [Seoul National University, Seoul (Korea, Republic of)

    2014-10-15

    VHTR, one of the Generation IV reactor concepts, has a relatively high operating temperature and is usually suggested as a heat source for many industrial processes, including hydrogen production. It is therefore vital to trace tritium behavior in the VHTR system and the potential permeation rate into the industrial process; tritium is a crucial safety issue in this fission reactor system. It is thus necessary to understand the behavior of tritium, and the development of a tool that enables this is vital. In this study, the Behavior of Tritium Analytic Code (BOTANIC), an analytic tool capable of analyzing tritium behavior, was developed using the chemical process code gPROMS. BOTANIC was then verified against analytic solutions and benchmark codes such as the Tritium Permeation Analysis Code (TPAC) and COMSOL. The code has several distinctive features, including a non-diluted assumption, flexible application, and the adoption of a distributed permeation model. Owing to these features, BOTANIC can analyze systems over a wide range of tritium levels and achieves higher accuracy, as it is able to solve distributed models. The verification results showed very good agreement with the analytical solutions and with the calculation results of TPAC and COMSOL. Future work will focus on total system verification.
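
    BOTANIC itself is written in gPROMS and its models are not reproduced in the record. Purely to illustrate the kind of relation such permeation models solve, the sketch below evaluates a Richardson-type (diffusion-limited) tritium permeation flux through a metal wall; the permeability pre-factor, activation energy, pressures and wall thickness are placeholder values, not BOTANIC inputs:

        # Richardson permeation: J = (P0*exp(-E/RT)/d) * (sqrt(p_up) - sqrt(p_down)).
        import math

        P0 = 1.0e-7      # placeholder permeability pre-factor, mol/(m*s*Pa^0.5)
        E = 60_000.0     # placeholder activation energy, J/mol
        R = 8.314        # gas constant, J/(mol*K)
        T = 1173.0       # wall temperature, K (VHTR-like)
        d = 2.0e-3       # wall thickness, m
        p_up, p_down = 1.0e2, 1.0e-2   # tritium partial pressures, Pa

        perm = P0 * math.exp(-E / (R * T))                    # permeability at temperature T
        J = perm / d * (math.sqrt(p_up) - math.sqrt(p_down))  # mol T2 / (m^2 * s)
        print(f"permeation flux: {J:.3e} mol/(m^2*s)")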

  16. Web analytics as tool for improvement of website taxonomies

    DEFF Research Database (Denmark)

    Jonasen, Tanja Svarre; Ådland, Marit Kristine; Lykke, Marianne

    The poster examines how web analytics can be used to provide information about users and inform the design and redesign of taxonomies. It uses a case study of the website Cancer.dk of the Danish Cancer Society. The society is a private organization whose overall goals are to prevent the development of cancer, improve patients’ chances of recovery, and limit the physical, psychological and social side-effects of cancer. The website is the main channel for communication and knowledge sharing with patients, their relatives and professionals. The present study consists of two independent analyses, one using Google Analytics and focusing on searching and browsing activities, the other using a home-grown transaction log developed to collect data about tagging, searching and browsing by tags. The log is set up to distinguish between tags added by editors and by end-users. Altogether, the study

  17. New analytical tools combining gel electrophoresis and mass spectrometry

    OpenAIRE

    Tobolkina, Elena

    2014-01-01

    Proteomics has been one of the main projects challenging biological and analytical chemists for many years. The separation, identification and quantification of all the proteins expressed within biological systems remain the main objectives of proteomics. Due to sample complexity, this requires the development of fractionation, separation, purification and detection techniques that possess appropriate resolution to separate a large number of proteins, as well as being sensitive and fast enough for high-throughput analysis.

  18. Development of Nuclear Analytical Technology

    Energy Technology Data Exchange (ETDEWEB)

    Park, Yong Joon; Kim, J. Y.; Sohn, S. C. (and others)

    2007-06-15

    Pre-treatment and handling techniques for the micro-particles in swipe samples were developed for safeguards purposes. A screening technique for swipe samples was established using the nuclear fission track method as well as the alpha track method. A laser ablation system for extracting a nuclear particle from a swipe was designed and constructed for the determination of the enrichment factors of uranium or plutonium, and its performance was tested in atmosphere as well as in vacuum. Optimum conditions for the synthesis of silica-based micro-particles were obtained for mass production. The optimum ion exchange resin was selected and the optimum conditions for uranium adsorption in the resin bead technique were established for determining the enrichment factors of nuclear particles in swipes. The established technique was applied to swipes taken directly from a nuclear facility and to archive samples of the IAEA's environmental swipes. Dose rates of neutrons and secondary gamma-rays for the radiation shields were evaluated in designing the NIPS system, as was the thermal neutron concentration effect of various reflectors. A D-D neutron generator was introduced as the neutron source for the NIPS system because it offers advantages over a 252Cf source, such as easier control and moderation capability. Simulated samples of explosives and chemical warfare agents were prepared to construct a prompt gamma-ray database. Based on this database, a computer program for the detection of illicit chemical and nuclear materials was developed using the MATLAB software.

  19. Analytical tools for monitoring and control of fermentation processes

    OpenAIRE

    Sundström, Heléne

    2007-01-01

    The overall objective of this work has been to adopt new developments and techniques in the area of measurement, modelling and control of fermentation processes. Flow cytometry and software sensors are techniques which were considered ready for application and the focus was set on developing tools for research aiming at understanding the relationship between measured variables and process quality parameters. In this study fed-batch cultivations have been performed with two different strains o...

  20. Android development tools for Eclipse

    CERN Document Server

    Shah, Sanjay

    2013-01-01

    A standard tutorial aimed at developing Android applications in a practical manner. Android Development Tools for Eclipse is aimed at beginners and existing developers who want to learn more about Android development. It is assumed that you have experience in Java programming and that you have used an IDE for development.

  1. Distributed Engine Control Empirical/Analytical Verification Tools

    Science.gov (United States)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to assemble easily a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration to all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating DCS (distributed engine control systems) components onto existing and next-generation engines.The distributed engine control simulator blockset for MATLAB/Simulink and hardware simulator provides the capability to simulate virtual subcomponents, as well as swap actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the CMAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique

  2. Prioritizing pesticide compounds for analytical methods development

    Science.gov (United States)

    Norman, Julia E.; Kuivila, Kathryn M.; Nowell, Lisa H.

    2012-01-01

    The U.S. Geological Survey (USGS) has a periodic need to re-evaluate pesticide compounds in terms of priorities for inclusion in monitoring and studies and, thus, must also assess the current analytical capabilities for pesticide detection. To meet this need, a strategy has been developed to prioritize pesticides and degradates for analytical methods development. Screening procedures were developed to separately prioritize pesticide compounds in water and sediment. The procedures evaluate pesticide compounds in existing USGS analytical methods for water and sediment and compounds for which recent agricultural-use information was available. Measured occurrence (detection frequency and concentrations) in water and sediment, predicted concentrations in water and predicted likelihood of occurrence in sediment, potential toxicity to aquatic life or humans, and priorities of other agencies or organizations, regulatory or otherwise, were considered. Several existing strategies for prioritizing chemicals for various purposes were reviewed, including those that identify and prioritize persistent, bioaccumulative, and toxic compounds, and those that determine candidates for future regulation of drinking-water contaminants. The systematic procedures developed and used in this study rely on concepts common to many previously established strategies. The evaluation of pesticide compounds resulted in the classification of compounds into three groups: Tier 1 for high priority compounds, Tier 2 for moderate priority compounds, and Tier 3 for low priority compounds. For water, a total of 247 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods for monitoring and studies. Of these, about three-quarters are included in some USGS analytical method; however, many of these compounds are included on research methods that are expensive and for which there are few data on environmental samples. The remaining quarter of Tier 1
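
    The report's actual screening criteria are more detailed than the abstract conveys; the following schematic shows only how such a tiering rule can be encoded, with invented attribute names and cutoffs:

        # Toy tier assignment from screening attributes (thresholds illustrative only).
        def assign_tier(detection_freq, toxicity_quotient, regulatory_priority):
            """Return 1 (high), 2 (moderate), or 3 (low) priority."""
            if detection_freq > 0.10 or toxicity_quotient > 1.0 or regulatory_priority:
                return 1
            if detection_freq > 0.01 or toxicity_quotient > 0.1:
                return 2
            return 3

        print(assign_tier(0.25, 0.4, False))   # -> 1: frequently detected
        print(assign_tier(0.02, 0.05, False))  # -> 2: moderate occurrence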

  3. Turbine Aerodynamics Design Tool Development

    Science.gov (United States)

    Huber, Frank W.; Turner, James E. (Technical Monitor)

    2001-01-01

    This paper presents the Marshall Space Flight Center Fluids Workshop on turbine aerodynamic design tool development. The topics include: (1) meanline design/off-design analysis; and (2) airfoil contour generation and analysis. This paper is in viewgraph form.

  4. Investigating Analytic Tools for e-Book Design in Early Literacy Learning

    Science.gov (United States)

    Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah

    2009-01-01

    Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…

  5. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    For the past 25 years the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private-sector businesses to maximize the value of masses of information, discovering and disseminating actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts, thereby safeguarding communities, organizations, infrastructures, and investments. The collaborative intelligence analysis environment delivered by i2 is specifically designed to be: scalable, supporting business needs as well as operational and end-user environments; modular, with an architecture that can deliver maximum operational flexibility and the ability to add complementary analytics; interoperable, integrating with existing environments and easing information sharing across partner agencies; and extendable, providing an open-source developer essential toolkit, examples, and documentation for custom requirements. i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry-leading multidimensional analytics that can be run on demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster informed decision making. (author)

  6. Foresight, Competitive Intelligence and Business AnalyticsTools for Making Industrial Programmes More Efficient

    OpenAIRE

    Jonathan, Calof; Gregory, Richards; Jack, Smith

    2015-01-01

    Creating industrial programmes, especially in technology, is fraught with high levels of uncertainty. These programmes target the development of products that will not be sold for several years; therefore, one of the risks is that the products will no longer be in demand due to the emergence of more advanced technologies. The paper proposes an integrated approach involving the complementary functions of foresight, intelligence and business analytics. The tools of foresight and intelligence ar...

  7. Ultrasonic trap as analytical tool; Die Ultraschallfalle als analytisches Werkzeug

    Energy Technology Data Exchange (ETDEWEB)

    Leiterer, Jork

    2009-11-30

    nanoparticles. This comprises fields of research like biomineralisation, protein agglomeration, distance-dependent effects of nanocrystalline quantum dots and the in situ observation of early crystallization stages. In summary, the results of this work open a broad area of application for the ultrasonic trap as an analytical tool. [Translated from German] The ultrasonic trap offers a unique way of handling samples on the microliter scale. Acoustic levitation positions the sample contact-free in a gaseous environment, removing the influence of solid surfaces. In this work, the possibilities of the ultrasonic trap for use in analytical chemistry are investigated experimentally. By coupling it with typical contactless analysis methods such as spectroscopy and X-ray scattering, the advantages of this levitation technique are demonstrated on materials ranging from inorganic, organic and pharmaceutical substances to proteins, nanoparticles and microparticles. It is shown that acoustic levitation reliably enables contact-free sample handling for spectroscopic methods (LIF, Raman) and, for the first time, for X-ray scattering (EDXD, SAXS, WAXS) and X-ray fluorescence (XRF, XANES) methods. For all of these methods the wall-free sample mounting proved advantageous: the results are comparable to those obtained with conventional sample holders and in some cases surpass them in data quality. A particular success is the integration of the acoustic levitator into the experimental set-ups of the synchrotron beamlines; its use at BESSY was established within this work and currently forms the basis of intensive interdisciplinary research. In addition, the trap's potential for preconcentration was recognized and applied to the study of evaporation-controlled processes.

  8. Neutron activation analysis as analytical tool of environmental issue

    International Nuclear Information System (INIS)

    Neutron activation analysis (NAA) is applicable to samples from a wide range of research fields, such as material science, biology and geochemistry. Given the advantages of NAA, small or precious samples are the most suitable, because NAA is capable of trace analysis and non-destructive determination. In this paper, NAA of atmospheric particulate matter (PM) samples is discussed, emphasizing the use of the obtained data as an analytical tool for environmental issues. The concentration of PM in air is usually very low, and it is not easy to obtain a large amount of sample even using a high-volume air sampling device; highly sensitive NAA is therefore well suited to determining elements in PM samples. The main components of PM in rural and remote areas are crust-derived silicates and the like, whereas carbonaceous materials and heavy metals are concentrated in urban PM because of automobile exhaust and other anthropogenic emission sources. The elemental pattern of PM reflects the condition of the air around the monitoring site. Trends in air pollution can be traced by periodic monitoring of PM by NAA. Elemental concentrations in air change with season: for example, crustal elements increase in the dry season, and sea-salt components increase when wind from the sea is dominant. Elements emitted from anthropogenic sources are mainly contained in the fine fraction of PM and increase in concentration during winter, when emission from heating systems is high and the air is stable. For further analysis and understanding of environmental issues, indicator elements for various emission sources, elemental concentration ratios of environmental samples and source apportionment techniques are useful. (author)
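
    Source-apportionment work of this kind often relies on crustal enrichment factors, EF(X) = (X/Al)sample / (X/Al)crust, where values far above 1 suggest an anthropogenic contribution. A small sketch of this standard calculation, with rounded literature values for crustal abundances and invented PM concentrations:

        # Crustal enrichment factors with Al as the reference element.
        crust = {"Al": 8.0e4, "Fe": 5.0e4, "Zn": 70.0, "Pb": 13.0}    # approx. crustal ppm
        sample = {"Al": 1200.0, "Fe": 900.0, "Zn": 85.0, "Pb": 40.0}  # invented PM ng/m^3

        def enrichment_factor(element):
            return (sample[element] / sample["Al"]) / (crust[element] / crust["Al"])

        for el in ("Fe", "Zn", "Pb"):
            print(f"EF({el}) = {enrichment_factor(el):.1f}")  # EF >> 1 hints at non-crustal sources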

  9. Urban Development Tools in Denmark

    DEFF Research Database (Denmark)

    Aunsborg, Christian; Enemark, Stig; Sørensen, Michael Tophøj

    2005-01-01

    The article contains the following sections: 1. Urbax and the Danish Planning System; 2. Main Challenges in Urban Development; 3. Coordination and Growth (Management) Policies and Spatial Planning Policies; 4. Coordination of Market Events and Spatial Planning; 5. The Application of Urban Development Tools...

  10. Developing ICALL Tools Using GATE

    Science.gov (United States)

    Wood, Peter

    2008-01-01

    This article discusses the use of the General Architecture for Text Engineering (GATE) as a tool for the development of ICALL and NLP applications. It outlines a paradigm shift in software development, which is mainly influenced by projects such as the Free Software Foundation. It looks at standards that have been proposed to facilitate the…

  11. Aspects of recent developments in analytical chemometrics

    Institute of Scientific and Technical Information of China (English)

    LIANG; Yizeng; WU; Hailong; SHEN; Guoli; JIANG; Jianhui; LIANG; Sheng

    2006-01-01

    Some aspects of recent developments in analytical chemometrics are discussed, in particular developments viewed from the angle of the research efforts undertaken in the authors' laboratories. The topics concerned include the resolution of high-order chemical data, morphological theory and methodology for chemical signal processing, multivariate calibration and chemical pattern recognition for solving complex chemical problems, and the resolution of two-way chemical data from hyphenated chromatographic instruments.

  12. Developing variations: An analytical and historical perspective

    OpenAIRE

    Sirman, Berk

    2006-01-01

    ABSTRACT Berk Sirman: Developing Variations – An Analytical and Historical Perspective. Uppsala Universitet: Institutionen för musikvetenskap, uppsats för 60 p., 2006. Developing variations is a term coined by Arnold Schönberg to describe the constant modification of motives and ideas in a theme, or possibly throughout a whole work; this is thought to be superior to exact repetition. Schönberg used developing variations to analyze the music of Brahms, whose compositions represen

  13. Microplasmas for chemical analysis: analytical tools or research toys?

    International Nuclear Information System (INIS)

    An overview is presented of the activities of the research groups that have been involved in the fabrication, development and characterization of microplasmas for chemical analysis over the last few years. Microplasmas covered include: miniature inductively coupled plasmas (ICPs); capacitively coupled plasmas (CCPs); microwave-induced plasmas (MIPs); a dielectric barrier discharge (DBD); microhollow cathode discharge (MHCD) or microstructure electrode (MSE) discharges; other microglow discharges (such as those formed between 'liquid' electrodes); microplasmas formed in micrometer-diameter capillary tubes for gas chromatography (GC) or high-performance liquid chromatography (HPLC) applications; and a stabilized capacitive plasma (SCP) for GC applications. Sample introduction into microplasmas, in particular into a microplasma device (MPD), battery operation of an MPD and of a mini-in-torch vaporization (ITV) microsample introduction system for MPDs, and questions of microplasma portability for use on site (e.g., in the field) are also briefly addressed using examples of current research. To emphasize the significance of sample introduction into microplasmas, some previously unpublished results from the author's laboratory have also been included, and an overall assessment of the state of the art of analytical microplasma research is provided.

  14. Employability Skills Assessment Tool Development

    Science.gov (United States)

    Rasul, Mohamad Sattar; Rauf, Rose Amnah Abd; Mansor, Azlin Norhaini; Puvanasvaran, A. P.

    2012-01-01

    Research nationally and internationally has found that technical graduates are lacking in employability skills. As employability skills are crucial in outcome-based education, the main goal of this research is to develop an Employability Skill Assessment Tool to help students and lecturers produce graduates competent in the employability skills needed by…

  15. Factors Influencing Attitudes Towards the Use of CRM’s Analytical Tools in Organizations

    Directory of Open Access Journals (Sweden)

    Šebjan Urban

    2016-02-01

    Background and Purpose: Information solutions for analytical customer relationship management (aCRM IS) that include the use of analytical tools are becoming increasingly important, owing to organizations' need for knowledge of their customers and the ability to manage big data. The objective of the research is therefore to determine how organizational orientations (process, innovation, and technology), as critical organizational factors, affect attitudes towards the use of the analytical tools of aCRM IS.

  16. Selecting analytical tools for characterization of polymersomes in aqueous solution

    DEFF Research Database (Denmark)

    Habel, Joachim Erich Otto; Ogbonna, Anayo; Larsen, Nanna;

    2015-01-01

    Selecting appropriate analytical methods for characterizing the assembly and morphology of polymer-based vesicles, or polymersomes, is required for them to reach their full potential in biotechnology. This work presents and compares 17 different techniques for their ability to adequately report size, ...

  17. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    Science.gov (United States)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  18. A results-based process for evaluation of diverse visual analytics tools

    Science.gov (United States)

    Rubin, Gary; Berger, David H.

    2013-05-01

    With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.
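
    Of the metrics listed, the rank-based probability-of-detection curve is the easiest to sketch: sort detections by confidence and report the fraction of true targets found within the top k ranks. The detections and ground-truth count below are synthetic; this is not the authors' scoring code:

        # Rank-based probability of detection from confidence-sorted detections.
        detections = [          # (confidence, is_true_target) - synthetic results
            (0.95, True), (0.90, True), (0.80, False),
            (0.75, True), (0.60, False), (0.40, True),
        ]
        n_targets = 5           # true targets in ground truth (one is never detected)

        detections.sort(key=lambda d: d[0], reverse=True)
        hits = 0
        for rank, (conf, is_true) in enumerate(detections, start=1):
            hits += is_true
            print(f"rank {rank}: P_d = {hits / n_targets:.2f}")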

  19. Revolutions in Neuroscience: Tool Development

    Directory of Open Access Journals (Sweden)

    John Bickle

    2016-03-01

    Thomas Kuhn’s famous model of the components and dynamics of scientific revolutions is still dominant to this day across science, philosophy, and history. The guiding philosophical theme of this paper is that, concerning actual revolutions in neuroscience over the past sixty years, Kuhn’s account is wrong. There have been revolutions, and new ones are brewing, but they do not turn on competing paradigms, anomalies, or the like. Instead, they turn exclusively on the development of new experimental tools. I adopt a metascientific approach and examine in detail the development of two recent neuroscience revolutions: the impact of engineered genetically mutated mammals in the search for causal mechanisms of higher cognitive functions; and the more recent impact of optogenetics (and DREADDs). The two key metascientific concepts I derive from these case studies are a revolutionary new tool’s motivating problem, and its initial and second-phase hook experiments. These concepts hardly exhaust a detailed metascience of tool development experiments in neuroscience, but they get that project off to a useful start and distinguish the subsequent account of neuroscience revolutions clearly from Kuhn’s famous model. I close with a brief remark about the general importance of molecular biology for a current philosophical understanding of science, as comparable to the place physics occupied when Kuhn formulated his famous theory of scientific revolutions.

  1. Revolutions in Neuroscience: Tool Development

    Science.gov (United States)

    Bickle, John

    2016-01-01

    Thomas Kuhn’s famous model of the components and dynamics of scientific revolutions is still dominant to this day across science, philosophy, and history. The guiding philosophical theme of this article is that, concerning actual revolutions in neuroscience over the past 60 years, Kuhn’s account is wrong. There have been revolutions, and new ones are brewing, but they do not turn on competing paradigms, anomalies, or the like. Instead, they turn exclusively on the development of new experimental tools. I adopt a metascientific approach and examine in detail the development of two recent neuroscience revolutions: the impact of engineered genetically mutated mammals in the search for causal mechanisms of “higher” cognitive functions; and the more recent impact of optogenetics and designer receptors exclusively activated by designer drugs (DREADDs). The two key metascientific concepts I derive from these case studies are a revolutionary new tool’s motivating problem, and its initial and second-phase hook experiments. These concepts hardly exhaust a detailed metascience of tool development experiments in neuroscience, but they get that project off to a useful start and distinguish the subsequent account of neuroscience revolutions clearly from Kuhn’s famous model. I close with a brief remark about the general importance of molecular biology for a current philosophical understanding of science, as comparable to the place physics occupied when Kuhn formulated his famous theory of scientific revolutions. PMID:27013992

  2. DEVELOPING NEW TOOLS FOR POLICY ANALYSIS

    International Nuclear Information System (INIS)

    For the past three years, the Office of Security Policy has been aggressively pursuing substantial improvements in the U.S. Department of Energy (DOE) regulations and directives related to safeguards and security (S and S). An initial effort focused on areas where specific improvements could be made, and was completed during 2009 with the publication of a number of revised manuals. Developing these revisions involved more than 100 experts in the disciplines concerned, yet the changes made were only those that could be identified and agreed upon largely on the basis of expert opinion. The next phase of changes will be more analytically based. A thorough review of the entire S and S directives set will be conducted using software tools to analyze the present directives with a view toward (1) identifying areas of positive synergism among topical areas, (2) identifying areas of unnecessary duplication within and among topical areas, and (3) identifying requirements that are less than effective in achieving the intended protection goals. This paper will describe the software tools available and in development that will be used in this effort. Some examples of the output of the tools will be included, as will a short discussion of the follow-on analysis that will be performed when these outputs are available to policy analysts.

  3. Analytic Tools for Evaluating Variability of Standard Errors in Large-Scale Establishment Surveys

    Directory of Open Access Journals (Sweden)

    Cho MoonJung

    2014-12-01

    Large-scale establishment surveys often exhibit substantial temporal or cross-sectional variability in their published standard errors. This article uses a framework defined by survey generalized variance functions to develop three sets of analytic tools for the evaluation of these patterns of variability. These tools are for (1) identification of predictor variables that explain some of the observed temporal and cross-sectional variability in published standard errors; (2) evaluation of the proportion of variability attributable to the abovementioned predictors, equation error and estimation error, respectively; and (3) comparison of equation error variances across groups defined by observable predictor variables. The primary ideas are motivated and illustrated by an application to the U.S. Current Employment Statistics program.
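
    A common generalized variance function takes the form CV^2(X) = a + b/X, relating an estimate X to its squared coefficient of variation. The sketch below fits that form by least squares to made-up estimates and standard errors; it reproduces the idea, not the article's models:

        # Fit a simple GVF, relvar = a + b/estimate, by ordinary least squares.
        import numpy as np

        est = np.array([2.0e3, 8.0e3, 2.5e4, 9.0e4, 3.0e5])   # published estimates (synthetic)
        se = np.array([220.0, 520.0, 800.0, 1500.0, 2600.0])  # their standard errors (synthetic)

        relvar = (se / est) ** 2                   # squared coefficient of variation
        A = np.column_stack([np.ones_like(est), 1.0 / est])
        (a, b), *_ = np.linalg.lstsq(A, relvar, rcond=None)
        print(f"fitted GVF: relvar = {a:.2e} + {b:.2e}/X")
        print(np.sqrt(a + b / est) * est)          # smoothed standard errors from the GVF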

  4. Application of quantum dots as analytical tools in automated chemical analysis: A review

    Energy Technology Data Exchange (ETDEWEB)

    Frigerio, Christian; Ribeiro, David S.M.; Rodrigues, S. Sofia M.; Abreu, Vera L.R.G.; Barbosa, Joao A.C.; Prior, Joao A.V.; Marques, Karine L. [REQUIMTE, Laboratory of Applied Chemistry, Department of Chemical Sciences, Faculty of Pharmacy of Porto University, Rua Jorge Viterbo Ferreira, 228, 4050-313 Porto (Portugal); Santos, Joao L.M., E-mail: joaolms@ff.up.pt [REQUIMTE, Laboratory of Applied Chemistry, Department of Chemical Sciences, Faculty of Pharmacy of Porto University, Rua Jorge Viterbo Ferreira, 228, 4050-313 Porto (Portugal)

    2012-07-20

    Highlights: review of quantum dot applications in automated chemical analysis; automation using flow-based techniques; quantum dots in liquid chromatography and capillary electrophoresis; detection by fluorescence and chemiluminescence; electrochemiluminescence and radical generation. Abstract: Colloidal semiconductor nanocrystals, or quantum dots (QDs), are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding important new fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis and food quality control. Despite the enormous variety of applications that have been developed, the automation of QD-based analytical methodologies with tools such as continuous flow analysis and related techniques has hitherto been very limited, even though automation would allow particular features of the nanocrystals to be exploited: their versatile surface chemistry and ligand-binding ability, their aptitude to generate reactive species, the possibility of encapsulation in different materials while retaining native luminescence (providing the means for implementing renewable chemosensors), and even the use of more drastic, stability-impairing reaction conditions. In this review, we provide insights into the analytical potential of quantum dots, focusing on prospects for their use in automated flow-based and flow-related approaches and the future outlook of QD applications in chemical analysis.

  5. An Analytical Study of Tools and Techniques for Movie Marketing

    Directory of Open Access Journals (Sweden)

    Garima Maik

    2014-08-01

    Bollywood, the Hindi movie industry, is one of the fastest growing sectors in the media and entertainment space, creating numerous business and employment opportunities. Movies in India are a major source of entertainment for all sections of society. They face competition not only from other movie industries and movies but also from other sources of entertainment such as adventure sports, amusement parks, theatre and drama, and pubs and discothèques. A lot of man power, man hours, creative brains, and money are put in to build a quality feature film, so it is important for the movie and production team to stand out and grab the attention of the maximum audience. Movie makers today employ a wide range of marketing tools and leave no stone unturned: teasers, first looks, theatrical trailer releases, music launches, city tours, producer and director interviews, movie premieres, the release itself, and post-release follow-up, all to pull viewers to the cineplex. Today's audience, which comprises mainly youth, wants photos, videos, meet-ups, gossip, debate, collaboration and content creation, and these needs are best met through digital platforms. However, traditional media like newspapers, radio, and television are not old school: they reach a mass audience and play a major role in effective marketing. This study aims to analyse these tools for their effectiveness. The objectives are fulfilled through a consumer survey, bringing out the effectiveness and relative importance of the various tools employed by movie marketers to generate maximum returns on investment, using data reduction techniques like factor analysis and statistical techniques like the chi-square test, with data visualization using pie charts.
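
    Of the statistical techniques mentioned, the chi-square test of association is simple to illustrate; the survey counts below are invented, not the study's data:

        # Chi-square test: is the preferred marketing tool associated with age group?
        from scipy.stats import chi2_contingency

        #               trailers  social media  city tours
        observed = [[40, 70, 10],   # respondents aged 18-25 (invented counts)
                    [50, 30, 20]]   # respondents aged 26-40

        chi2, p, dof, expected = chi2_contingency(observed)
        print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")  # small p -> association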

  6. Social media analytics: a survey of techniques, tools and platforms

    OpenAIRE

    Batrinca, B.; Treleaven, P. C.

    2015-01-01

    This paper is written for (social science) researchers seeking to analyze the wealth of social media now available. It presents a comprehensive review of software tools for social networking media, wikis, really simple syndication feeds, blogs, newsgroups, chat and news feeds. For completeness, it also includes introductions to social media scraping, storage, data cleaning and sentiment analysis. Although principally a review, the paper also provides a methodology and a critique of social med...

  7. Information and Analytic Maintenance of Nanoindustry Development

    Directory of Open Access Journals (Sweden)

    Glushchenko Aleksandra Vasilyevna

    2015-05-01

    The successful course of nanotechnological development depends in many respects on the volume and quality of information provided to external and internal users. The objective of the present research is to reveal the information requirements of various groups of users for effective management of the nanotech industry and to define the ways these requirements can be satisfied most effectively. The authors also aim at developing a system of indicators characterizing the current state and the dynamic parameters of nanotech industry development. On the basis of the research conducted, the need for an information system for nanotech industry development is demonstrated. The information interrelations among nanotech industry actors are revealed, supporting the development of the communicative function of accounting, which is becoming dominant in comparison with the control function. The information needs of users of financial and non-financial information are defined. The stages of introduction are described in detail, from determining the character, volume, scope and required timeliness of information to creating a system of management reporting, analysis and control. The information and analytical system is focused on the overall assessment of efficiency and the major economic indicators, the general tendencies of nanotech industry development, and possible reserves for increasing the efficiency of its functioning. The authors develop a system of indicators characterizing the advancement of the nanotech industry and allowing innovative activity in the sphere of nanotech to be estimated, the intensity of nano-innovation costs to be calculated, and the productivity and efficiency of the nanotech industry to be determined for a branch, a region, or the national economy as a whole.

  8. Median of patient results as a tool for assessment of analytical stability

    DEFF Research Database (Denmark)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft;

    2015-01-01

    BACKGROUND: In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. METHOD......: Patient data from daily internal control schemes were used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable...... analytical bias based on biological variation. RESULTS: Seventy-five percent of the twenty analytes run on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. DISCUSSION: Patient results applied in analytical quality performance...
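
    As a hedged sketch of the method described above (not the authors' software), the following computes monthly medians of patient results and flags months whose deviation from a long-term target exceeds an allowable-bias specification. The column names, target value, and bias limit are illustrative assumptions.

        import pandas as pd

        def monthly_median_drift(results, target, allowable_bias_pct):
            """Flag months whose median patient result deviates from the
            long-term target by more than the allowable analytical bias."""
            monthly = (results.set_index("date")["value"]
                       .resample("M").median().to_frame("median"))
            monthly["bias_pct"] = 100 * (monthly["median"] - target) / target
            monthly["flag"] = monthly["bias_pct"].abs() > allowable_bias_pct
            return monthly

        # Hypothetical sodium results; the 136.0 mmol/L target and 0.3%
        # desirable bias specification are assumptions for illustration.
        df = pd.DataFrame({
            "date": pd.to_datetime(["2015-01-05", "2015-01-20", "2015-02-10"]),
            "value": [137.0, 136.5, 139.2],
        })
        print(monthly_median_drift(df, target=136.0, allowable_bias_pct=0.3))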

  9. Analytics to Literacies: The Development of a Learning Analytics Framework for Multiliteracies Assessment

    Directory of Open Access Journals (Sweden)

    Shane Dawson

    2014-09-01

    Full Text Available The rapid advances in information and communication technologies, coupled with increased access to information and the formation of global communities, have resulted in interest among researchers and academics in revising educational practice to move beyond traditional ‘literacy’ skills towards an enhanced set of “multiliteracies” or “new media literacies”. Measuring the literacy of a population, in the light of its linkage to individual and community wealth and wellbeing, is essential to determining the impact of compulsory education. The opportunity now is to develop tools to assess individual and societal attainment of these new literacies. Drawing on the work of Jenkins and colleagues (2006) and notions of a participatory culture, this paper proposes a conceptual framework for how learning analytics can assist in measuring individual achievement of multiliteracies and how this evaluative process can be scaled to provide an institutional perspective of the educational progress in fostering these fundamental skills.

  10. Laser: a Tool for Optimization and Enhancement of Analytical Methods

    Energy Technology Data Exchange (ETDEWEB)

    Preisler, Jan

    1997-01-01

    In this work, we use lasers to enhance the possibilities of laser desorption methods and to optimize the coating procedure for capillary electrophoresis (CE). We use several different instrumental arrangements to characterize matrix-assisted laser desorption (MALD) at atmospheric pressure and in vacuum. In imaging mode, a 488-nm argon-ion laser beam is deflected by two acousto-optic deflectors to scan, via absorption, plumes desorbed at atmospheric pressure. All absorbing species, including neutral molecules, are monitored. Interesting features are observed, e.g. differences between the initial plume and subsequent plumes desorbed from the same spot, or the formation of two plumes from one laser shot. Total plume absorbance can be correlated with the acoustic signal generated by the desorption event. A model equation for the plume velocity as a function of time is proposed. Alternatively, the use of a static laser beam for observation enables reliable determination of plume velocities even when they are very high. Static scattering detection reveals the negative influence of particle spallation on the MS signal. Ion formation during MALD was monitored using 193-nm light to photodissociate a portion of the insulin ion plume. These results define the optimal conditions for desorbing analytes from matrices, as opposed to achieving a compromise between efficient desorption and efficient ionization as is practiced in mass spectrometry. In the CE experiment, we examined changes in a poly(ethylene oxide) (PEO) coating by continuously monitoring the electroosmotic flow (EOF) in a fused-silica capillary during electrophoresis. An imaging CCD camera was used to follow the motion of a fluorescent neutral marker zone, excited by a 488-nm Ar-ion laser, along the length of the capillary. The PEO coating was shown to reduce the velocity of EOF by more than an order of magnitude compared to a bare capillary at pH 7.0. The coating protocol was important, especially at an intermediate pH of 7.7. The increase of p

  11. Horses for courses: analytical tools to explore planetary boundaries

    Science.gov (United States)

    van Vuuren, D. P.; Lucas, P. L.; Häyhä, T.; Cornell, S. E.; Stafford-Smith, M.

    2015-09-01

    There is a need for further integrated research on developing a set of sustainable development objectives, based on the proposed framework of planetary boundaries indicators. The relevant research questions are divided in this paper into four key categories, relating to (1) the underlying processes and the selection of key indicators, (2) understanding the impacts of different exposure levels and the influence of connections between different types of impacts, (3) a better understanding of different response strategies, and (4) the available options to implement changes. Clearly, different categories of scientific disciplines and associated models exist that can contribute to the necessary analysis, noting that the distinctions between them are fuzzy. In the paper, we indicate both how different models relate to the four categories of questions and how further insights can be obtained by connecting the different disciplines (without necessarily fully integrating them). Research on integration can support planetary boundary quantification in a credible way, linking human drivers and social and biophysical impacts.

  12. An analytic model for tool trajectory error in 5-axis machining

    Directory of Open Access Journals (Sweden)

    B.S. So

    2008-12-01

    Full Text Available Purpose: This paper proposes an analytical method of evaluating the maximum error by modeling the exact toolpath when the tool traverses a singular region in five-axis machining. Design/methodology/approach: It is known that the Numerical Control (NC) data obtained from the inverse kinematic transformation can generate singular positions, which involve incoherent movements on the rotary axes. Such movements cause unexpected errors and abrupt operations, resulting in scoring on the machined surface. To resolve this problem, previous methods have calculated several tool positions during a singular operation, using inverse kinematic equations to predict the tool trajectory and approximate the maximum error. This type of numerical approach, configuring the tool trajectory, requires a lot of computational time to obtain a sufficient number of tool positions in the singular region. We have derived an analytical equation for the tool trajectory in the singular area by modeling the tool operation, considering linear and nonlinear parts that form a general model of the tool trajectory in the singular area and that are suitable for all types of five-axis machine tools. In addition, evaluation of the maximum tool-path error shows high accuracy using our analytical model. Findings: In this study, we have separated the linear components of the tool trajectory from the nonlinear ones, to propose a tool trajectory model that is applicable to any kind of five-axis machine. We have also proposed a method to calculate the maximum deviation error based on the proposed tool trajectory model. Practical implications: The algorithms proposed in this work can be used for evaluating NC data and for linearization of NC data with singularity. Originality/value: Our algorithm can be used to modify NC data, making the operation smoother and reducing any errors to within tolerance.
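
    The paper derives a closed-form trajectory model; as a purely numerical illustration of the underlying phenomenon (not the authors' analytical model), the sketch below compares linear interpolation of the rotary axes with linear interpolation of the tool-axis vector near a singular posture and reports the maximum orientation deviation. The machine type (table-tilting A/C) and all angles are assumptions.

        import numpy as np

        def orientation(a_deg, c_deg):
            """Unit tool-axis vector for tilt angle A and rotation C (degrees)."""
            a, c = np.radians([a_deg, c_deg])
            return np.array([np.sin(a) * np.sin(c), np.sin(a) * np.cos(c), np.cos(a)])

        # Near A = 0 the C axis is singular, so C may sweep a wide arc.
        a0, c0, a1, c1 = 2.0, 10.0, 2.0, 170.0

        t = np.linspace(0.0, 1.0, 201)
        rotary = np.array([orientation(a0 + s*(a1 - a0), c0 + s*(c1 - c0)) for s in t])
        linear = np.array([(1 - s)*orientation(a0, c0) + s*orientation(a1, c1) for s in t])
        linear /= np.linalg.norm(linear, axis=1, keepdims=True)

        cosang = np.clip(np.sum(rotary * linear, axis=1), -1.0, 1.0)
        print(f"max orientation deviation: {np.degrees(np.arccos(cosang)).max():.3f} deg")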

  13. Monitoring automotive oil degradation: analytical tools and onboard sensing technologies.

    Science.gov (United States)

    Mujahid, Adnan; Dickert, Franz L

    2012-09-01

    Engine oil experiences a number of thermal and oxidative phases that yield acidic products in the matrix consequently leading to degradation of the base oil. Generally, oil oxidation is a complex process and difficult to elucidate; however, the degradation pathways can be defined for almost every type of oil because they mainly depend on the mechanical status and operating conditions. The exact time of oil change is nonetheless difficult to predict, but it is of great interest from an economic and ecological point of view. In order to make a quick and accurate decision about oil changes, onboard assessment of oil quality is highly desirable. For this purpose, a variety of physical and chemical sensors have been proposed along with spectroscopic strategies. We present a critical review of all these approaches and of recent developments to analyze the exact lifetime of automotive engine oil. Apart from their potential for degradation monitoring, their limitations and future perspectives have also been investigated.

  15. The RESET tephra database and associated analytical tools

    Science.gov (United States)

    Bronk Ramsey, Christopher; Housley, Rupert A.; Lane, Christine S.; Smith, Victoria C.; Pollard, A. Mark

    2015-06-01

    An open-access database has been set up to support the research project studying the 'Response of Humans to Abrupt Environmental Transitions' (RESET). The main methodology underlying this project was to use tephra layers to tie together and synchronise the chronologies of stratigraphic records at archaeological and environmental sites. The database has information on occurrences, and chemical compositions, of glass shards from tephra and cryptotephra deposits found across Europe. The data includes both information from the RESET project itself and from the published literature. With over 12,000 major element analyses and over 3000 trace element analyses on glass shards, relevant to 80 late Quaternary eruptions, the RESET project has generated an important archive of data. When added to the published information, the database described here has a total of more than 22,000 major element analyses and nearly 4000 trace element analyses on glass from over 240 eruptions. In addition to the database and its associated data, new methods of data analysis for assessing correlations have been developed as part of the project. In particular an approach using multi-dimensional kernel density estimates to evaluate the likelihood of tephra compositions matching is described here and tested on data generated as part of the RESET project.
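
    As a hedged sketch of the kernel-density matching idea described above (not the RESET project's implementation), the snippet below builds a two-dimensional KDE from a reference eruption's glass-shard analyses and evaluates how consistent unknown shards are with that population. The oxides chosen and all compositions are illustrative assumptions.

        import numpy as np
        from scipy.stats import gaussian_kde

        # Reference eruption: rows = variables (SiO2, K2O wt%), columns = shards.
        reference = np.array([
            [72.1, 71.8, 72.5, 71.9, 72.3],
            [ 5.1,  5.0,  5.3,  5.2,  5.1],
        ])
        kde = gaussian_kde(reference)

        # Unknown cryptotephra shards to compare against the reference population.
        unknown = np.array([
            [72.0, 74.5],
            [ 5.1,  3.2],
        ])
        for shard, density in zip(unknown.T, kde(unknown)):
            print(f"shard {shard}: density {density:.4f}")  # higher = better match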

  16. Status of immunoassay as an analytical tool in environmental investigations

    International Nuclear Information System (INIS)

    Immunoassay methods were initially applied in clinical situations where their sensitivity and selectivity were utilized for diagnostic purposes. In the 1970s, pesticide chemists realized the potential benefits of immunoassay methods for compounds difficult to analyze by gas chromatography. This transition of the technology has extended to the analysis of soil, water, food and other matrices of environmental and human exposure significance, particularly for compounds difficult to analyze by chromatographic methods. The utility of radioimmunoassays and enzyme immunoassays for environmental investigations was recognized in the 1980s by the U.S. Environmental Protection Agency (U.S. EPA) with the initiation of an immunoassay development programme. The U.S. Department of Agriculture (USDA) and the U.S. Food and Drug Administration (FDA) have investigated immunoassays for the detection of residues in food, both from an inspection and a contamination prevention perspective. Environmental immunoassays are providing rapid screening information as well as quantitative information to fulfill rigorous data quality objectives for monitoring programmes.

  17. Horses for courses: analytical tools to explore planetary boundaries

    Science.gov (United States)

    van Vuuren, Detlef P.; Lucas, Paul L.; Häyhä, Tiina; Cornell, Sarah E.; Stafford-Smith, Mark

    2016-03-01

    There is a need for more integrated research on sustainable development and global environmental change. In this paper, we focus on the planetary boundaries framework to provide a systematic categorization of key research questions in relation to avoiding severe global environmental degradation. The four categories of key questions are those that relate to (1) the underlying processes and selection of key indicators for planetary boundaries, (2) understanding the impacts of environmental pressure and connections between different types of impacts, (3) better understanding of different response strategies to avoid further degradation, and (4) the available instruments to implement such strategies. Clearly, different categories of scientific disciplines and associated model types exist that can accommodate answering these questions. We identify the strength and weaknesses of different research areas in relation to the question categories, focusing specifically on different types of models. We discuss that more interdisciplinary research is need to increase our understanding by better linking human drivers and social and biophysical impacts. This requires better collaboration between relevant disciplines (associated with the model types), either by exchanging information or by fully linking or integrating them. As fully integrated models can become too complex, the appropriate type of model (the racehorse) should be applied for answering the target research question (the race course).

  18. Developing a Social Media Marketing tool

    OpenAIRE

    Valova, Olga

    2015-01-01

    The objective of the thesis is to develop a better, easier to use social media marketing tool that could be utilised in any business. By understanding and analysing how business uses social media as well as currently available social media marketing tools, design a tool with the maximum amount of features, but with a simple and intuitive User Interface. An agile software development life cycle was used throughout the creation of the tool. Qualitative analysis was used to analyse existing ...

  19. Complex reconfiguration - developing common tools

    International Nuclear Information System (INIS)

    Reconfiguring DOE sites, facilities, and laboratories to meet expected and evolving missions involves a number of disciplines and approaches formerly the preserve of private industry and defense contractors. This paper considers the process of identifying common tools for the various disciplines that can be exercised, assessed, and applied by team members to arrive at integrated solutions. The basic tools include systems, hardware, software, and procedures that can characterize a site/facility's environment to meet organizational goals, safeguards and security, ES&H, and waste requirements. Other tools, such as computer-driven inventory and auditing programs, can provide traceability of materials and products as they are processed and require added protection and control. This paper also discusses the use of integrated teams in a number of high technology enterprises that could be adopted by DOE in high profile programs from environmental remediation to weapons dismantling and arms control.

  20. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research.

    Science.gov (United States)

    Ercius, Peter; Alaidi, Osama; Rames, Matthew J; Ren, Gang

    2015-10-14

    Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been separately developed in both physical and biological science, although they share some basic concepts of ET. This review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated. PMID:26087941

  1. Environmental equity research: review with focus on outdoor air pollution research methods and analytic tools.

    Science.gov (United States)

    Miao, Qun; Chen, Dongmei; Buzzelli, Michael; Aronson, Kristan J

    2015-01-01

    The objective of this study was to review environmental equity research on outdoor air pollution and, specifically, methods and tools used in research, published in English, with the aim of recommending the best methods and analytic tools. English language publications from 2000 to 2012 were identified in Google Scholar, Ovid MEDLINE, and PubMed. Research methodologies and results were reviewed and potential deficiencies and knowledge gaps identified. The publications show that exposure to outdoor air pollution differs by social factors, but findings are inconsistent in Canada. In terms of study designs, most were small and ecological and therefore prone to the ecological fallacy. Newer tools such as geographic information systems, modeling, and biomarkers offer improved precision in exposure measurement. Higher-quality research using large, individual-based samples and more precise analytic tools are needed to provide better evidence for policy-making to reduce environmental inequities.

  2. The Evolution of Analytical Hierarchy Process (AHP) as a Decision Making Tool in Property Sectors

    OpenAIRE

    Mohd Safian, Edie Ezwan; Nawawi, Abdul Hadi

    2011-01-01

    In the 1970s, the Analytical Hierarchy Process (AHP) was introduced, accidentally, by Saaty [4] as a tool to allocate resources and plan needs for the military. However, due to its ability to identify the weightage of variables efficiently in research, it has become popular in many sectors. Basically, AHP is a decision-making tool that arranges the variables into a hierarchical form in order to rank the importance of each variable, leading to the weightage calculation of the variables in...
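
    As a hedged sketch of the core computation behind AHP (a standard formulation, not necessarily the cited study's), the snippet below derives criterion weights from a pairwise-comparison matrix via its principal eigenvector and reports a consistency index. The matrix entries are illustrative assumptions.

        import numpy as np

        # Pairwise comparisons on Saaty's 1-9 scale for three property criteria,
        # e.g. price, location, condition (hypothetical judgments).
        A = np.array([
            [1.0, 3.0, 5.0],
            [1/3, 1.0, 2.0],
            [1/5, 1/2, 1.0],
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)            # principal eigenvalue
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                           # normalized criterion weights

        n = A.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)   # consistency index
        print("weights:", np.round(w, 3), "CI:", round(ci, 4))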

  3. MASCOT: Multi-Criteria Analytical SCOring Tool for ArcGIS Desktop

    OpenAIRE

    Pierre Lacroix; Helder Santiago; Nicolas Ray

    2014-01-01

    The Multicriteria Analytical SCOring Tool (MASCOT) is a decision-support tool based on spatial analysis that can score items (points, lines, and polygons) as a function of their Euclidean distance to other data (points, lines, polygons, rasters). MASCOT is integrated with ArcGIS 9.3.1 and makes it possible to achieve a complete workflow that may include data preparation and grouping of factors by theme, weighting, scoring, post-processing and decision-making. To achieve the weighting process, the...
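
    As a hedged sketch in the spirit of MASCOT's distance-based scoring (not the actual ArcGIS extension), the snippet below scores candidate points by their Euclidean distance to the nearest feature of each theme and combines the themes with user weights. The feature layers, distance bands, and weights are illustrative assumptions.

        import numpy as np
        from scipy.spatial.distance import cdist

        def distance_score(candidates, features, bands=(1.0, 5.0)):
            """3 = within bands[0] of a feature, 2 = within bands[1], else 1."""
            d = cdist(candidates, features).min(axis=1)
            return np.where(d <= bands[0], 3, np.where(d <= bands[1], 2, 1))

        candidates = np.array([[0.0, 0.0], [4.0, 3.0], [20.0, 20.0]])
        roads = np.array([[0.5, 0.0], [10.0, 0.0]])
        rivers = np.array([[0.0, 6.0], [5.0, 5.0]])

        # Theme weights (0.7 roads, 0.3 rivers) are arbitrary for illustration.
        total = 0.7 * distance_score(candidates, roads) \
              + 0.3 * distance_score(candidates, rivers)
        print("weighted scores:", total)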

  4. Program Development Tools and Infrastructures

    Energy Technology Data Exchange (ETDEWEB)

    Schulz, M

    2012-03-12

    Exascale class machines will exhibit a new level of complexity: they will feature an unprecedented number of cores and threads, will most likely be heterogeneous and deeply hierarchical, and offer a range of new hardware techniques (such as speculative threading, transactional memory, programmable prefetching, and programmable accelerators), which all have to be utilized for an application to realize the full potential of the machine. Additionally, users will be faced with less memory per core, fixed total power budgets, and sharply reduced MTBFs. At the same time, it is expected that the complexity of applications will rise sharply for exascale systems, both to implement new science possible at exascale and to exploit the new hardware features necessary to achieve exascale performance. This is particularly true for many of the NNSA codes, which are large and often highly complex integrated simulation codes that push the limits of everything in the system including language features. To overcome these limitations and to enable users to reach exascale performance, users will expect a new generation of tools that address the bottlenecks of exascale machines, that work seamlessly with the (set of) programming models on the target machines, that scale with the machine, that provide automatic analysis capabilities, and that are flexible and modular enough to overcome the complexities and changing demands of the exascale architectures. Further, any tool must be robust enough to handle the complexity of large integrated codes while keeping the user's learning curve low. With the ASC program, in particular the CSSE (Computational Systems and Software Engineering) and CCE (Common Compute Environment) projects, we are working towards a new generation of tools that fulfill these requirements and that provide our users as well as the larger HPC community with the necessary tools, techniques, and methodologies required to make exascale performance a reality.

  5. Productive re-use of CSCL data and analytic tools to provide a new perspective on group cohesion

    OpenAIRE

    Reffay, Christophe; Teplovs, Christopher; Blondel, François-Marie

    2011-01-01

    The goals of this paper are twofold: (1) to demonstrate how previously published data can be re-analyzed to gain a new perspective on CSCL dynamics and (2) to propose a new measure of social cohesion that was developed through improvements to existing analytic tools. In this study, we downloaded the Simuligne corpus from the publicly available Mulce repository. We improved the Knowledge Space Visualizer (KSV) to deepen the notion of cohesion by using a dynamic representation of sociograms. Th...

  6. The Comparison of Learning Analytics Tools%学习分析工具比较研究

    Institute of Scientific and Technical Information of China (English)

    孟玲玲; 顾小清; 李泽

    2014-01-01

    In recent years, with the rapid development of smart learning environments, massive, rich, diverse, and heterogeneous data have been growing at an astonishing rate. In the education field, students' interests, preferences, activities, and learning process information, such as their interaction with a learning platform as well as their implicit feedback to the e-learning platform, can all be recorded and traced. How to make effective use of these data has drawn great concern. The data of a single person may seem chaotic, but once data accumulate to a certain extent, an order emerges, with stronger or weaker relations among the data. For example, what are the characteristics of students in different regions or countries? What are the characteristics of learning behavior at different ages? What are the learning habits of different students? Which courses are most urgently needed for a successful career? For a particular course, which units need review? Which units should be emphasized? Which students encounter difficulties and need help? There are remarkable insights behind the data: if we extract the rules or determine the relationships among the data, tremendous value can be created. Learning analytics techniques therefore arise. According to the 2011 Horizon Report of the New Media Consortium's Horizon Project, learning analytics technology will become a hot topic in the next few years. It will contribute to improving the learning process and make learning more intelligent. As one can imagine, analytics tools play an important role in the process of learning analytics: good tools make the research process more effective. Many analytics tools have been developed. For example, NVivo and ATLAS.ti can be used to annotate text and multimedia content; Gephi, JUNG, and GUESS can be used to analyze learning networks; and SPSS can perform statistical analysis of user data. However, a key issue is how to choose the appropriate tool because different tools

  7. (Re)braiding to Tell: Using "Trenzas" as a Metaphorical-Analytical Tool in Qualitative Research

    Science.gov (United States)

    Quiñones, Sandra

    2016-01-01

    Metaphors can be used in qualitative research to illuminate the meanings of participant experiences and examine phenomena from insightful and creative perspectives. The purpose of this paper is to illustrate how I utilized "trenzas" (braids) as a metaphorical and analytical tool for understanding the experiences and perspectives of…

  8. Understanding Emotions as Situated, Embodied, and Fissured: Thinking with Theory to Create an Analytical Tool

    Science.gov (United States)

    Kuby, Candace R.

    2014-01-01

    An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced; a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…

  9. Capitalizing on App Development Tools and Technologies

    Science.gov (United States)

    Luterbach, Kenneth J.; Hubbell, Kenneth R.

    2015-01-01

    Instructional developers and others creating apps must choose from a wide variety of app development tools and technologies. Some app development tools have incorporated visual programming features, which enable some drag and drop coding and contextual programming. While those features help novices begin programming with greater ease, questions…

  10. Analytical Investigation and Comparison on Performance of Ss316, Ss440c and Titanium Alloy Tool Materials Used As Single Point Cutting Tool

    Directory of Open Access Journals (Sweden)

    Mr. Amaresh Kumar Dhadange

    2015-08-01

    Full Text Available A theoretical analysis of the performance of SS316, SS440C and titanium alloy used as single point cutting tool materials is presented in this paper. Tool temperature, tool wear and tool life are investigated analytically. These theoretical values are compared with the experimental studies conducted by the author. The values obtained from the experimental studies are comparable with the analytical values, and the variation between theoretical and experimental values is of the order of 15%.

  11. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    Science.gov (United States)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

    Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The 'recordr' library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software

  12. Measuring the bright side of being blue: a new tool for assessing analytical rumination in depression.

    Directory of Open Access Journals (Sweden)

    Skye P Barbic

    Full Text Available BACKGROUND: Diagnosis and management of depression occurs frequently in the primary care setting. Current diagnostic and treatment practices across clinical populations focus on eliminating signs and symptoms of depression. However, there is debate that some interventions may pathologize normal, adaptive responses to stressors. Analytical rumination (AR) is an example of an adaptive response of depression that is characterized by enhanced cognitive function to help an individual focus on, analyze, and solve problems. To date, research on AR has been hampered by the lack of theoretically derived and psychometrically sound instruments. This study developed and tested a clinically meaningful measure of AR. METHODS: Using expert panels and an extensive literature review, we developed a conceptual framework for AR and 22 candidate items. The items were field tested with 579 young adults, 140 of whom completed the items at a second time point. We used Rasch measurement methods to construct and test the item set, and traditional psychometric analyses to compare the items to existing rating scales. RESULTS: Data were of high quality (0.81); there was evidence for divergent validity. Evidence of misfit for 2 items suggested that a 20-item scale with 4-point response categories best captured the concept of AR, fitting the Rasch model (χ2 = 95.26; df = 76, p = 0.07), with high reliability (rp = 0.86), an ordered response scale structure, and no item bias (gender, age, time). CONCLUSION: Our study provides evidence for a 20-item Analytical Rumination Questionnaire (ARQ) that can be used to quantify AR in adults who experience symptoms of depression. The ARQ is psychometrically robust and a clinically useful tool for the assessment and improvement of depression in the primary care setting. Future work is needed to establish the validity of this measure in people with major depression.

  13. Topological data analysis: A promising big data exploration tool in biology, analytical chemistry and physical chemistry.

    Science.gov (United States)

    Offroy, Marc; Duponchel, Ludovic

    2016-03-01

    An important feature of experimental science is that data of various kinds are being produced at an unprecedented rate. This is mainly due to the development of new instrumental concepts and experimental methodologies. It is also clear that the nature of acquired data is significantly different. Indeed, in every area of science, data take the form of ever-bigger tables, where all but a few of the columns (i.e. variables) turn out to be irrelevant to the questions of interest, and we do not necessarily know which coordinates are the interesting ones. Big data in our laboratories of biology, analytical chemistry or physical chemistry is a future that might be closer than any of us suppose. It is in this sense that new tools have to be developed in order to explore and valorize such data sets. Topological data analysis (TDA) is one of these. It was developed recently by topologists who discovered that topological concepts could be useful for data analysis. The main objective of this paper is to answer the question of why topology is well suited for the analysis of big data sets in many areas, and even more efficient than conventional data analysis methods. Raman analysis of single bacteria provides a good opportunity to demonstrate the potential of TDA for the exploration of various spectroscopic data sets under different experimental conditions (with high noise level, with/without spectral preprocessing, with wavelength shift, with different spectral resolution, with missing data).

  14. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    Science.gov (United States)

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®--the human gene database; the MalaCards-the human diseases database; and the PathCards--the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®--the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics
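
    GeneAnalytics' own scoring algorithms are proprietary; as a generic, hedged illustration of the gene-set enrichment idea such tools build on, the snippet below runs a hypergeometric over-representation test of a query gene list against a single gene set. Gene names and the background size are illustrative assumptions.

        from scipy.stats import hypergeom

        def enrichment_p(query, gene_set, background_size):
            """P(overlap >= observed) when drawing len(query) genes at random."""
            overlap = len(set(query) & set(gene_set))
            # sf(k-1) gives P(X >= k) for the hypergeometric distribution
            return hypergeom.sf(overlap - 1, background_size,
                                len(gene_set), len(query))

        query = {"TP53", "BRCA1", "MYC", "EGFR"}
        pathway = {"TP53", "MYC", "CDK2", "RB1", "EGFR"}
        print(f"p = {enrichment_p(query, pathway, background_size=20000):.3e}")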

  16. Analytical Quality by Design Approach to Test Method Development and Validation in Drug Substance Manufacturing

    Directory of Open Access Journals (Sweden)

    N. V. V. S. S. Raman

    2015-01-01

    Full Text Available The pharmaceutical industry has been emerging rapidly over the last decade by focusing on product quality, safety, and efficacy. Pharmaceutical firms have increased the number of product developments by using scientific tools such as QbD (Quality by Design) and PAT (Process Analytical Technology). ICH guidelines Q8 to Q11 discuss QbD implementation in API synthetic processes and formulation development; ICH Q11 clearly discusses the QbD approach for API synthesis with examples. Generic companies are implementing the QbD approach in formulation development, and it is effectively mandatory from the USFDA perspective. As of now, there are no specific requirements for AQbD (Analytical Quality by Design) and PAT in analytical development from any regulatory agency. In this review, the authors discuss the simultaneous implementation of QbD and AQbD for API synthetic processes and analytical method development. The key AQbD tools are identification of the ATP (Analytical Target Profile), CQAs (Critical Quality Attributes) with risk assessment, method optimization and development with DoE, the MODR (method operable design region), a control strategy, AQbD method validation, and Continuous Method Monitoring (CMM). Simultaneous implementation of QbD activities in synthetic and analytical development will provide the highest quality product by minimizing risks, and it also provides very good input for the PAT approach.
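
    As a hedged sketch of the DoE step mentioned above (not the review's own procedure), the snippet below builds a two-level full-factorial design for two method parameters and estimates their main effects on a response. The factor names, levels, and responses are illustrative assumptions.

        from itertools import product

        factors = {"pH": (2.5, 3.5), "flow_mL_min": (0.8, 1.2)}
        design = list(product(*factors.values()))   # 2^2 = 4 runs

        # Hypothetical measured responses (e.g. chromatographic resolution).
        responses = [1.8, 2.1, 2.6, 2.4]

        def main_effect(idx):
            """Mean response at the high level minus mean at the low level."""
            levels = sorted({run[idx] for run in design})
            high = [r for run, r in zip(design, responses) if run[idx] == levels[1]]
            low = [r for run, r in zip(design, responses) if run[idx] == levels[0]]
            return sum(high) / len(high) - sum(low) / len(low)

        for i, name in enumerate(factors):
            print(f"main effect of {name}: {main_effect(i):+.2f}")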

  17. Recent Advances in Algal Genetic Tool Development

    Energy Technology Data Exchange (ETDEWEB)

    R. Dahlin, Lukas; T. Guarnieri, Michael

    2016-06-24

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  18. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Monte Carlo Codes Group

    2016-06-17

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe 2 new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.

  20. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    Science.gov (United States)

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as an assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first with metrological variables and the second with "green" variables as input data. The two HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene is determined in sediment samples. The HDT can be used as a good decision-support tool to choose the proper analytical procedure with respect to green analytical chemistry principles and analytical performance merits.
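
    As a hedged sketch of the partial-order idea behind the HDT (not the study's software), the snippet below checks which analytical procedures dominate which: procedure A dominates B if it is at least as good on every variable and strictly better on at least one (here, lower is better). Procedure names and variable values are illustrative assumptions.

        import numpy as np

        procedures = {                 # e.g. [solvent use, waste, LOD]
            "GC-MS": np.array([5.0, 10.0, 0.1]),
            "HPLC-FLD": np.array([8.0, 15.0, 0.1]),
            "SPME-GC": np.array([1.0, 2.0, 0.3]),
        }

        def dominates(a, b):
            """True if a is no worse than b everywhere and better somewhere."""
            return bool(np.all(a <= b) and np.any(a < b))

        for name_a, va in procedures.items():
            for name_b, vb in procedures.items():
                if name_a != name_b and dominates(va, vb):
                    print(f"{name_a} dominates {name_b}")
        # Pairs with no dominance either way are incomparable and appear on
        # separate branches of the Hasse diagram.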

  1. 40 CFR 766.16 - Developing the analytical test method.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Developing the analytical test method. 766.16 Section 766.16 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC... analytical test method. Because of the matrix differences of the chemicals listed for testing, no one...

  2. Evaluation of angiotensin II receptor blockers for drug formulary using objective scoring analytical tool

    Directory of Open Access Journals (Sweden)

    Lim TM

    2012-09-01

    Full Text Available Drug selection methods with scores have been developed and used worldwide for formulary purposes. These tools focus on the way in which products within the same therapeutic class are differentiated from each other. The Scoring Analytical Tool (SAT) is designed on the same score-based principle and is able to assist formulary committee members in evaluating drugs, for either addition or deletion, in a more structured, consistent and reproducible manner. Objective: To develop an objective SAT to facilitate evaluation of drug selection for formulary listing purposes. Methods: A cross-sectional survey was carried out. The proposed SAT was developed to evaluate the drugs according to pre-set criteria and sub-criteria that were matched to the diseases concerned, and scores were then assigned based on their relative importance. The main criteria under consideration were safety, quality, cost and efficacy. All of these were converted to questionnaire format. Data and information were collected through self-administered questionnaires distributed to medical doctors and specialists from the established public hospitals. A convenience sample of 167 doctors (specialists and non-specialists) was taken from various disciplines in the outpatient clinics, such as the Medical, Nephrology and Cardiology units, who prescribed ARB antihypertensive drugs to patients. They were given a duration of 4 weeks to answer the questionnaires at their convenience. One-way ANOVA, Kruskal-Wallis and post hoc comparison tests were carried out at an alpha level of 0.05. Results: Statistical analysis showed that the descending order of ARB preference was Telmisartan or Irbesartan or Losartan, then Valsartan or Candesartan, then Olmesartan and lastly Eprosartan. The most cost-saving ARB for hypertension in public hospitals was Irbesartan. Conclusion: The SAT is a tool which can be used to reduce the number of drugs and retain the most therapeutically appropriate drugs in the formulary, to determine most
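
    As a hedged sketch of the scoring logic described above (not the study's instrument or data), the snippet below combines per-criterion drug scores with pre-set criterion weights and ranks the drugs by weighted total. Drug names, weights, and scores are illustrative assumptions.

        criteria_weights = {"safety": 0.30, "quality": 0.25,
                            "efficacy": 0.30, "cost": 0.15}

        drug_scores = {   # hypothetical criterion scores on a 1-5 scale
            "irbesartan": {"safety": 4, "quality": 4, "efficacy": 4, "cost": 5},
            "telmisartan": {"safety": 4, "quality": 4, "efficacy": 5, "cost": 3},
            "eprosartan": {"safety": 3, "quality": 3, "efficacy": 3, "cost": 2},
        }

        def weighted_total(scores, weights):
            return sum(weights[c] * scores[c] for c in weights)

        ranked = sorted(drug_scores,
                        key=lambda d: -weighted_total(drug_scores[d], criteria_weights))
        for drug in ranked:
            print(f"{drug:12s} {weighted_total(drug_scores[drug], criteria_weights):.2f}")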

  3. Software Development Management: Empirical and Analytical Perspectives

    Science.gov (United States)

    Kang, Keumseok

    2011-01-01

    Managing software development is a very complex activity because it must deal with people, organizations, technologies, and business processes. My dissertation consists of three studies that examine software development management from various perspectives. The first study empirically investigates the impacts of prior experience with similar…

  4. Technology Roadmaps: Tools for Development

    OpenAIRE

    Anthony Clayton

    2008-01-01

    The paper opens a series of two publications devoted to technological roadmapping. This technique allows one to reveal and relate the threats, risks, priorities, and opportunities in the development of different technologies, and thus to make better decisions. Different factors influencing roadmapping are considered, such as base technologies and possible alternatives, potential gaps and risks, and competitiveness. Special attention is paid to emerging technologies with great potential, when expecte...

  5. An integrated analytic tool and knowledge-based system approach to aerospace electric power system control

    Science.gov (United States)

    Owens, William R.; Henderson, Eric; Gandikota, Kapal

    1986-10-01

    Future aerospace electric power systems require new control methods because of increasing power system complexity, demands for power system management, greater system size and heightened reliability requirements. To meet these requirements, a combination of electric power system analytic tools and knowledge-based systems is proposed. The continual improvement in microelectronic performance has made it possible to envision the application of sophisticated electric power system analysis tools to aerospace vehicles. These tools have been successfully used in the measurement and control of large terrestrial electric power systems. Among these tools is state estimation which has three main benefits. The estimator builds a reliable database for the system structure and states. Security assessment and contingency evaluation also require a state estimator. Finally, the estimator will, combined with modern control theory, improve power system control and stability. Bad data detection as an adjunct to state estimation identifies defective sensors and communications channels. Validated data from the analytic tools is supplied to a number of knowledge-based systems. These systems will be responsible for the control, protection, and optimization of the electric power system.
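
    As a hedged, generic illustration of the state estimation the abstract refers to (a textbook weighted-least-squares formulation, not the proposed aerospace system), the snippet below estimates two states from three redundant sensor readings and inspects residuals, which is also the basis for the bad data detection mentioned above. All matrices and readings are illustrative assumptions.

        import numpy as np

        H = np.array([[1.0, 0.0],          # each row maps the states to one sensor
                      [0.0, 1.0],
                      [1.0, -1.0]])
        z = np.array([1.02, 0.49, 0.55])   # redundant sensor readings
        W = np.diag([1 / 0.01**2] * 3)     # inverse measurement variances

        # Weighted least squares: x = (H^T W H)^(-1) H^T W z
        x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
        residuals = z - H @ x_hat          # large residuals flag bad data
        print("estimated states:", x_hat, "residuals:", residuals)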

  6. Analytics for smart energy management tools and applications for sustainable manufacturing

    CERN Document Server

    Oh, Seog-Chan

    2016-01-01

    This book introduces the issues and problems that arise when implementing smart energy management for sustainable manufacturing in the automotive manufacturing industry and the analytical tools and applications to deal with them. It uses a number of illustrative examples to explain energy management in automotive manufacturing, which involves most types of manufacturing technology and various levels of energy consumption. It demonstrates how analytical tools can help improve energy management processes, including forecasting, consumption, and performance analysis, emerging new technology identification as well as investment decisions for establishing smart energy consumption practices. It also details practical energy management systems, making it a valuable resource for professionals involved in real energy management processes, and allowing readers to implement the procedures and applications presented.

  7. Development of CAD implementing the algorithm of boundary elements’ numerical analytical method

    Directory of Open Access Journals (Sweden)

    Yulia V. Korniyenko

    2015-03-01

    Full Text Available Until recently, the algorithms of the numerical-analytical boundary elements method had been implemented in programs written in the MATLAB language. Each program had a local character, i.e., it was used to solve a particular problem: calculation of a beam, frame, arch, etc. Constructing matrices in these programs was carried out “manually” and was therefore time-consuming. The research was aimed at a reasoned choice of programming language for developing a new CAD system that implements the algorithm of the numerical-analytical boundary elements method and provides visualization tools for the initial objects and calculation results. The research conducted shows that among the wide variety of programming languages, the most efficient one for developing a CAD system employing the numerical-analytical boundary elements method algorithm is Java. This language provides tools not only for the development of the calculating part of the CAD system, but also for building the graphical interface for constructing geometrical models and interpreting the calculated results.

  8. Information and Analytic Maintenance of Nanoindustry Development

    OpenAIRE

    Glushchenko Aleksandra Vasilyevna; Bukhantsev Yuriy Alekseevich; Khudyakova Anna Sergeevna

    2015-01-01

    The successful course of nanotechnological development in many respects depends on the volume and quality of information provided to external and internal users. The objective of the present research is to reveal the information requirements of various groups of users for effective management of nanotech industry and to define ways of their most effective satisfaction. The authors also aim at developing the system of the indicators characterizing the current state and the dynamic parameters o...

  9. Online Analytical Processing (OLAP): A Fast and Effective Data Mining Tool for Gene Expression Databases

    OpenAIRE

    Nadim W. Alkharouf; D. Curtis Jamison; Benjamin F. Matthews

    2005-01-01

    Gene expression databases contain a wealth of information, but current data mining tools are limited in their speed and effectiveness in extracting meaningful biological knowledge from them. Online analytical processing (OLAP) can be used as a supplement to cluster analysis for fast and effective data mining of gene expression databases. We used Analysis Services 2000, a product that ships with SQLServer2000, to construct an OLAP cube that was used to mine a time series experiment designed to...
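
    As a hedged sketch of the OLAP-style roll-up described above (using a pandas pivot table rather than Analysis Services 2000), the snippet below aggregates a measure over two dimensions, which is the basic cube operation. Genes, time points, and expression values are illustrative assumptions.

        import pandas as pd

        df = pd.DataFrame({
            "gene": ["G1", "G1", "G2", "G2", "G1", "G2"],
            "hour": [0, 24, 0, 24, 24, 0],
            "expr": [1.0, 2.6, 0.9, 0.4, 2.4, 1.1],
        })

        # Dimensions: gene x time; measure: mean expression (the cube aggregate).
        cube = pd.pivot_table(df, values="expr", index="gene",
                              columns="hour", aggfunc="mean")
        print(cube)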

  10. User Research and Game Analytics as a Combined Tool of Business Intelligence in Mobile Game Industry

    OpenAIRE

    Büyükcan, Elif

    2014-01-01

    User studies are important sources of information for companies, as they help companies understand their customers better and reach them in more effective ways. Game analytics, on the other hand, provides a good understanding of user behavior by making use of in-game statistics. Since both tools build a solid but different knowledge base, game companies can benefit from the combination of these two related information sources. The benefits which can be achieved from this cover gaining bu...

  11. The 'secureplan' bomb utility: A PC-based analytic tool for bomb defense

    International Nuclear Information System (INIS)

    This paper illustrates a recently developed, PC-based software system for simulating the effects of an infinite variety of hypothetical bomb blasts on structures and personnel in the immediate vicinity of such blasts. The system incorporates two basic rectangular geometries in which blast assessments can be made - an external configuration (highly vented) and an internal configuration (vented and unvented). A variety of explosives can be used - each is translated to an equivalent TNT weight. Provisions in the program account for bomb cases (person, satchel, case and vehicle), mixes of explosives and shrapnel aggregates and detonation altitudes. The software permits architects, engineers, security personnel and facility managers, without specific knowledge of explosives, to incorporate realistic construction hardening, screening programs, barriers and stand-off provisions in the design and/or operation of diverse facilities. System outputs - generally represented as peak incident or reflected overpressure or impulses - are both graphic and analytic and integrate damage threshold data for common construction materials including window glazing. The effects of bomb blasts on humans are estimated in terms of temporary and permanent hearing damage, lung damage (lethality) and whole body translation injury. The software system has been used in the field in providing bomb defense services to a number of commercial clients since July of 1986. In addition to the design of siting, screening and hardening components of bomb defense programs, the software has proven very useful in post-incident analysis and repair scenarios and as a teaching tool for bomb defense training.

  12. An analytic solution to LO coupled DGLAP evolution equations: a new pQCD tool

    CERN Document Server

    Block, Martin M; Ha, Phuoc; McKay, Douglas W

    2010-01-01

    We have analytically solved the LO pQCD singlet DGLAP equations using Laplace transform techniques. Newly developed, highly accurate numerical inverse Laplace transform algorithms allow us to write fully decoupled solutions for the singlet structure function F_s(x,Q^2) and G(x,Q^2) as F_s(x,Q^2)={\cal F}_s(F_{s0}(x), G_0(x)) and G(x,Q^2)={\cal G}(F_{s0}(x), G_0(x)). Here {\cal F}_s and {\cal G} are known functions of the initial boundary conditions F_{s0}(x) = F_s(x,Q_0^2) and G_{0}(x) = G(x,Q_0^2), i.e., the chosen starting functions at the virtuality Q_0^2. For both G and F_s, we are able to either devolve or evolve each separately and rapidly, with very high numerical accuracy, a computational fractional precision of O(10^{-9}). Armed with this powerful new tool in the pQCD arsenal, we compare our numerical results from the above equations with the published MSTW2008 and CTEQ6L LO gluon and singlet F_s distributions, starting from their initial values at Q_0^2=1 GeV^2 and 1.69 GeV^2, respectively, using their ...

  13. Fourier transform ion cyclotron resonance mass spectrometry: the analytical tool for heavy oil and bitumen characterization

    Energy Technology Data Exchange (ETDEWEB)

    Oldenburg, Thomas B.P; Brown, Melisa; Hsieh, Ben; Larter, Steve [Petroleum Reservoir Group (prg), Department of Geoscience, University of Calgary, Alberta (Canada)

    2011-07-01

    The Fourier Transform Ion Cyclotron Resonance Mass Spectrometry (FTICRMS), developed in the 1970s by Marshall and Comisarow at the University of British Columbia, has become a commercially available tool capable of analyzing several hundred thousand components in a petroleum mixture at once. This analytical technology will probably usher in a dramatic revolution in geochemical capability, equal to or greater than the molecular revolution that occurred when GCMS technologies became cheaply available. The molecular resolution and information content given by the FTICRMS petroleum analysis can be compared to the information in the human genome. With current GCMS-based petroleum geochemical protocols perhaps a few hundred components can be quantitatively determined, but with FTICRMS, 1000 times this number of components could possibly be resolved. However, fluid and other properties depend on the interaction of this multitude of hydrocarbon and non-hydrocarbon components, not on the components themselves, and access to the full potential of this new petroleomics will depend on the definition of this interactome.

  14. SHARAD Radargram Analysis Tool Development in JMARS

    Science.gov (United States)

    Adler, J. B.; Anwar, S.; Dickenshied, S.; Carter, S.

    2016-09-01

    New tools are being developed in JMARS, a free GIS software, for SHARAD radargram viewing and analysis. These capabilities are useful for the polar science community, and for constraining the viability of ice resource deposits for human exploration.

  15. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    Science.gov (United States)

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad-scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons about using these tools with LCCs. Decision analytic tools tend to help shift focus away from research-oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision-making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable in improving decision making. But perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply ranking research priorities.

  16. Development of health-related analytical techniques

    International Nuclear Information System (INIS)

    Related to the programme on Nuclear Methods for Health-Related Monitoring of Trace Element Pollutants in Man initiated by the I.A.E.A. in 1978, our laboratory (L.A.R.N.) has developed and optimized sample preparation techniques for hair analysis without dissolution, preconcentration or chemical separation, and a non-vacuum PIXE technique for the measurement of biological samples such as liquids. The results obtained at L.A.R.N. have been compared with those obtained by other techniques, and it is found that PIXE can be used reliably to analyse a large number of biological samples in a short time. (author)

  17. Recent analytical developments for powder characterization

    Science.gov (United States)

    Brackx, E.; Pages, S.; Dugne, O.; Podor, R.

    2015-07-01

    Powders and divided solid materials are widely represented as finished or intermediary products in industries as varied as foodstuffs, cosmetics, construction, pharmaceuticals, electronic transmission, and energy. Their optimal use requires mastery of the transformation process, based on knowledge of the different phenomena concerned (sintering, chemical reactivity, purity, etc.). Modelling and understanding these phenomena require the prior acquisition of data sets and characteristics that are more or less challenging to obtain. The goal of this study is to present the use of different physico-chemical characterization techniques adapted to uranium-containing powders analyzed either in a raw state or after a specific preparation (ionic polishing). The new developments concern dimensional characterization of grains and pores by image analysis, chemical surface characterization, and characterization of powder chemical reactivity. The examples discussed involve materials from fabrication processes used in the nuclear fuel cycle.

  18. Analytical tools for the analysis of fire debris. A review: 2008-2015.

    Science.gov (United States)

    Martín-Alberca, Carlos; Ortega-Ojeda, Fernando Ernesto; García-Ruiz, Carmen

    2016-07-20

    The analysis of fire debris evidence can offer crucial information to a forensic investigation when, for instance, there is suspicion of the intentional use of ignitable liquids to initiate a fire. Although evidence analysis in the laboratory is mainly conducted by a handful of well-established methodologies, during the last eight years several authors have proposed noteworthy improvements to these methodologies and suggested interesting new approaches. This review critically outlines the most up-to-date and suitable tools for the analysis and interpretation of fire debris evidence. The survey of analytical tools covers works published in the 2008-2015 period. It includes sources of consensus-classified reference samples, current standard procedures, new proposals for sample extraction and analysis, and the most novel statistical tools. In addition, this review provides relevant knowledge on the distortion effects on ignitable liquid chemical fingerprints, which have to be considered during interpretation of results. PMID:27251852

  19. Remote tool development for nuclear dismantling operations

    International Nuclear Information System (INIS)

    Remote tool systems to undertake nuclear dismantling operations require careful design and development, not only to perform their given duty but to perform it safely within the constraints imposed by harsh environmental conditions. Framatome ANP NUCLEAR SERVICES has long developed and qualified equipment to undertake specific maintenance operations of nuclear reactors. The tool development methodology from this activity has since been adapted to resolve some very challenging reactor dismantling operations, which are demonstrated in this paper. Each nuclear decommissioning project is a unique case, and technical characterisation data are generally incomplete. The development of the dismantling methodology and associated equipment is by and large an iterative process combining design and simulation with feasibility and validation testing. The first stage of the development process involves feasibility testing of industrial tools and examining the adaptations necessary to control and deploy the tool remotely with respect to the chosen methodology and environmental constraints. This results in a prototype tool and deployment system to validate the basic process. The second stage involves detailed design, which integrates any remaining technical and environmental constraints. At the end of this stage, tools and deployment systems, operators and operating procedures are qualified on full scale mock ups. (authors)

  20. Computational Tools to Accelerate Commercial Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David C

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  1. Forty years of development in diamond tools

    Science.gov (United States)

    The growth of the diamond industry in Western Countries since the First World War is surveyed. The articles described deal specifically with the development of the industrial diamond and diamond tool sector in different countries. All data point to continuing rapid expansion in the diamond tool sector. The West consumes 80 percent of world industrial diamond production. Diamond consumption increased sharply in the U.S. during World War 2. There are 300 diamond manufacturers in the U.S. today. In 1940, there were 25. In Japan, consumption of industrial diamonds has increased several times. In Italy, there has been a 75 fold increase in the production of diamond tools since 1959.

  2. ICT Tools and Students' Competence Development

    Science.gov (United States)

    Fuglestad, Anne Berit

    2004-01-01

    In this paper I will present the rationale that motivates the study in an ongoing three-year project following students in school years 8 to 10. The aim is to develop the students' competence with use of ICT tools in mathematics in such a way that they will be able to choose tools for themselves, not rely just on the teacher telling them what to…

  3. The Blackbird Whistling or Just After? Vygotsky's Tool and Sign as an Analytic for Writing

    Science.gov (United States)

    Imbrenda, Jon-Philip

    2016-01-01

    Based on Vygotsky's theory of the interplay of the tool and sign functions of language, this study presents a textual analysis of a corpus of student-authored texts to illuminate aspects of development evidenced through the dialectical tension of tool and sign. Data were drawn from a series of reflective memos I authored during a seminar for new…

  4. Antifungal proteins from moulds: analytical tools and potential application to dry-ripened foods.

    Science.gov (United States)

    Delgado, Josué; Owens, Rebecca A; Doyle, Sean; Asensio, Miguel A; Núñez, Félix

    2016-08-01

    Moulds growing on the surface of dry-ripened foods contribute to their sensory qualities, but some of them are able to produce mycotoxins that pose a hazard to consumers. Small cysteine-rich antifungal proteins (AFPs) from moulds are highly stable to pH and proteolysis and exhibit a broad inhibition spectrum against filamentous fungi, providing new chances to control hazardous moulds in fermented foods. The analytical tools for characterizing the cellular targets and affected pathways are reviewed. Strategies currently employed to study these mechanisms of action include 'omics' approaches that have come to the forefront in recent years, developing in tandem with genome sequencing of relevant organisms. These techniques contribute to a better understanding of the response of moulds against AFPs, allowing the design of complementary strategies to maximize or overcome the limitations of using AFPs on foods. AFPs alter chitin biosynthesis, and some fungi react inducing cell wall integrity (CWI) pathway. However, moulds able to increase chitin content at the cell wall by increasing proteins in either CWI or calmodulin-calcineurin signalling pathways will resist AFPs. Similarly, AFPs increase the intracellular levels of reactive oxygen species (ROS), and moulds increasing G-protein complex β subunit CpcB and/or enzymes to efficiently produce glutathione may evade apoptosis. Unknown aspects that need to be addressed include the interaction with mycotoxin production by less sensitive toxigenic moulds. However, significant steps have been taken to encourage the use of AFPs in intermediate-moisture foods, particularly for mould-ripened cheese and meat products. PMID:27394712

  5. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    Energy Technology Data Exchange (ETDEWEB)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z. [and others

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes were addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than did development at the ACMUs. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  6. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    International Nuclear Information System (INIS)

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes were addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than did development at the ACMUs. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  7. From Mini to Micro Scale—Feasibility of Raman Spectroscopy as a Process Analytical Tool (PAT)

    Directory of Open Access Journals (Sweden)

    Peter Kleinebudde

    2011-10-01

    Full Text Available Background: Active coating is an important unit operation in the pharmaceutical industry. The quality, stability, safety and performance of the final product largely depend on the amount and uniformity of coating applied. Active coating is challenging regarding the total amount of coating and its uniformity. Consequently, there is a strong demand for tools which are able to monitor and determine the endpoint of a coating operation. In previous work, it was shown that Raman spectroscopy is an appropriate process analytical tool (PAT) to monitor an active spray coating process in a pan coater [1]. Using a multivariate model (Partial Least Squares, PLS), the Raman spectral data could be correlated with the coated amount of the API diprophylline. While the multivariate model was shown to be valid for the process in a mini scale pan coater (batch size: 3.5 kg cores), the aim of the present work was to prove the robustness of the model by transferring the results to tablets coated in a micro scale pan coater (0.5 kg). Method: Coating experiments were performed in both a mini scale and a micro scale pan coater. The model drug diprophylline was coated on placebo tablets. The multivariate model, established for the process in the mini scale pan coater, was applied to the Raman measurements of tablets coated in the micro scale coater for six different coating levels. Then, the amount of coating predicted by the model was compared with reference measurements using UV spectroscopy. Results: For all six coating levels the predicted coating amount was equal to the amounts obtained by UV spectroscopy within the statistical error. Thus, it was possible to predict the total coating amount with an error smaller than 3.6%. The root mean square errors of calibration and prediction (RMSEC and RMSEP) were 0.335 mg and 0.392 mg, respectively, which means that the predictive power of the model
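
    To illustrate the kind of multivariate calibration described (a PLS model mapping Raman spectra to coated API amount), here is a minimal sketch using scikit-learn; the spectra, reference values and component count below are placeholders, not the authors' data or model:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        # X: Raman spectra (samples x wavenumbers), y: coated amount in mg (UV reference)
        rng = np.random.default_rng(0)
        X = rng.normal(size=(30, 500))       # placeholder spectra
        y = rng.uniform(5, 25, size=30)      # placeholder coating amounts

        pls = PLSRegression(n_components=3)  # component count would be chosen by cross-validation
        pls.fit(X, y)

        y_hat = pls.predict(X).ravel()
        rmsec = np.sqrt(np.mean((y - y_hat) ** 2))  # analogous to the RMSEC quoted above
        print(f"RMSEC: {rmsec:.3f} mg")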

  8. Pulsatile microfluidics as an analytical tool for determining the dynamic characteristics of microfluidic systems

    DEFF Research Database (Denmark)

    Vedel, Søren; Olesen, Laurits Højgaard; Bruus, Henrik

    2010-01-01

    An understanding of all fluid dynamic time scales is needed to fully understand and hence exploit the capabilities of fluid flow in microfluidic systems. We propose the use of harmonically oscillating microfluidics as an analytical tool for the deduction of these time scales. Furthermore, we...... suggest the use of system-level equivalent circuit theory as an adequate theory of the behavior of the system. A novel pressure source capable of operation in the desired frequency range is presented for this generic analysis. As a proof of concept, we study the fairly complex system of water...

  9. The Information Needs of the Developing Countries: Analytical Case Studies.

    Science.gov (United States)

    Salman, Lamia

    1981-01-01

    Presents the generalized conclusions from analytical case studies undertaken by UNESCO and the United Nations Interim Fund for Science and Technology for Development (IFSTD) on the needs and options for access to scientific and technical information in eight developing countries. (Author/JL)

  10. Solar Data and Tools: Resources for Researchers, Industry, and Developers

    Energy Technology Data Exchange (ETDEWEB)

    2016-04-01

    In partnership with the U.S. Department of Energy SunShot Initiative, the National Renewable Energy Laboratory (NREL) has created a suite of analytical tools and data that can inform decisions about implementing solar and that are increasingly forming the basis of private-sector tools and services to solar consumers. The following solar energy data sets and analytical tools are available free to the public.

  11. Windows Developer Power Tools Turbocharge Windows development with more than 170 free and open source tools

    CERN Document Server

    Avery, James

    2007-01-01

    Software developers need to work harder and harder to bring value to their development process in order to build high quality applications and remain competitive. Developers can accomplish this by improving their productivity, quickly solving problems, and writing better code. A wealth of open source and free software tools are available for developers who want to improve the way they create, build, deploy, and use software. Tools, components, and frameworks exist to help developers at every point in the development process. Windows Developer Power Tools offers an encyclopedic guide to m

  12. Analytical Ultracentrifugation as a Tool to Study Nonspecific Protein–DNA Interactions

    Science.gov (United States)

    Yang, Teng-Chieh; Catalano, Carlos Enrique; Maluf, Nasib Karl

    2016-01-01

    Analytical ultracentrifugation (AUC) is a powerful tool that can provide thermodynamic information on associating systems. Here, we discuss how to use the two fundamental AUC applications, sedimentation velocity (SV), and sedimentation equilibrium (SE), to study nonspecific protein–nucleic acid interactions, with a special emphasis on how to analyze the experimental data to extract thermodynamic information. We discuss three specific applications of this approach: (i) determination of nonspecific binding stoichiometry of E. coli integration host factor protein to dsDNA, (ii) characterization of nonspecific binding properties of Adenoviral IVa2 protein to dsDNA using SE-AUC, and (iii) analysis of the competition between specific and nonspecific DNA-binding interactions observed for E. coli integration host factor protein assembly on dsDNA. These approaches provide powerful tools that allow thermodynamic interrogation and thus a mechanistic understanding of how proteins bind nucleic acids by both specific and nonspecific interactions. PMID:26412658
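
    For orientation, the single-species relation commonly fitted in sedimentation-equilibrium analysis (a textbook result, quoted here for context rather than taken from this paper) is

        c(r) = c(r_0)\,\exp\!\left[\frac{M(1-\bar{v}\rho)\,\omega^2}{2RT}\,(r^2 - r_0^2)\right],

    where M is the molar mass, \bar{v} the partial specific volume, \rho the solvent density and \omega the rotor angular velocity; in interacting protein-nucleic acid systems, deviations of the apparent molar mass from single-species behavior carry the thermodynamic information.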

  13. Analytical continuation in physical geodesy constructed by means of tools and formulas related to an ellipsoid of revolution

    Science.gov (United States)

    Holota, Petr; Nesvadba, Otakar

    2014-05-01

    In physical geodesy, mathematical tools applied for solving problems of potential theory are often essentially associated with the concept of the so-called spherical approximation (interpreted as a mapping). The same holds true for the method of analytical (harmonic) continuation, which is frequently considered a means of converting ground gravity anomalies or disturbances to corresponding values on a level surface close to the original boundary. In the development and implementation of this technique, the key role is played by the representation of a harmonic function by means of the famous Poisson formula and by the construction of a radial derivative operator on the basis of this formula. In this contribution an attempt is made to avoid the spherical approximation mentioned above and to develop mathematical tools that allow implementation of the concept of analytical continuation also in a more general case, in particular for converting ground gravity anomalies or disturbances to corresponding values on the surface of an oblate ellipsoid of revolution. The respective integral kernels are constructed with the aid of series of ellipsoidal harmonics and their summation, and the mathematical nature of the boundary data is discussed in more detail.
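
    The spherical-approximation starting point mentioned above is Poisson's integral for a function harmonic outside a sphere of radius R (a classical formula, quoted here for orientation):

        V(r,\Omega) = \frac{R\,(r^2 - R^2)}{4\pi} \int_{\sigma} \frac{V(R,\Omega')}{\ell^3}\, d\sigma', \qquad
        \ell = \sqrt{r^2 + R^2 - 2rR\cos\psi},

    where \psi is the angular distance between \Omega and \Omega'; the radial derivative operator used in analytical continuation is built by differentiating this kernel. The work described here replaces the spherical kernel by a counterpart assembled from series of ellipsoidal harmonics.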

  14. Recent developments in analytical toxicology : for better or for worse

    NARCIS (Netherlands)

    de Zeeuw, RA

    1998-01-01

    When considering the state of the art in toxicology from an analytical perspective, the key developments relate to three major areas. (1) Forensic horizon: Today forensic analysis has broadened its scope dramatically, to include workplace toxicology, drug abuse testing, drugs and driving, doping, en

  15. H1640 caster tool development report

    Energy Technology Data Exchange (ETDEWEB)

    Brown, L.A.

    1997-12-01

    This report describes the development and certification of the H1640 caster tool. This tool is used to rotate swivel caster wheels 90 degrees on bomb hand trucks or shipping containers. The B83 is a heavy bomb system and weighs close to 5,600 pounds for a two-high stack configuration. High castering moments (handle length times the force exerted on handle) are required to caster a wheel for a two-high stack of B83s. The H1640 is available to the DoD (Air Force) through the Special Equipment List (SEL) for the B83 as a replacement for the H631 and H1216 caster tools.

  16. Developing Adaptive Elearning: An Authoring Tool Design

    Directory of Open Access Journals (Sweden)

    Said Talhi

    2011-09-01

    Full Text Available Adaptive hypermedia is the answer to the lost-in-hyperspace syndrome, in which the user typically has too many links to choose from and little knowledge about how to proceed and select the most appropriate ones. Adaptive hypermedia thus offers a selection of the links or content most appropriate to the user. Until very recently, little attention was given to the complex task of authoring materials for Adaptive Educational Hypermedia. An author faces a multitude of problems when creating a personalized, rich learning experience for each user. The purpose of this paper is to present an authoring tool for adaptive hypermedia based courses. Designed to satisfy the accessibility guidelines of the W3C recommendation for authors and learners with disabilities, the authoring tool allows several geographically dispersed authors to produce such courses together. It consists of a shared workspace gathering all tools necessary for the cooperative development task.

  17. The role of big data and advanced analytics in drug discovery, development, and commercialization.

    Science.gov (United States)

    Szlezák, N; Evers, M; Wang, J; Pérez, L

    2014-05-01

    In recent years, few ideas have captured the imagination of health-care practitioners as much as the advent of "big data" and the advanced analytical methods and technologies used to interpret it, a trend seen as having the potential to revolutionize biology, medicine, and health care (1,2,3). As new types of data and tools become available, a unique opportunity is emerging for smarter and more effective discovery, development, and commercialization of innovative biopharmaceutical drugs.

  18. Xamarin as a tool for mobile development

    OpenAIRE

    Gridin, Oleksandr

    2015-01-01

    Xamarin as a tool for mobile development was chosen as the topic for this thesis because of its fast pace of growth. The technology was founded in May 2011 and now counts more than 1.25 million developers who have already proven its worth. With the help of this technology, the mobile development process can move to a new qualitative level where crappy software won't exist anymore. The main goals for this project were to show Xamarin's power by creating a cross-platform mobile application and to p...

  19. An analytical tool-box for comprehensive biochemical, structural and transcriptome evaluation of oral biofilms mediated by mutans streptococci.

    Science.gov (United States)

    Klein, Marlise I; Xiao, Jin; Heydorn, Arne; Koo, Hyun

    2011-01-25

    Biofilms are highly dynamic, organized and structured communities of microbial cells enmeshed in an extracellular matrix of variable density and composition (1, 2). In general, biofilms develop from initial microbial attachment on a surface, followed by formation of cell clusters (or microcolonies) and further development and stabilization of the microcolonies, which occur in a complex extracellular matrix. The majority of biofilm matrices harbor exopolysaccharides (EPS), and dental biofilms are no exception, especially those associated with caries disease, which are mostly mediated by mutans streptococci (3). The EPS are synthesized by microorganisms (S. mutans, a key contributor) by means of extracellular enzymes, such as glucosyltransferases, using primarily sucrose as substrate (3). Studies of biofilms formed on tooth surfaces are particularly challenging owing to their constant exposure to environmental challenges associated with complex diet-host-microbial interactions occurring in the oral cavity. Better understanding of the dynamic changes in the structural organization and composition of the matrix, and in the physiology and transcriptome/proteome profile of biofilm cells, in response to these complex interactions would further advance current knowledge of how oral biofilms modulate pathogenicity. Therefore, we have developed an analytical tool-box to facilitate biofilm analysis at structural, biochemical and molecular levels by combining commonly available and novel techniques with custom-made software for data analysis. Standard analytical (colorimetric assays, RT-qPCR and microarrays) and novel fluorescence techniques (for simultaneous labeling of bacteria and EPS) were integrated with specific software for data analysis to address the complex nature of oral biofilm research. The tool-box is comprised of 4 distinct but interconnected steps (Figure 1): 1) Bioassays, 2) Raw Data Input, 3) Data Processing, and 4) Data Analysis. We used our in vitro biofilm model and

  20. Analytical tools for investigating strong-field QED processes in tightly focused laser fields

    CERN Document Server

    Di Piazza, A

    2015-01-01

    The present paper is the natural continuation of the letter [Phys. Rev. Lett. \\textbf{113}, 040402 (2014)], where the electron wave functions in the presence of a background electromagnetic field of general space-time structure have been constructed analytically, assuming that the initial energy of the electron is the largest dynamical energy scale in the problem and having in mind the case of a background tightly focused laser beam. Here, we determine the scalar and the spinor propagators under the same approximations, which are useful tools for calculating, e.g., total probabilities of processes occurring in such complex electromagnetic fields. In addition, we also present a simpler and more general expression of the electron wave functions found in [Phys. Rev. Lett. \\textbf{113}, 040402 (2014)] and we indicate a substitution rule to obtain them starting from the well-known Volkov wave functions in a plane-wave field.

  1. Continuous wave free precession: practical analytical tool for low-resolution nuclear magnetic resonance measurements

    International Nuclear Information System (INIS)

    The use of continuous wave free precession (CWFP) as a practical analytical tool for quantitative determinations in low-resolution nuclear magnetic resonance (LRNMR) is examined. The requirements of this technique are shown to be no more demanding than those prevailing in free-induction decay or spin-echo measurements. It is shown that the substantial gain in signal-to-noise ratio for a given acquisition time permitted by CWFP can be exploited to advantage in practically any application of LRNMR. This applies not only to homogeneous low-viscosity liquid samples but also to multi-component systems where differences in the relaxation times of each component permit a separation of the individual contributions. As an example, the use of CWFP for fast quantitative determination of oil and moisture in various seeds is presented

  2. Development of Simulation Tool Orienting Production Engineering

    Institute of Scientific and Technical Information of China (English)

    ZHAO Ning; NING Ru-xin; TANG Cheng-tong; LIANG Fu-jun

    2006-01-01

    A simulation tool named BITSIM, oriented to production engineering, has been developed in order to improve enterprise productivity and make up for the scarcity of computer applications in this area. The architecture of BITSIM is presented first; the hierarchical technique, a multi-agent-based control strategy and simulation output analysis are then described in detail. Finally, an application example shows that the system can be used to analyse different hypothetical situations and to configure the auxiliary manufacturing system before production.

  3. Surrogate Analysis and Index Developer (SAID) tool

    Science.gov (United States)

    Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.

    2015-10-01

    The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Tools to process and evaluate the data are critical to advancing the operational use of surrogates along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research and on surrogate monitoring sites currently (2015) in operation.
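
    As a rough illustration of the surrogate-model idea (not SAID's actual algorithm), sediment-surrogate ratings are often fitted as linear regressions in log space; all numbers below are placeholders:

        import numpy as np

        # Paired observations: acoustic backscatter surrogate (dB) and measured SSC (mg/L)
        backscatter = np.array([60.0, 65.0, 70.0, 75.0, 80.0])  # placeholder values
        ssc = np.array([20.0, 45.0, 110.0, 240.0, 560.0])       # placeholder values

        # Fit log10(SSC) = b0 + b1 * backscatter (a common surrogate-rating form)
        b1, b0 = np.polyfit(backscatter, np.log10(ssc), 1)

        def predict_ssc(b):
            # Back-transformed estimate; operational models also apply a bias
            # correction when retransforming from log space
            return 10 ** (b0 + b1 * b)

        print(predict_ssc(72.0))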

  4. Developments and retrospectives in Lie theory geometric and analytic methods

    CERN Document Server

    Penkov, Ivan; Wolf, Joseph

    2014-01-01

    This volume reviews and updates a prominent series of workshops in representation/Lie theory, and reflects the widespread influence of those  workshops in such areas as harmonic analysis, representation theory, differential geometry, algebraic geometry, and mathematical physics.  Many of the contributors have had leading roles in both the classical and modern developments of Lie theory and its applications. This Work, entitled Developments and Retrospectives in Lie Theory, and comprising 26 articles, is organized in two volumes: Algebraic Methods and Geometric and Analytic Methods. This is the Geometric and Analytic Methods volume. The Lie Theory Workshop series, founded by Joe Wolf and Ivan Penkov and joined shortly thereafter by Geoff Mason, has been running for over two decades. Travel to the workshops has usually been supported by the NSF, and local universities have provided hospitality. The workshop talks have been seminal in describing new perspectives in the field covering broad areas of current re...

  5. SE Requirements Development Tool User Guide

    Energy Technology Data Exchange (ETDEWEB)

    Benson, Faith Ann [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    The LANL Systems Engineering Requirements Development Tool (SERDT) is a data collection tool created in InfoPath for use with the Los Alamos National Laboratory's (LANL) SharePoint sites. Projects can fail if the final product requirements are not clearly defined. For projects to be successful, requirements must be defined early in the project and tracked during execution to ensure the goals of the project are met. Therefore, the focus of this tool is requirements definition. The content of this form is based on International Council on Systems Engineering (INCOSE) and Department of Defense (DoD) process standards and allows for single or collaborative input. The "Scoping" section is where project information is entered by the project team prior to requirements development, and includes definitions and examples to assist the user in completing the forms. The data entered will be used to define the requirements, and once the form is filled out a "Requirements List" is automatically generated and a Word document is created and saved to a SharePoint document library. SharePoint also includes the ability to download the requirements data defined in the InfoPath form into an Excel spreadsheet. This User Guide will assist you in navigating through the data entry process.

  6. Constructing Subjects, Producing Subjectivities: Developing Analytic Needs in Discursive Psychology

    OpenAIRE

    McAvoy, Jean

    2007-01-01

    The publication of Potter and Wetherell's (1987) blueprint for a discursive social psychology was a pivotal moment in the discursive turn in psychology. That transformational text went on to underpin much contemporary discursive psychology, paving the way for what has become an enriching range of analytic approaches, and epistemological and ontological arguments (Wetherell, Taylor and Yates, 2001a; 2001b). Twenty years on, and as discursive psychology continues to develop, t...

  7. Recent developments in detection methods for microfabricated analytical devices.

    Science.gov (United States)

    Schwarz, M A; Hauser, P C

    2001-09-01

    Sensitive detection in microfluidic analytical devices is a challenge because of the extremely small detection volumes available. Considerable efforts have been made lately to further address this aspect and to investigate techniques other than fluorescence. Among the newly introduced techniques are the optical methods of chemiluminescence, refraction and thermooptics, as well as the electrochemical methods of amperometry, conductimetry and potentiometry. Developments are also in progress to create miniaturized plasma-emission spectrometers and sensitive detectors for gas-chromatographic separations.

  8. The use of meta-analytical tools in risk assessment for food safety.

    Science.gov (United States)

    Gonzales-Barron, Ursula; Butler, Francis

    2011-06-01

    This communication deals with the use of meta-analysis as a valuable tool for the synthesis of food safety research and in quantitative risk assessment modelling. A common methodology for conducting meta-analysis (i.e., systematic review and data extraction, parameterisation of effect size, estimation of overall effect size, assessment of heterogeneity, and presentation of results) is explained by reviewing two meta-analyses derived from separate sets of primary studies of Salmonella in pork. Integrating different primary studies, the first meta-analysis elucidated for the first time a relationship between the proportion of Salmonella-carrier slaughter pigs entering the slaughter lines and the resulting proportion of contaminated carcasses at the point of evisceration, a relationship that the individual studies on their own could not reveal. On the other hand, the second application showed that meta-analysis can be used to estimate the overall effect of a critical process stage (chilling) on the incidence of the pathogen under study. The derived relationship between variables and the probabilistic distribution are illustrations of the valuable quantitative information synthesised by meta-analytical tools, which can be incorporated in risk assessment modelling. Strengths and weaknesses of meta-analysis within the context of food safety are also discussed.
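
    As a compact illustration of the pooling step described above (generic inverse-variance meta-analysis with a DerSimonian-Laird random-effects variance, not the authors' exact computation; effect sizes and variances are placeholders):

        import numpy as np

        # Effect sizes and within-study variances from k primary studies
        y = np.array([0.42, 0.31, 0.55, 0.28])  # placeholder effect sizes
        v = np.array([0.02, 0.05, 0.03, 0.04])  # placeholder variances

        w = 1.0 / v                              # fixed-effect weights
        pooled_fe = np.sum(w * y) / np.sum(w)

        # DerSimonian-Laird between-study variance (heterogeneity)
        q = np.sum(w * (y - pooled_fe) ** 2)
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (len(y) - 1)) / c)

        w_re = 1.0 / (v + tau2)                  # random-effects weights
        pooled_re = np.sum(w_re * y) / np.sum(w_re)
        se_re = np.sqrt(1.0 / np.sum(w_re))
        print(pooled_re, se_re)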

  9. Process-Based Quality (PBQ) Tools Development

    Energy Technology Data Exchange (ETDEWEB)

    Cummins, J.L.

    2001-12-03

    The objective of this effort is to benchmark the development of process-based quality tools for CAD (computer-aided design) model-based applications. The processes of interest are design, manufacturing, and quality process applications. A study was commissioned addressing the impact, current technologies, and known problem areas in the application of 3D MCAD (3-dimensional mechanical computer-aided design) models and model integrity on downstream manufacturing and quality processes. The downstream manufacturing and product quality processes are profoundly influenced by and dependent on model quality and modeling process integrity. The goal is to illustrate and expedite the modeling and downstream model-based technologies for available or conceptual methods and tools to achieve maximum economic advantage and advance process-based quality concepts.

  10. RDF Analytics: Lenses over Semantic Graphs

    OpenAIRE

    Colazzo, Dario; Goasdoué, François; Manolescu, Ioana; Roatis, Alexandra

    2014-01-01

    The development of the Semantic Web (RDF) brings new requirements for data analytics tools and methods, going beyond querying to semantics-rich analytics through warehouse-style tools. In this work, we fully redesign, from the bottom up, core data analytics concepts and tools in the context of RDF data, leading to the first complete formal framework for warehouse-style RDF analytics. Notably, we define i) analytical schemas tailored to heterogeneous, semantics-rich RDF graphs, ii) analytical queri...
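
    A minimal sketch of warehouse-style aggregation over RDF, using the rdflib library (the file name, namespace and properties are hypothetical, and this is a plain SPARQL aggregate rather than the formal framework defined in the paper):

        from rdflib import Graph

        g = Graph()
        g.parse("sales.ttl", format="turtle")  # hypothetical RDF data set

        # SPARQL 1.1 aggregate query: average amount per region
        query = """
            PREFIX ex: <http://example.org/>
            SELECT ?region (AVG(?amount) AS ?avg)
            WHERE { ?sale ex:region ?region ; ex:amount ?amount . }
            GROUP BY ?region
        """
        for row in g.query(query):
            print(row.region, row.avg)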

  11. Optimization technique as a tool for implementing analytical quality by Design

    Directory of Open Access Journals (Sweden)

    C. MOHAN REDDY

    2013-09-01

    Full Text Available A process is well understood when all critical sources of variability are identified and explained, variability is managed by the process, and product quality attributes can be accurately and reliably predicted over the design space. Quality by Design (QbD) is a systematic approach to the development of products and processes that begins with predefined objectives and emphasizes product and process understanding and process control based on sound science, statistical methods and quality risk management. In an attempt to curb rising development costs and regulatory barriers to innovation and creativity, the FDA and ICH have recently started promoting QbD in the pharmaceutical industry. QbD is partially based on the application of a statistical Design of Experiments strategy to the development of both analytical methods and pharmaceutical formulations. The present work describes the development of a robust HPLC method for the analysis of an Eplerenone formulation under the QbD approach using Design of Experiments.
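
    As a schematic of the Design of Experiments step (the factors and levels below are invented for illustration, not those of the Eplerenone method):

        from itertools import product

        # Three HPLC method factors at the levels to be screened
        factors = {
            "pH": [2.5, 3.0, 3.5],
            "flow_mL_min": [0.8, 1.0, 1.2],
            "organic_pct": [30, 40, 50],
        }

        # Full factorial design: every combination of factor levels
        names = list(factors)
        runs = [dict(zip(names, levels)) for levels in product(*factors.values())]
        print(len(runs), "runs")  # 27 experiments

        # Each run would be executed and its responses (e.g., resolution, tailing)
        # fitted to a response-surface model to map the design space.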

  12. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: An anticipated analytical tool for food safety

    Energy Technology Data Exchange (ETDEWEB)

    Hervas, Miriam; Lopez, Miguel Angel [Departamento Quimica Analitica, Universidad de Alcala, Ctra. Madrid-Barcelona, Km. 33600, E-28871 Alcala de Henares, Madrid (Spain); Escarpa, Alberto, E-mail: alberto.escarpa@uah.es [Departamento Quimica Analitica, Universidad de Alcala, Ctra. Madrid-Barcelona, Km. 33600, E-28871 Alcala de Henares, Madrid (Spain)

    2009-10-27

    In this work, an electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme is based on a direct competitive immunoassay method in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as the enzymatic label. Amperometric detection was achieved through the addition of hydrogen peroxide substrate and hydroquinone as mediator. The analytical performance of the electrochemical immunoassay was evaluated by analysis of a maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 µg L⁻¹ and an EC50 of 0.079 µg L⁻¹ were obtained, allowing sensitive detection of the mycotoxin zearalenone. In addition, excellent accuracy was obtained, with recovery yields ranging between 95 and 108%. These analytical features show the proposed electrochemical immunoassay to be a very powerful and timely screening tool for the food safety scene.
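
    For orientation, competitive immunoassay calibration curves of this kind are routinely fitted with a four-parameter logistic (4PL) model; the sketch below uses invented data points, not the reported ones:

        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(x, top, bottom, ec50, slope):
            # Signal decreases with analyte concentration in a competitive format
            return bottom + (top - bottom) / (1.0 + (x / ec50) ** slope)

        conc = np.array([0.005, 0.02, 0.08, 0.3, 1.2])     # placeholder, ug/L
        signal = np.array([0.95, 0.80, 0.45, 0.18, 0.06])  # placeholder, normalized

        popt, _ = curve_fit(four_pl, conc, signal, p0=[1.0, 0.0, 0.08, 1.0])
        print("EC50 estimate:", popt[2], "ug/L")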

  13. 100-B/C Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    R.W. Ovink

    2010-03-18

    This report documents the process used to identify source area target analytes in support of the 100-B/C remedial investigation/feasibility study addendum to DOE/RL-2008-46. This report also establishes the analyte exclusion criteria applicable for 100-B/C use and the analytical methods needed to analyze the target analytes.

  14. Modelling the level of adoption of analytical tools; An implementation of multi-criteria evidential reasoning

    Directory of Open Access Journals (Sweden)

    Igor Barahona

    2014-08-01

    Full Text Available In the future, competitive advantages will be given to organisations that can extract valuable information from massive data and make better decisions. In most cases, this data comes from multiple sources. Therefore, the challenge is to aggregate them into a common framework in order to make them meaningful and useful. This paper first reviews the most important multi-criteria decision analysis (MCDA) methods in the current literature. We then offer a novel, practical and consistent methodology, based on a type of MCDA, to aggregate data from two different sources into a common framework. Two datasets that are different in nature but related to the same topic are aggregated to a common scale by implementing a set of transformation rules. This allows us to generate appropriate evidence for assessing and finally prioritising the level of adoption of analytical tools in four types of companies. A numerical example is provided to clarify the form for implementing this methodology. A six-step process is offered as a guideline to assist engineers, researchers or practitioners interested in replicating this methodology in any situation where there is a need to aggregate and transform multiple source data.
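
    A bare-bones sketch of the aggregation idea, using min-max rescaling of two heterogeneous data sources to a common 0-1 scale followed by a simple weighted sum (a deliberately simpler stand-in for the paper's evidential-reasoning aggregation; all numbers and weights are illustrative):

        import numpy as np

        def to_common_scale(x):
            # Min-max transformation onto [0, 1]
            x = np.asarray(x, dtype=float)
            return (x - x.min()) / (x.max() - x.min())

        # Two sources measuring analytics adoption for four company types
        survey_scores = to_common_scale([3.1, 4.2, 2.5, 3.8])  # e.g., a 1-5 survey
        usage_counts = to_common_scale([120, 340, 60, 210])    # e.g., tool usage logs

        weights = np.array([0.6, 0.4])  # judgment-based source weights
        combined = weights[0] * survey_scores + weights[1] * usage_counts
        ranking = np.argsort(combined)[::-1]
        print(ranking)  # company types ordered by assessed adoption level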

  15. Raman-in-SEM, a multimodal and multiscale analytical tool: performance for materials and expertise.

    Science.gov (United States)

    Wille, Guillaume; Bourrat, Xavier; Maubec, Nicolas; Lahfid, Abdeltif

    2014-12-01

    The availability of Raman spectroscopy in a powerful analytical scanning electron microscope (SEM) allows morphological, elemental, chemical, physical and electronic analysis without moving the sample between instruments. This paper documents the metrological performance of the SEMSCA commercial Raman interface operated in a low vacuum SEM. It provides multiscale and multimodal analyses such as Raman/EDS, Raman/cathodoluminescence or Raman/STEM (STEM: scanning transmission electron microscopy), as well as Raman spectroscopy on nanomaterials. Since Raman spectroscopy in a SEM can be influenced by several SEM-related phenomena, this paper first presents a comparison of this new tool with a conventional micro-Raman spectrometer. Then, some possible artefacts are documented, which are due to the impact of electron beam-induced contamination or the cathodoluminescence contribution to the Raman spectra, especially with geological samples. These effects are easily overcome by changing or adapting the Raman spectrometer and SEM settings and methodology. The adverse effect of cathodoluminescence is eliminated by using a SEM beam shutter during Raman acquisition. In contrast, this interface provides the ability to record the cathodoluminescence (CL) spectrum of a phase. In a second part, this study highlights the value and efficiency of the coupling in characterizing micrometric phases at the same point. This multimodal approach is illustrated with various issues encountered in geosciences.

  16. Sugar Maple Pigments Through the Fall and the Role of Anthocyanin as an Analytical Tool

    Science.gov (United States)

    Lindgren, E.; Rock, B.; Middleton, E.; Aber, J.

    2008-12-01

    Sugar maple habitat is projected to almost disappear in future climate scenarios. In fact, many institutions state that these trees are already in decline. Being able to detect sugar maple health could prove to be a useful analytical means of monitoring changes in phenology. Anthocyanin, a red pigment found in sugar maples, is thought to be a universal indicator of plant stress. It is very prominent in the spring during the first flush of leaves, as well as in the fall as leaves senesce. Determining an anthocyanin index that could be used with satellite systems would provide a greater understanding of tree phenology and the distribution of plant stress, both over large areas and over time. The utilization of anthocyanin for one of its functions, prevention of oxidative stress, may fluctuate in response to the changing climatic conditions that occur during senescence, or vary from year to year. By monitoring changes in pigment levels and antioxidant capacity through the fall, one may be able to draw conclusions about the ability to detect anthocyanin remotely from space-based systems, and possibly determine a more specific function for anthocyanin during fall senescence. These results could then be applied to track changes in tree stress.
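
    One published index of the kind envisioned here is the anthocyanin reflectance index of Gitelson and co-workers (cited from the general remote-sensing literature, not from this abstract), built from reflectance R at two narrow bands:

        \mathrm{ARI} = \frac{1}{R_{550}} - \frac{1}{R_{700}}

    Whether such a narrow-band ratio survives the move to broadband satellite sensors is exactly the kind of question the monitoring described above can address.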

  17. Developing a Support Tool for Global Product Development Decisions

    DEFF Research Database (Denmark)

    Søndergaard, Erik Stefan; Ahmed-Kristensen, Saeema

    2016-01-01

    The paper presents results from 51 decisions made in the three companies, and based on the results of the studies a framework for a decision-support tool is outlined and discussed. The paper rounds off with an identification of future research opportunities in the area of global product development and decision-making.

  18. Development and first application of an operating events ranking tool

    Energy Technology Data Exchange (ETDEWEB)

    Šimić, Zdenko [European Commission Joint Research Centre – Institute for Energy and Transport, Postbus 2, 1755ZG Petten (Netherlands); University of Zagreb, Faculty of Electrical Engineering and Computing, Zagreb (Croatia); Zerger, Benoit, E-mail: benoit.zerger@ec.europa.eu [European Commission Joint Research Centre – Institute for Energy and Transport, Postbus 2, 1755ZG Petten (Netherlands); Banov, Reni [University of Zagreb, Faculty of Electrical Engineering and Computing, Zagreb (Croatia)

    2015-02-15

    Highlights: • A method using the analytic hierarchy process for ranking operating events is developed and tested. • The method is applied to 5 years of U.S. NRC Licensee Event Reports (1453 events). • Uncertainty and sensitivity of the ranking results are evaluated. • Assessment of real events shows the potential of the method for operating experience feedback. - Abstract: Operating experience feedback is important for maintaining and improving safety and availability in nuclear power plants. Detailed investigation of all events is challenging, since it requires excessive resources, especially in the case of large event databases. This paper presents an event-group ranking method to complement the analysis of individual operating events. The basis for the method is the use of an internationally accepted event characterization scheme that allows different ways of grouping and ranking events. The ranking method itself consists of implementing the analytic hierarchy process (AHP) by means of a custom-developed tool which allows event ranking based on ranking indexes pre-determined by expert judgment. Following the development phase, the tool was applied to analyze a complete set of 5 years of real nuclear power plant operating events (1453 events). The paper presents the potential of this ranking method to identify possible patterns throughout the event database and therefore to give additional insights into the events, as well as to give quantitative input for prioritizing further, more detailed investigation of selected event groups.
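
    To make the AHP step concrete, here is a minimal sketch of ranking from a pairwise-comparison matrix (the judgments below are invented; the actual tool's ranking indexes come from expert elicitation over the event characterization scheme):

        import numpy as np

        # Pairwise comparison of three event groups on one criterion (Saaty 1-9 scale)
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()              # priority vector (ranking weights)

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)  # consistency index
        cr = ci / 0.58                        # 0.58 = Saaty random index for n = 3
        print(weights, "CR:", cr)             # CR < 0.1 is conventionally acceptable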

  19. Multiple Forces Driving China's Economic Development: A New Analytic Framework

    Institute of Scientific and Technical Information of China (English)

    Yahua Wang; Angang Hu

    2007-01-01

    Based on economic growth theory and the World Bank's analytical framework relating to the quality of growth, the present paper constructs a framework that encompasses physical, international, human, natural and knowledge capital to synthetically interpret economic development. After defining the five types of capital and total capital, we analyze the dynamic changes of these types of capital in China and in other countries. The results show that since China's reform and opening up, knowledge, international, human and physical capital have grown rapidly, at rates higher than that of economic growth itself. As the five types of capital have all increased at varying paces, the savings level of total capital in China has quadrupled in 25 years, overtaking that of the USA in the 1990s. The changes in the five types of capital and in total capital reveal that multiple driving forces lie behind China's rapid economic development. Implications for China's long-term economic development are thereby drawn.

  20. Effect of Micro Electrical Discharge Machining Process Conditions on Tool Wear Characteristics: Results of an Analytic Study

    DEFF Research Database (Denmark)

    Puthumana, Govindan; P., Rajeev

    2016-01-01

    Micro electrical discharge machining is one of the established techniques for manufacturing high aspect ratio features on electrically conductive materials. This paper presents the results and inferences of an analytical study estimating the effect of process conditions on tool electrode wear

  1. Basic Conceptual Systems (BCSs)--Tools for Analytic Coding, Thinking and Learning: A Concept Teaching Curriculum in Norway

    Science.gov (United States)

    Hansen, Andreas

    2009-01-01

    The role of basic conceptual systems (for example, colour, shape, size, position, direction, number, pattern, etc.) as psychological tools for analytic coding, thinking and learning is emphasised, and a proposal for a teaching order of BCSs in kindergarten and primary school is introduced. The first part of this article briefly explains the main aspects…

  2. Development of the SOFIA Image Processing Tool

    Science.gov (United States)

    Adams, Alexander N.

    2011-01-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) is a Boeing 747SP carrying a 2.5 meter infrared telescope capable of operating at altitudes between twelve and fourteen kilometers, above more than 99 percent of the water vapor in the atmosphere. The ability to make observations above most water vapor, coupled with the ability to make observations from anywhere at any time, makes SOFIA one of the world's premier infrared observatories. SOFIA uses three visible light CCD imagers to assist in pointing the telescope. The data from these imagers are stored in archive files, as is housekeeping data, which contains information such as boresight and area-of-interest locations. A tool that could both extract and process data from the archive files was developed.

  3. Life cycle assessment as an analytical tool in strategic environmental assessment. Lessons learned from a case study on municipal energy planning in Sweden

    International Nuclear Information System (INIS)

    Life cycle assessment (LCA) is explored as an analytical tool in strategic environmental assessment (SEA), illustrated by a case where a previously developed SEA process was applied to municipal energy planning in Sweden. The process integrated decision-making tools for scenario planning, public participation and environmental assessment. This article describes the use of LCA for environmental assessment in this context, with a focus on methodology and practical experiences. While LCA provides a systematic framework for the environmental assessment and a wider systems perspective than what is required in SEA, LCA cannot address all aspects of environmental impact required, and therefore needs to be complemented by other tools. The integration of LCA with tools for public participation and scenario planning posed certain methodological challenges, but provided an innovative approach to designing the scope of the environmental assessment and defining and assessing alternatives. - Research highlights: ► LCA was explored as an analytical tool in an SEA process for municipal energy planning. ► The process also integrated LCA with scenario planning and public participation. ► Benefits of using LCA were a systematic framework and a wider systems perspective. ► Integration of the tools required some methodological challenges to be solved. ► This proved an innovative approach to defining alternatives and the scope of assessment.

  4. Solar Array Verification Analysis Tool (SAVANT) Developed

    Science.gov (United States)

    Bailey, Sheila G.; Long, KIenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment WorkBench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.
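
    The flux-integration step described above reduces to simple numerics. The sketch below is illustrative only, not SAVANT code: an assumed trapped-proton flux spectrum is integrated over a hypothetical five-year mission to give fluence, and a crude exponential factor stands in for coverglass shielding (the real tool uses a continuous slowing down approximation); all values and names are invented.

      import numpy as np

      # Assumed omnidirectional trapped-proton flux spectrum (protons/cm^2/s per bin)
      energies_mev = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0])
      flux = np.array([1e8, 5e7, 2e7, 5e6, 1e6, 1e5])

      mission_seconds = 5 * 365.25 * 24 * 3600     # hypothetical 5-year mission
      fluence = flux * mission_seconds             # time-integrated fluence per bin

      def attenuate(fluence, energies_mev, coverglass_mils=6.0, k=0.5):
          """Crude stand-in for coverglass shielding: low-energy particles are
          attenuated far more strongly (k is an arbitrary tuning constant)."""
          return fluence * np.exp(-k * coverglass_mils / energies_mev)

      shielded = attenuate(fluence, energies_mev)
      for e, f in zip(energies_mev, shielded):
          print(f"{e:5.1f} MeV: {f:.3e} protons/cm^2")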

  5. Preparation of standard hair material and development of analytical methodology

    International Nuclear Information System (INIS)

    In 1976, Indian researchers suggested the possible use of hair as an indicator of environmental exposure and established, through a study of a countrywide student population and the general population of the metropolitan city of Bombay, that human scalp hair could indeed be an effective first-level monitor in a scheme of multilevel monitoring of environmental exposure to inorganic pollutants. It was in this context, and in view of the ready availability of large quantities of scalp hair subjected to minimum treatment by chemicals, that they proposed to participate in the preparation of a standard material of hair. It was also recognized that measurements of trace element concentrations at very low levels require cross-validation by different analytical techniques, even within the same laboratory. The programme of work carried out since the first meeting of the CRP has been aimed at two objectives: the preparation of a standard material of hair and the development of analytical methodologies for determination of elements and species of interest. 1 refs., 3 tabs

  6. Applicability of bioanalysis of multiple analytes in drug discovery and development: review of select case studies including assay development considerations.

    Science.gov (United States)

    Srinivas, Nuggehally R

    2006-05-01

    The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development, culminating in a marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches, including the various techniques, detection systems and automation tools that are available for effective separation, enhanced selectivity and sensitive quantitation of many analytes. The intention of this review is to cover various key areas where analytical method development becomes necessary during different stages of the drug discovery research and development process. The key areas covered in this article with relevant case studies include: (a) simultaneous assay for parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for determination of multiple drugs in combating a disease; (c) analytical measurement of chirality aspects in pharmacokinetics, metabolism and biotransformation investigations; (d) drug monitoring for therapeutic benefits and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism and permeability related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations-simultaneous determination of prodrug, parent and metabolites; (j) quantitative determination of parent compound and/or phase II metabolite(s) via direct or indirect approaches; (k) applicability in analysis of multiple compounds in select

  7. A Survey on Big Data Analytics: Challenges, Open Research Issues and Tools

    Directory of Open Access Journals (Sweden)

    D. P. Acharjya

    2016-02-01

    Full Text Available A huge repository of terabytes of data is generated each day from modern information systems and digital technologies such as the Internet of Things and cloud computing. Analysis of these massive data requires considerable effort at multiple levels to extract knowledge for decision making. Therefore, big data analysis is a current area of research and development. The basic objective of this paper is to explore the potential impact of big data challenges, open research issues, and the various tools associated with it. As a result, this article provides a platform to explore big data at numerous stages. Additionally, it opens a new horizon for researchers to develop solutions based on the challenges and open research issues.

  8. Developing a Parametric Urban Design Tool

    DEFF Research Database (Denmark)

    Steinø, Nicolai; Obeling, Esben

    2014-01-01

    Parametric urban design is a potentially powerful tool for collaborative urban design processes. Rather than making one-off designs which need to be redesigned from the ground up in case of changes, parametric design tools make it possible to keep the design open while at the same time allowing...

  9. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    Science.gov (United States)

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  10. Implementing analytics a blueprint for design, development, and adoption

    CERN Document Server

    Sheikh, Nauman

    2013-01-01

    Implementing Analytics demystifies the concept, technology and application of analytics and breaks its implementation down into repeatable and manageable steps, making widespread adoption possible across all functions of an organization. Implementing Analytics simplifies and helps democratize a very specialized discipline to foster business efficiency and innovation without investing in multi-million dollar technology and manpower. A technology-agnostic methodology that breaks down complex tasks like model design and tuning and emphasizes business decisions

  11. Using fuzzy analytical hierarchy process (AHP to evaluate web development platform

    Directory of Open Access Journals (Sweden)

    Ahmad Sarfaraz

    2012-01-01

    Full Text Available Web development plays an important role in business plans and people's lives. One of the key decisions on which both the short-term and long-term success of a project depends is choosing the right development platform. Its criticality can be judged by the fact that once a platform is chosen, one has to live with it throughout the software development life cycle. The entire shape of the project depends on the language, operating system, tools, frameworks and so on; in short, on the web development platform chosen. In addition, choosing the right platform is a multi-criteria decision-making (MCDM) problem. We propose a fuzzy analytical hierarchy process model to solve the MCDM problem, tapping the real-life modeling potential of fuzzy logic and combining it with the widely used AHP modeling method.
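
    A useful reference point for this record is the crisp AHP priority computation that fuzzy AHP extends: derive weights from a pairwise comparison matrix via its principal eigenvector and check judgment consistency. The sketch below is a minimal illustration with invented criteria and comparison values, not the paper's fuzzy model.

      import numpy as np

      # Saaty-style pairwise comparisons for three hypothetical criteria:
      # [language/ecosystem, tooling, cost]; A[i, j] = importance of i over j.
      A = np.array([[1.0, 3.0, 5.0],
                    [1.0 / 3.0, 1.0, 2.0],
                    [1.0 / 5.0, 1.0 / 2.0, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)                  # principal eigenvalue
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()                     # normalized priority vector

      n = A.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
      cr = ci / 0.58                               # Saaty random index for n = 3
      print("priorities:", weights.round(3), " consistency ratio:", round(cr, 3))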

  12. Application of quality by design to the development of analytical separation methods.

    Science.gov (United States)

    Orlandini, Serena; Pinzauti, Sergio; Furlanetto, Sandra

    2013-01-01

    Recent pharmaceutical regulatory documents have stressed the critical importance of applying quality by design (QbD) principles for in-depth process understanding to ensure that product quality is built in by design. This article outlines the application of QbD concepts to the development of analytical separation methods, for example chromatography and capillary electrophoresis. QbD tools, for example risk assessment and design of experiments, allow enhanced quality to be integrated into the analytical method, enabling earlier understanding and identification of the variables affecting method performance. A QbD guide is described, from identification of the quality target product profile to definition of the control strategy, emphasizing the main differences from the traditional quality by testing (QbT) approach. The different ways several authors have treated single QbD steps of method development are reviewed and compared. In a final section on outlook, attention is focused on general issues which have arisen from the surveyed literature, and on the need to change the researcher's mindset from the QbT to the QbD approach as an important analytical trend for the near future.

  13. Big data analytics from strategic planning to enterprise integration with tools, techniques, NoSQL, and graph

    CERN Document Server

    Loshin, David

    2013-01-01

    Big Data Analytics will assist managers in providing an overview of the drivers for introducing big data technology into the organization and for understanding the types of business problems best suited to big data analytics solutions, understanding the value drivers and benefits, strategic planning, developing a pilot, and eventually planning to integrate back into production within the enterprise. Guides the reader in assessing the opportunities and value proposition. Overview of big data hardware and software architectures. Presents a variety of te

  14. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation.

    Science.gov (United States)

    Bergeron, Dominic; Tremblay, A-M S

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ² with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software. PMID:27627408
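
    The α-selection diagnostic described in point (2) can be visualized with a toy computation: minimize Q(A) = χ²(A)/2 − αS(A) for several values of α and watch how χ² responds. The sketch below uses a synthetic kernel, data set, default model and a generic optimizer; it illustrates the objective only and is not the authors' optimized implementation.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(0)
      w = np.linspace(0.1, 5, 40)                        # real frequency grid
      tau = np.linspace(0, 5, 30)                        # imaginary-time grid
      K = np.exp(-np.outer(tau, w))                      # toy kernel K(tau, omega)
      A_true = np.exp(-(w - 2.0) ** 2)                   # "true" spectrum
      sigma = 1e-2                                       # assumed noise level
      G = K @ A_true + sigma * rng.standard_normal(len(tau))  # synthetic data
      m = np.full_like(w, A_true.mean())                 # flat default model

      def Q(logA, alpha):
          """Maximum-entropy objective chi^2/2 - alpha*S, with A > 0 enforced
          by optimizing log A."""
          A = np.exp(logA)
          chi2 = np.sum(((G - K @ A) / sigma) ** 2)
          S = np.sum(A - m - A * np.log(A / m))          # Shannon-Jaynes entropy
          return 0.5 * chi2 - alpha * S

      for alpha in (1e-2, 1e0, 1e2):                     # scan the entropy weight
          res = minimize(Q, np.log(m), args=(alpha,), method="L-BFGS-B")
          chi2 = np.sum(((G - K @ np.exp(res.x)) / sigma) ** 2)
          print(f"alpha={alpha:g}  chi2={chi2:.1f}")     # chi2 rises as alpha grows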

  15. Interactive network analytical tool for instantaneous bespoke interrogation of food safety notifications.

    Directory of Open Access Journals (Sweden)

    Tamás Nepusz

    Full Text Available BACKGROUND: The globalization of food supply necessitates continued advances in regulatory control measures to ensure that citizens enjoy safe and adequate nutrition. The aim of this study was to extend previous reports on network analysis relating to food notifications by including an optional filter by type of notification and, in cases of contamination, by type of contaminant in the notified foodstuff. METHODOLOGY/PRINCIPAL FINDINGS: A filter function has been applied to enable processing of selected notifications by contaminant or type of notification to (i) capture complexity, (ii) analyze trends, and (iii) identify patterns of reporting activities between countries. The program rapidly assesses nations' roles as transgressor and/or detector for each category of contaminant and for the key class of border rejection. In the open access demonstration version, the majority of notifications in the Rapid Alert System for Food and Feed were categorized by contaminant type as mycotoxin (50.4%), heavy metals (10.9%) or bacteria (20.3%). Examples are given demonstrating how network analytical approaches complement, and in some cases supersede, descriptive statistics such as frequency counts, which may give limited or potentially misleading information. One key feature is that network analysis takes the relationship between transgressor and detector countries, along with number of reports and impact, simultaneously into consideration. Furthermore, the indices that complement the network maps and reflect each country's transgressor and detector activities allow comparisons to be made between (transgressing vs. detecting) as well as within (e.g. transgressing) activities. CONCLUSIONS/SIGNIFICANCE: This further development of the network analysis approach to food safety contributes to a better understanding of the complexity of the effort ensuring food is safe for consumption in the European Union. The unique patterns of the interplay between detectors and

  17. Development of a Framework for Sustainable Outsourcing: Analytic Balanced Scorecard Method (A-BSC

    Directory of Open Access Journals (Sweden)

    Fabio De Felice

    2015-06-01

    Full Text Available Nowadays, many enterprises choose to outsource their non-core business to other enterprises to reduce costs and increase efficiency. Many enterprises choose to outsource their supply chain management (SCM) and leave it to a third-party organization in order to improve their services. The paper proposes an integrated, multicriteria tool useful for monitoring and improving performance in an outsourced supply chain. The Analytic Balanced Scorecard method (A-BSC) is proposed as an effective method for analyzing strategic performance within an outsourced supply chain. The aim of the paper is to present the integration of two methodologies: the Balanced Scorecard, a multiple-perspective framework for performance assessment, and the Analytic Hierarchy Process, a decision-making tool used to prioritize multiple performance perspectives and to generate a unified metric. The framework aims to provide a performance analysis that achieves better sustainability performance of the supply chain. A real case study concerning a typical value chain is presented.

  18. NASTRAN as an analytical research tool for composite mechanics and composite structures

    Science.gov (United States)

    Chamis, C. C.; Sinclair, J. H.; Sullivan, T. L.

    1976-01-01

    Selected examples are described in which NASTRAN is used as an analysis research tool for composite mechanics and for composite structural components. The examples were selected to illustrate the importance of using NASTRAN as an analysis tool in this rapidly advancing field.

  19. Evaluation And Selection Process of Suppliers Through Analytical Framework: An Emprical Evidence of Evaluation Tool

    Directory of Open Access Journals (Sweden)

    Imeri Shpend

    2015-09-01

    Full Text Available The supplier selection process is very important to companies, as selecting the right suppliers that fit a company's strategic needs brings drastic savings. Therefore, this paper seeks to address the key area of supplier evaluation from the supplier review perspective. The purpose was to identify the most important criteria for supplier evaluation and to develop an evaluation tool based on the surveyed criteria. The research was conducted through a structured questionnaire, and the sample focused on small to medium-sized enterprises (SMEs) in Greece. In total, eighty companies participated in the survey, answering the full questionnaire, which asked whether these companies utilize supplier evaluation criteria and, if so, which criteria are applied. The main statistical instrument used in the study is Principal Component Analysis (PCA). The research has shown that the main criteria are: the attitude of the vendor towards the customer, supplier delivery time, product quality and price. Conclusions are made on the suitability and usefulness of supplier evaluation criteria and the way they are applied in enterprises.
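
    As a sketch of the study's statistical step, the snippet below runs a principal component analysis on a synthetic matrix of Likert-scale supplier-evaluation responses; the criterion names and data are illustrative, not the survey's.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(0)
      criteria = ["vendor attitude", "delivery time", "quality", "price", "flexibility"]
      X = rng.integers(1, 6, size=(80, len(criteria))).astype(float)  # 80 firms, 1-5 scale
      X = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize before PCA

      pca = PCA(n_components=2).fit(X)
      print("explained variance ratios:", pca.explained_variance_ratio_.round(2))
      for name, loading in zip(criteria, pca.components_[0]):
          print(f"{name:16s} {loading:+.2f}")      # loadings on the first component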

  20. Balanced Scorecard – Sustainable Development Tool

    OpenAIRE

    Leontina Beţianu; Sorin Briciu

    2011-01-01

    The sustainable management of a business requires the consideration of all the business components, both the economic activity and the aspects related to its impact on the environment and its social implications. The Balanced Scorecard (BSC) is a management tool supporting the successful implementation of corporative strategies. It helps connect operational and non-financial activities that have a significant impact on the economic success of a business. BSC is therefore a promising ...

  1. PLS2 regression as a tool for selection of optimal analytical modality

    DEFF Research Database (Denmark)

    Madsen, Michael; Esbensen, Kim

    … analytical modalities. We here present results from a feasibility study where Fourier Transform Near InfraRed (FT-NIR), Fourier Transform Mid InfraRed (FT-MIR), and Raman laser spectroscopy were applied to the same set of samples obtained from a pilot-scale beer brewing process. Quantitative PLS1 models...

  2. Solvent-free microwave extraction of bioactive compounds provides a tool for green analytical chemistry

    OpenAIRE

    Ying LI; Fabiano-Tixier, Anne-Sylvie; Vian, Maryline; Chemat, Farid

    2013-01-01

    We present an overview of solvent-free microwave extraction techniques for bioactive compounds from natural products. This new technique is based on the concept of green analytical chemistry. It has proved to be an alternative to other techniques, with the advantages of reducing extraction times, energy consumption, solvent use and CO2 emissions.

  3. Computational Tools for Accelerating Carbon Capture Process Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David; Sahinidis, N V; Cozad, A; Lee, A; Kim, H; Morinelly, J; Eslick, J; Yuan, Z

    2013-06-04

    This presentation reports the development of advanced computational tools to accelerate next-generation technology development. These tools are used to develop an optimized process from rigorous models. They include: process models; simulation-based optimization; an optimized process; uncertainty quantification; algebraic surrogate models; and superstructure optimization (to determine configuration).

  4. Developing A SPOT CRM Debriefing Tool

    Science.gov (United States)

    Martin, Lynne; Villeda, Eric; Orasanu, Judith; Connors, Mary M. (Technical Monitor)

    1998-01-01

    In a study of CRM LOFT briefings published in 1997, Dismukes, McDonnell & Jobe reported that briefings were not being utilized as fully as they could be, and that crews may not be getting the full benefit from LOFT that is possible. On the basis of their findings, they suggested a set of general briefing guidelines for the industry. Our work builds on this study to provide a specific debriefing tool which gives a focus for the strategies that Dismukes et al. suggest.

  5. Development of balanced key performance indicators for emergency departments strategic dashboards following analytic hierarchical process.

    Science.gov (United States)

    Safdari, Reza; Ghazisaeedi, Marjan; Mirzaee, Mahboobeh; Farzi, Jebrail; Goodini, Azadeh

    2014-01-01

    Dynamic reporting tools, such as dashboards, should be developed to measure emergency department (ED) performance. However, choosing an effective, balanced set of performance measures and key performance indicators (KPIs) is a main challenge in accomplishing this. The aim of this study was to develop a balanced set of KPIs for use in ED strategic dashboards following an analytic hierarchical process. The study was carried out in 2 phases: constructing ED performance measures based on balanced scorecard perspectives, and incorporating them into an analytic hierarchical process framework to select the final KPIs. The respondents placed most importance on the ED internal processes perspective, especially on measures related to timeliness and accessibility of care in the ED. Some measures from the financial, customer, and learning and growth perspectives were also selected as other top KPIs. Measures of care effectiveness and care safety were ranked as the next priorities. The respondents placed least importance on disease-/condition-specific "time to" measures. The methodology can be presented as a reference model for development of KPIs in various performance-related areas based on a consistent and fair approach. Dashboards that are designed based on such a balanced set of KPIs will help to establish comprehensive performance measurements and fair benchmarks and comparisons. PMID:25350022
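
    As a hedged illustration of how such a balanced-scorecard hierarchy can yield a single ranked KPI list, the snippet below composes assumed perspective weights with assumed local KPI weights into global priorities; all numbers and KPI names are invented, not the study's elicited values.

      # Perspective weights and local KPI weights (all invented for illustration)
      perspective_weights = {"internal process": 0.40, "financial": 0.25,
                             "customer": 0.20, "learning and growth": 0.15}
      local_kpi_weights = {
          "internal process": {"door-to-doctor time": 0.6, "left without being seen": 0.4},
          "financial": {"cost per visit": 1.0},
          "customer": {"patient satisfaction": 1.0},
          "learning and growth": {"staff training hours": 1.0},
      }

      # Global priority of a KPI = perspective weight x local weight within it
      global_priorities = {
          kpi: p_weight * l_weight
          for perspective, p_weight in perspective_weights.items()
          for kpi, l_weight in local_kpi_weights[perspective].items()
      }
      for kpi, weight in sorted(global_priorities.items(), key=lambda kv: -kv[1]):
          print(f"{kpi:26s} {weight:.2f}")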

  6. Learning analytics as a tool for closing the assessment loop in higher education

    Directory of Open Access Journals (Sweden)

    Karen D. Mattingly

    2012-09-01

    Full Text Available This paper examines learning and academic analytics and their relevance to distance education in undergraduate and graduate programs, as they impact students, teaching faculty, and academic institutions. The focus is to explore the measurement, collection, analysis, and reporting of data as predictors of student success and drivers of departmental process and program curriculum. Learning and academic analytics in higher education are used to predict student success by examining how and what students learn and how success is supported by academic programs and institutions. The paper examines what is being done to support students, whether or not it is effective, and if not why, and what educators can do. The paper also examines how these data can be used to create new metrics and inform a continuous cycle of improvement. It presents examples of working models from a sample of institutions of higher education: the Graduate School of Medicine at the University of Wollongong, the University of Michigan, Purdue University, and the University of Maryland, Baltimore County. Finally, the paper identifies considerations and recommendations for using analytics and offers suggestions for future research.

  7. Apply Web-based Analytic Tool and Eye Tracking to Study The Consumer Preferences of DSLR Cameras

    Directory of Open Access Journals (Sweden)

    Jih-Syongh Lin

    2013-11-01

    Full Text Available Consumers' preferences and purchase motivations often lie in the purchasing behaviors generated by the synthetic evaluation of form features, color, function, and price of products. If an enterprise can bring these criteria under control, it can grasp the opportunities in the marketplace. In this study, the product form, brand, and prices of five DSLR digital cameras from Nikon, Lumix, Pentax, Sony, and Olympus were investigated through image evaluation and eye tracking. A web-based two-dimensional analytical tool was used to present information on three layers. Layer A provided information on product form and brand name; Layer B added product price for the evaluation of purchase intention (X axis) and product form attraction (Y axis). On Layer C, Nikon J1 image samples in five color series were presented for the evaluation of attraction and purchase intention. The study results revealed that, among the five Japanese brands of digital cameras, the LUMIX GF3 is most preferred and serves as the major competitive product, with a product price of US$630. Through the visual focus of eye tracking, the lens, the curved handle bar, the curved part and shutter button above the lens, as well as the flexible flash of the LUMIX GF3 are the parts that attract the consumer's eyes. From the verbal descriptions, it is found that consumers emphasize the functions of 3D support lens, continuous focusing in shooting video, iA intelligent scene mode, and all-manual control support. In the color preference for the Nikon J1, the red and white colors are most preferred while pink is least favored. These findings can serve as references for designers and marketing personnel in new product design and development.

  8. Artificial intelligence tool development and applications to nuclear power

    International Nuclear Information System (INIS)

    Two parallel efforts are being performed at the Electric Power Research Institute (EPRI) to help the electric utility industry take advantage of expert system technology. The first effort is the development of expert system building tools tailored to electric utility industry applications. The second effort is the development of expert system applications. These two efforts complement each other: application development tests the tools and identifies additional tool capabilities that are required, while tool development helps define the applications that can be successfully developed. Artificial intelligence, as demonstrated by the developments described, is being established as a credible technological tool for the electric utility industry. The challenge in transferring artificial intelligence technology, and an understanding of its potential, to the electric utility industry is to gain an understanding of the problems that reduce power plant performance and to identify which can be successfully addressed using artificial intelligence

  9. 100-F Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    Ovink, R.

    2012-09-18

    This report documents the process used to identify source area target analytes in support of the 100-F Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).

  10. 100-K Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    Ovink, R.

    2012-09-18

    This report documents the process used to identify source area target analytes in support of the 100-K Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).

  11. Some key issues in the development of ergonomic intervention tools

    DEFF Research Database (Denmark)

    Edwards, Kasper; Winkel, Jørgen

    2016-01-01

    Literature reviews suggest that tools facilitating the ergonomic intervention processes should be integrated into rationalization tools, particular if such tools are participative. Such a Tool has recently been developed as an add-in module to the Lean tool “Value Stream Mapping” (VSM). However......, in the investigated context this module seems not to have any direct impact on the generation of proposals with ergonomic consideration. Contextual factors of importance seem to be e.g. allocation of sufficient resources and if work environment issues are generally accepted as part of the VSM methodology...

  12. Advanced reach tool (ART) : Development of the mechanistic model

    NARCIS (Netherlands)

    Fransman, W.; Tongeren, M. van; Cherrie, J.W.; Tischer, M.; Schneider, T.; Schinkel, J.; Kromhout, H.; Warren, N.; Goede, H.; Tielemans, E.

    2011-01-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe.

  13. Development of ecohydrological assessment tool and its application

    Institute of Scientific and Technical Information of China (English)

    LIU ChangMing; YANG ShengTian; WEN ZhiQun; WANG XueLei; WANG YuJuan; LI Qian; SHENG HaoRan

    2009-01-01

    The development of the Hydro-Informatic Modelling System (HIMS) provides an integrated platform for hydrological simulation. To extend the application of HIMS, an ecohydrological modelling system named the ecohydrological assessment tool (EcoHAT) has been developed. Integrating parameter-management tools, RS (remote sensing) inversion tools, module-design tools and GIS analysis tools, EcoHAT provides an integrated tool to simulate ecohydrological processes on the regional scale, which develops a new method for the sustainable use of water. EcoHAT has been applied to several case studies, such as the Yellow River Basin, the acid deposition area in Guizhou province and the riparian catchment of the Guanting reservoir in Beijing. Results prove that EcoHAT can efficiently simulate and analyze ecohydrological processes on the regional scale and provide technical support for integrated water resources management on the basin scale.

  15. Developing molecular tools for Chlamydomonas reinhardtii

    Science.gov (United States)

    Noor-Mohammadi, Samaneh

    Microalgae have garnered increasing interest over the years for their ability to produce compounds ranging from biofuels to nutraceuticals. A main focus of researchers has been to use microalgae as a natural bioreactor for the production of valuable and complex compounds. Recombinant protein expression in the chloroplasts of green algae has recently become more routine; however, the heterologous expression of multiple proteins or complete biosynthetic pathways remains a significant challenge. To take full advantage of these organisms' natural abilities, sophisticated molecular tools are needed to introduce and functionally express multiple-gene biosynthetic pathways in their genomes. To achieve this objective, we have sought to establish a method to construct, integrate and express multigene operons in the chloroplast and nuclear genome of the model microalga Chlamydomonas reinhardtii. Here we show that a modified DNA Assembler approach can be used to rapidly assemble multiple-gene biosynthetic pathways in yeast and then integrate these assembled pathways at a site-specific location in the chloroplast, or by random integration in the nuclear genome of C. reinhardtii. As a proof of concept, this method was used to successfully integrate and functionally express up to three reporter proteins (AphA6, AadA, and GFP) in the chloroplast of C. reinhardtii and up to three reporter proteins (Ble, AphVIII, and GFP) in its nuclear genome. An analysis of the relative gene expression of the engineered strains showed significant differences in the mRNA expression levels of the reporter genes and thus highlights the importance of proper promoter/untranslated-region selection when constructing a target pathway. In addition, this work focuses on expressing the cofactor regeneration enzyme phosphite dehydrogenase (PTDH) in the chloroplast and nuclear genomes of C. reinhardtii. The PTDH enzyme converts phosphite into phosphate and NAD(P)+ into NAD(P)H. The reduced

  16. Analytical Method Development & Validation for Related Substances Method of Busulfan Injection by Ion Chromatography Method

    Directory of Open Access Journals (Sweden)

    Rewaria S

    2013-05-01

    Full Text Available A new, simple, accurate, precise and reproducible ion chromatography method has been developed for the estimation of methanesulfonic acid in Busulfan injectable dosage. The method was validated in complete compliance with current regulatory guidelines using well-developed analytical method validation techniques and tools, covering validation parameters such as linearity, LOD and LOQ determination, accuracy, method precision, specificity, system suitability, robustness and ruggedness. With the current method, the linearity obtained is near 0.999, showing that the method is capable of giving a good detector response, and the calculated recovery was within the range of 85% to 115% of the specification limits.
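
    As a quick numerical illustration of two of the validation parameters mentioned above, the sketch below fits a calibration line and computes a correlation coefficient and a percent recovery; the concentrations and responses are made-up values, not the paper's data.

      import numpy as np

      conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])           # standard concentrations
      resp = np.array([51.0, 99.5, 201.0, 405.0, 798.0])   # assumed detector responses

      slope, intercept = np.polyfit(conc, resp, 1)          # calibration line
      r = np.corrcoef(conc, resp)[0, 1]                     # linearity (correlation)
      print(f"slope={slope:.1f}  intercept={intercept:.1f}  r={r:.4f}")

      found, added = 3.92, 4.00                             # spiked-sample example
      recovery = 100.0 * found / added                      # acceptance: 85-115%
      print(f"recovery = {recovery:.1f}%")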

  17. Analytical developments in ICP-MS for arsenic and selenium speciation. Application to granitic waters

    International Nuclear Information System (INIS)

    Nuclear waste storage in geological formations requires an understanding of the physico-chemistry of groundwater interactions with the surrounding rocks. Redox potential measurements and speciation calculated from geochemical modelling are not sufficient for determining water reactivity. We have therefore chosen to carry out experimental speciation by developing sensitive analytical tools that respect the chemical identity of each species. We have studied two redox indicators from reference sites (thermal waters from the Pyrenees, France): arsenic and selenium. First, we determined the concentrations of the major ions (sulphide, sulphate, chloride, fluoride, carbonate, Na, K, Ca). Speciation was conducted by HPLC hyphenated to quadrupole ICP-MS and high-resolution ICP-MS. These analyses showed the presence of two new arsenic species in solution, in addition to a high reactivity of these waters during stability studies. A sampling, storage and analysis method is described. (author)

  18. [COMETE: a tool to develop psychosocial competences in patient education].

    Science.gov (United States)

    Saugeron, Benoit; Sonnier, Pierre; Marchais, Stéphanie

    2016-01-01

    This article presents a detailed description of the development and use of the COMETE tool. The COMETE tool is designed to help medical teams identify, develop or evaluate psychosocial skills in patient education and counselling. This tool, designed in the form of a briefcase, proposes methodological activities and cards that assess psychosocial skills during a shared educational assessment, group meetings or during an individual evaluation. This tool is part of a support approach for medical teams caring for patients with chronic diseases. PMID:27392049

  19. Developing a 300C Analog Tool for EGS

    Energy Technology Data Exchange (ETDEWEB)

    Normann, Randy

    2015-03-23

    This paper covers the development of a 300°C geothermal well monitoring tool for supporting future EGS (enhanced geothermal systems) power production. This is the first of 3 tools planned. It is an analog tool designed for monitoring well pressure and temperature. There is discussion of 3 different circuit topologies and of the development of the supporting surface electronics and software, along with information on testing electronic circuits and components. One of the major components is the cable used to connect the analog tool to the surface.

  20. DEVELOPMENT OF SOLUBILITY PRODUCT VISUALIZATION TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    T.F. Turner; A.T. Pauli; J.F. Schabron

    2004-05-01

    Western Research Institute (WRI) has developed software for the visualization of data acquired from solubility tests. The work was performed in conjunction with AB Nynas Petroleum, Nynashamn, Sweden, who participated as the corporate cosponsor for this Jointly Sponsored Research (JSR) task. Efforts in this project were split between software development and solubility test development. The Microsoft Windows-compatible software inputs up to three solubility data sets, calculates the parameters for six solid body types to fit the data, and interactively displays the results in three dimensions. Several infrared spectroscopy techniques were examined for potential use in determining bitumen solubility in various solvents. Reflectance, time-averaged absorbance, and transmittance techniques were applied to bitumen samples in single and binary solvent systems. None of the techniques was found to have wide applicability.
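
    The abstract does not specify the six solid body types, but a common choice for such 3-D solubility visualizations is a sphere in Hansen solubility-parameter space; the sketch below fits one such sphere to hypothetical solvent screening data purely for illustration, and is not the WRI software's algorithm.

      import numpy as np
      from scipy.optimize import minimize

      # Solvent coordinates in (dD, dP, dH) space and whether each dissolved the
      # sample (1) or not (0); all values are hypothetical screening results.
      solvents = np.array([[18.0, 1.4, 2.0], [17.8, 3.1, 5.7], [15.5, 10.4, 7.0],
                           [14.9, 0.0, 0.0], [16.8, 5.7, 8.0], [20.0, 11.0, 20.0]])
      labels = np.array([1, 1, 1, 0, 1, 0])

      def misfit(params):
          """Hinge penalty: dissolving solvents should fall inside the sphere,
          non-solvents outside; a small term keeps the sphere compact."""
          center, radius = params[:3], params[3]
          d = np.linalg.norm(solvents - center, axis=1)
          pen_in = np.maximum(0.0, d - radius)[labels == 1].sum()
          pen_out = np.maximum(0.0, radius - d)[labels == 0].sum()
          return pen_in + pen_out + 0.01 * radius

      res = minimize(misfit, x0=[17.0, 4.0, 5.0, 6.0], method="Nelder-Mead")
      print("center (dD, dP, dH):", res.x[:3].round(1), " radius:", res.x[3].round(1))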

  1. DEVELOPING A TOOL FOR ENVIRONMENTALLY PREFERABLE PURCHASING

    Science.gov (United States)

    LCA-based guidance was developed by EPA under the Framework for Responsible Environmental Decision Making (FRED) effort to demonstrate how to conduct a relative comparison between product types to determine environmental preferability. It identifies data collection needs and iss...

  2. Towards a Process for Developing Maintenance Tools in Academia

    CERN Document Server

    Kienle, Holger M

    2008-01-01

    Building of tools--from simple prototypes to industrial-strength applications--is a pervasive activity in academic research. When proposing a new technique for software maintenance, effective tool support is typically required to demonstrate the feasibility and effectiveness of the approach. However, even though tool building is pervasive and requires significant time and effort, it is still pursued in an ad hoc manner. In this paper, we address these issues by proposing a dedicated development process for tool building that takes the unique characteristics of an academic research environment into account. We first identify process requirements based on a review of the literature and our extensive tool building experience in the domain of maintenance tools. We then outline a process framework based on work products that accommodates the requirements while providing needed flexibility for tailoring the process to account for specific tool building approaches and project constraints. The work products are...

  3. 78 FR 68459 - Medical Device Development Tools; Draft Guidance for Industry, Tool Developers, and Food and Drug...

    Science.gov (United States)

    2013-11-14

    ... guidance to FDA staff, industry, healthcare providers, researchers, and patient and consumer groups on a... HUMAN SERVICES Food and Drug Administration Medical Device Development Tools; Draft Guidance for Industry, Tool Developers, and Food and Drug Administration Staff; Availability AGENCY: Food and...

  4. A Multi-facetted Visual Analytics Tool for Exploratory Analysis of Human Brain and Function Datasets.

    Science.gov (United States)

    Angulo, Diego A; Schneider, Cyril; Oliver, James H; Charpak, Nathalie; Hernandez, Jose T

    2016-01-01

    Brain research typically requires large amounts of data from different sources, often of different natures. The use of different software tools adapted to the nature of each data source can make research work cumbersome and time consuming. It follows that data are often not used to their fullest potential, thus limiting exploratory analysis. This paper presents an ancillary software tool called BRAVIZ that integrates interactive visualization with real-time statistical analyses, facilitating access to multi-facetted neuroscience data and automating many cumbersome and error-prone tasks required to explore such data. Rather than relying on abstract numerical indicators, BRAVIZ emphasizes brain images as the main object of the analysis process for individuals or groups. BRAVIZ facilitates exploration of trends or relationships to gain an integrated view of the phenomena studied, thus motivating discovery of new hypotheses. A case study is presented that incorporates brain structure and function outcomes together with different types of clinical data.

  5. Networks as Tools for Sustainable Urban Development

    DEFF Research Database (Denmark)

    Jensen, Jesper Ole; Tollin, Nicola

    By applying the GREMI theories of “innovative milieux” (Aydalot, 1986; Camagni, 1991) to the case study, we suggest some reasons for the benefits achieved by the Dogme network compared to other networks. This analysis points to the existence of an “innovative milieu” on sustainability within......, strategies and actions. There has been little theoretical development on the subject. In practice, networks for sustainable development can be seen as combining different theoretical approaches to networks, including governance, urban competition and innovation. To give a picture of the variety...

  6. Tools for Nanotechnology Education Development Program

    Energy Technology Data Exchange (ETDEWEB)

    Dorothy Moore

    2010-09-27

    The overall focus of this project was the development of reusable, cost-effective educational modules for use with the table-top scanning electron microscope (TTSEM). The goal of the project's outreach component was to increase students' exposure to nanoscale science and technology.

  7. Appreciative Inquiry as an Organizational Development Tool.

    Science.gov (United States)

    Martinetz, Charles F.

    2002-01-01

    Defines appreciative inquiry as a change model that uses traditional organizational development processes (team building, strategic planning, business process redesign, management audits) in a new way, both as a philosophy and as a process. Emphasizes collaboration, participation of all voices, and changing the organization rather than the people.…

  8. Development and testing of analytical models for the pebble bed type HTRs

    International Nuclear Information System (INIS)

    The pebble bed type gas cooled high temperature reactor (HTR) appears to be a good candidate for the next generation of nuclear reactor technology. These reactors have unique characteristics in terms of the randomness in geometry, and require special techniques for the analysis of their systems. This study includes activities concerning the testing of computational tools and the qualification of models. Indeed, it is essential that validated analytical tools be available to the research community. From this viewpoint, codes like MCNP, ORIGEN and RELAP5, which have been used in the nuclear industry for many years, are selected to identify and develop new capabilities needed to support HTR analysis. The geometrical model of the full reactor is obtained by using lattice and universe facilities provided by MCNP. The coupled MCNP-ORIGEN code is used to estimate the burnup and the refuelling scheme. Results obtained from the Monte Carlo analysis are interfaced with RELAP5 to analyze the thermal hydraulics and safety characteristics of the reactor. New models and methodologies are developed for several past and present experimental and prototypical facilities based on HTR pebble bed concepts. The calculated results are compared with available experimental data and theoretical evaluations, showing very good agreement. The ultimate goal of the validation of the computer codes for pebble bed HTR applications is to acquire and reinforce the capability of these general-purpose computer codes for performing HTR core design and optimization studies

  9. Selection of reference standard during method development using the analytical hierarchy process.

    Science.gov (United States)

    Sun, Wan-yang; Tong, Ling; Li, Dong-xiang; Huang, Jing-yi; Zhou, Shui-ping; Sun, Henry; Bi, Kai-shun

    2015-03-25

    Reference standards are critical for ensuring reliable and accurate method performance. One important issue is how to select the ideal one from the alternatives. Unlike the optimization of parameters, the criteria for a reference standard are often not directly measurable. The aim of this paper is to recommend a quantitative approach for the selection of reference standards during method development based on the analytical hierarchy process (AHP) as a decision-making tool. Six alternative single reference standards were assessed in the quantitative analysis of six phenolic acids from Salvia miltiorrhiza and its preparations by ultra-performance liquid chromatography. The AHP model simultaneously considered six criteria related to reference standard characteristics and method performance: feasibility of procurement, abundance in samples, chemical stability, accuracy, precision and robustness. The priority of each alternative was calculated using the standard AHP analysis method. The results showed that protocatechuic aldehyde is the ideal reference standard, with rosmarinic acid, at about 79.8% of its priority, as the second choice. The determination results successfully verified the evaluation ability of this model. The AHP allowed us to consider comprehensively the benefits and risks of the alternatives. It is an effective and practical tool for the optimization of reference standards during method development. PMID:25636165

  11. Towards a minimally invasive sampling tool for high resolution tissue analytical mapping

    Science.gov (United States)

    Gottardi, R.

    2015-09-01

    Multiple spatial mapping techniques for biological tissues have been proposed over the years, but all present limitations in terms of resolution, analytical capacity or invasiveness. Ren et al (2015 Nanotechnology 26 284001) propose in their most recent work the use of a picosecond infrared laser (PIRL) under conditions of ultrafast desorption by impulsive vibrational excitation (DIVE) to extract small amounts of cellular and molecular components, conserving their viability, structure and activity. The PIRL DIVE technique would then work as a nanobiopsy with minimal damage to the surrounding tissues, which could potentially be applied for high-resolution local structural characterization of tissues in health and disease, with the spatial limit determined by the laser focus.

  12. Green certificates, a tool for market development

    International Nuclear Information System (INIS)

    To achieve a place for renewable energy, the Government of the Netherlands has followed a market-oriented approach. In view of the rapidly emerging liberalized energy market, the government combined support to producers with a demand-driven approach. With a fully liberalized market for green electricity, free consumer choice and a tradable certificate for renewable energy, a market has been developed. In view of the slow domestic growth in production, a new support mechanism for renewable electricity, called the environmental quality of power production (MEP), was introduced in the Netherlands in 2003. This paper evaluates the market development over recent years with the green certificate system and the rapidly growing market for green electricity. In 2004 the green certificate was replaced EU-wide by the Certificate of Origin. (author)

  13. Developing a coupled analytical model for analyzing salt intrusion in alluvial estuaries

    Science.gov (United States)

    Savenije, H.; CAI, H.; Gisen, J.

    2013-12-01

    A predictive assessment technique to estimate the salt intrusion length and longitudinal salinity distribution in estuaries is important for policy makers and managers to maintain a healthy estuarine environment. In this study, the salt intrusion model of Savenije (2005, 2012) is applied and coupled to an explicit solution for tidal dynamics developed by Cai and Savenije (2013). The objective of the coupling is to reduce the number of calibration parameters, which subsequently strengthens the reliability of the salt intrusion model. Moreover, the fully analytical treatment allows assessing the effect of model forcing (i.e., tide and river discharge) and geometry adjustments (e.g., by dredging) on system performance. The coupled model has been applied to a wide range of estuaries, and the result shows that the correspondence between analytical estimations and observations is very good. As a result, the coupled model is a useful tool for decision makers to obtain first order estimates of salt intrusion in estuaries based on a minimum of information required. References Savenije, H.H.G. (2005), Salinity and Tides in Alluvial Estuaries, Elsevier. Savenije, H.H.G. (2012), Salinity and Tides in Alluvial Estuaries, completely revised 2nd edition, www.salinityandtides.com. Cai, H., and H. H. G. Savenije (2013), Asymptotic behavior of tidal damping in alluvial estuaries, Journal of Geophysical Research, submitted.

  14. Development of analytical techniques of vanadium isotope in seawater

    Science.gov (United States)

    Huang, T.; Owens, J. D.; Sarafian, A.; Sen, I. S.; Huang, K. F.; Blusztajn, J.; Nielsen, S.

    2015-12-01

    Vanadium (V) is a transition metal with isotopes of 50V and 51V, and oxidation states of +2, +3, +4 and +5. The average concentration in seawater is 1.9 ppb, which results in a marine residence time of ~50 kyrs. Its various oxidation states make it a potential tool for investigating redox conditions in the ocean and sediments, due to redox-related changes in the valence state of vanadium. In turn, chemical equilibrium between different oxidation states of V will likely cause isotopic fractionation that can potentially be utilized to quantify past ocean redox states. In order to apply V isotopes as a paleo-redox tracer, we must know the isotopic composition of seawater and its relation to the marine sources and sinks of V. We developed a novel method for pre-concentrating V and measuring its isotope ratio in seawater samples. In our method, we used four ion exchange chromatography columns to separate vanadium from seawater matrix elements, in particular titanium and chromium, which both have an isobaric interference on 50V. The first column uses the NOBIAS resin, which effectively separates V and other transition metals from the majority of the seawater matrix. Subsequent columns are identical to those utilized when separating V from silicate samples (Nielsen et al, Geostand. Geoanal. Res., 2011). The isotopic composition of the purified V is measured using a Thermo Scientific Neptune multiple collector inductively coupled plasma mass spectrometer (MC-ICP-MS) in medium resolution mode. This setup resolves all molecular interferences from masses 49, 50, 51, 52 and 53, including S-O species on mass 50. To test the new method, we spiked an open ocean seawater sample from the Bermuda Atlantic Time Series (BATS) station with 10-25 μg of an Alfa Aesar vanadium solution, which has an isotopic composition of δ51V = 0, where δ51V = 1000 × [(51V/50V)sample − (51V/50V)AA] / (51V/50V)AA. The average of six spiked samples is -0.03±0.19‰, which is within error of the true
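
    The delta notation quoted above is straightforward to compute; the snippet below transcribes it directly, with the ratio values invented (chosen only to show a per-mil-scale result).

      def delta_51V(ratio_sample, ratio_standard):
          """Per-mil deviation of a sample 51V/50V ratio from the standard."""
          return 1000.0 * (ratio_sample - ratio_standard) / ratio_standard

      R_AA = 399.000                 # assumed 51V/50V of the Alfa Aesar standard
      R_sample = 398.988             # assumed measured ratio of a spiked sample
      print(f"d51V = {delta_51V(R_sample, R_AA):+.2f} per mil")   # about -0.03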

  15. Towards an interoperability ontology for software development tools

    OpenAIRE

    Hasni, Neji.

    2003-01-01

    Approved for public release; distribution is unlimited. The automation of software development has long been a goal of software engineering, to increase the efficiency of the development effort and improve the software product. This efficiency (high productivity with fewer software faults) results from best practices in building, managing and testing software projects via the use of these automated tools and processes. However, each software development tool has its own characteristics, semantic...

  16. original: Multi-sectoral qualitative analysis: a tool for assessing the competitiveness of regions and formulating strategies for economic development

    OpenAIRE

    Brian Roberts; Stimson, Robert J

    1998-01-01

    Regional economic development strategy formulation relies heavily on analytical techniques such as shift-share, location quotients, input-output and SWOT analysis. However, many of these traditional tools are proving inadequate for understanding what makes regions competitive. New tools are required to evaluate the competitiveness of regional economies, how to gain competitive advantage, and what new management frameworks and enabling infrastructure are needed to drive economic development p...

  17. Nano-Scale Secondary Ion Mass Spectrometry - A new analytical tool in biogeochemistry and soil ecology

    Energy Technology Data Exchange (ETDEWEB)

    Herrmann, A M; Ritz, K; Nunan, N; Clode, P L; Pett-Ridge, J; Kilburn, M R; Murphy, D V; O' Donnell, A G; Stockdale, E A

    2006-10-18

    Soils are structurally heterogeneous across a wide range of spatio-temporal scales. Consequently, external environmental conditions do not have a uniform effect throughout the soil, resulting in a large diversity of micro-habitats. It has been suggested that soil function can be studied without explicit consideration of such fine detail, but recent research has indicated that the micro-scale distribution of organisms may be of importance for a mechanistic understanding of many soil functions. Due to a lack of techniques with adequate sensitivity for data collection at appropriate scales, the question 'How important are various soil processes acting at different scales for ecological function?' is challenging to answer. The nano-scale secondary ion mass spectrometer (NanoSIMS) represents the latest generation of ion microprobes, which link high-resolution microscopy with isotopic analysis. The main advantage of NanoSIMS over other secondary ion mass spectrometers is the ability to operate at high mass resolution whilst maintaining both excellent signal transmission and spatial resolution (~50 nm). NanoSIMS has been used previously in studies focusing on presolar materials from meteorites, in material science, biology, geology and mineralogy. Recently, the potential of NanoSIMS as a new tool in the study of biophysical interfaces in soils has been demonstrated. This paper describes the principles of NanoSIMS and discusses the potential of this tool to contribute to the field of biogeochemistry and soil ecology. Practical considerations (sample size and preparation, simultaneous collection of isotopes, mass resolution, isobaric interference and quantification of the isotopes of interest) are discussed. Adequate sample preparation that avoids biasing the interpretation of NanoSIMS data through artifacts, and the identification of regions of interest, are the main concerns in using NanoSIMS as a new tool in biogeochemistry and soil ecology. Finally, we review

  18. Market research companies and new product development tools

    NARCIS (Netherlands)

    Nijssen, E.J.; Frambach, R.T.

    1998-01-01

    This research investigates (1) the share of new product development (NPD) research services in market research (MR) companies’ turnover, (2) MR companies’ awareness and use of NPD tools and the modifications made to these NPD tools, and (3) MR company managers’ perceptions of the influence of client

  19. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    Science.gov (United States)

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
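
    A minimal sketch of the AHP weighting step mentioned above, assuming the standard principal-eigenvector method on a pairwise comparison matrix; the matrix values and the three hypothetical inspection criteria are invented for illustration.

        # AHP weighting sketch: a pairwise comparison matrix on Saaty's 1-9
        # scale is reduced to criterion weights via its principal eigenvector.
        import numpy as np

        # comparisons among three hypothetical inspection criteria
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)        # index of the principal eigenvalue
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                       # normalised priority weights

        lam_max = eigvals.real[k]
        n = A.shape[0]
        CI = (lam_max - n) / (n - 1)       # consistency index
        print("weights:", np.round(w, 3), " CI:", round(float(CI), 3))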

  20. Development of a Test to Evaluate Students' Analytical Thinking Based on Fact versus Opinion Differentiation

    Science.gov (United States)

    Thaneerananon, Taveep; Triampo, Wannapong; Nokkaew, Artorn

    2016-01-01

    Nowadays, one of the biggest challenges of education in Thailand is the development and promotion of the students' thinking skills. The main purposes of this research were to develop an analytical thinking test for 6th grade students and evaluate the students' analytical thinking. The sample was composed of 3,567 6th grade students in 2014…

  1. Newspaper Reading among College Students in Development of Their Analytical Ability

    Science.gov (United States)

    Kumar, Dinesh

    2009-01-01

    The study investigated the newspaper reading among college students in development of their analytical ability. Newspapers are one of the few sources of information that are comprehensive, interconnected and offered in one format. The main objective of the study was to find out the development of the analytical ability among college students by…

  2. Second International Workshop on Teaching Analytics

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Reimann, Peter; Halb, Wolfgang;

    2013-01-01

    Teaching Analytics is conceived as a subfield of learning analytics that focuses on the design, development, evaluation, and education of visual analytics methods and tools for teachers in primary, secondary, and tertiary educational settings. The Second International Workshop on Teaching Analytics...

  3. Development of micro rotary swaging tools of graded tool steel via co-spray forming

    Directory of Open Access Journals (Sweden)

    Cui Chengsong

    2015-01-01

    In order to meet the requirements of micro rotary swaging, the local properties of the tools should be adjusted properly with respect to abrasive and adhesive wear, compressive strength, and toughness. These properties can be optimally combined by using different materials in specific regions of the tools, with a gradual transition in between to reduce critical stresses at the interface during heat treatment and in the rotary swaging process. In this study, a newly developed co-spray forming process was used to produce graded tool materials in the form of a flat product. The graded deposits were subsequently hot rolled and heat treated to achieve an optimal microstructure and advanced properties. Micro plunge rotary swaging tools with fine geometrical structures were machined from the hot rolled materials. The new forming tools were successfully applied in the micro plunge rotary swaging of wires of stainless steel.

  4. Development of micro rotary swaging tools of graded tool steel via co-spray forming

    Directory of Open Access Journals (Sweden)

    Cui Chengsong

    2015-01-01

    In order to meet the requirements of micro rotary swaging, the local properties of the tools should be adjusted properly with respect to abrasive and adhesive wear, compressive strength, and toughness. These properties can be optimally combined by using different materials in specific regions of the tools, with a gradual transition in between to reduce critical stresses at the interface during heat treatment and in the rotary swaging process. In this study, a newly developed co-spray forming process was used to produce graded tool material in the form of a flat product. The graded deposit was subsequently hot rolled and heat treated to achieve an optimal microstructure and advanced properties. Micro plunge rotary swaging tools with fine geometrical structures were machined from the hot rolled material. The new forming tools were successfully applied in the micro plunge rotary swaging of wires of stainless steel.

  5. External beam milli-PIXE as analytical tool for Neolithic obsidian provenance studies

    Energy Technology Data Exchange (ETDEWEB)

    Constantinescu, B.; Cristea-Stan, D. [National Institute for Nuclear Physics and Engineering Horia Hulubei, Bucharest-Magurele (Romania); Kovács, I.; Szőkefalvi-Nagy, Z. [Wigner Research Centre for Physics, Institute for Particle and Nuclear Physics, Budapest (Hungary)

    2013-07-01

    Obsidian is the most important archaeological material used for tools and weapons before the appearance of metals. Its geological sources are limited and concentrated in a few geographical zones: Armenia, Eastern Anatolia, the Italian islands of Lipari and Sardinia, the Greek islands of Melos and Yali, and the Hungarian and Slovak Tokaj Mountains. Due to this fact, in the Mesolithic and Neolithic periods obsidian was the first archaeological material intensively traded even over long distances. To determine the geological provenance of obsidian and to identify the prehistoric long-range trade routes and possible population migrations, elemental concentration ratios can help a lot, since each geological source has its own 'fingerprints'. In this work the external milli-PIXE technique was applied for elemental concentration ratio determinations in some Neolithic tools found in Transylvania and in the Iron Gates region near the Danube, and on a few relevant geological samples (Slovak Tokaj Mountains, Lipari, Armenia). In Transylvania (the north-western part of Romania, a region surrounded by the Carpathian Mountains), Neolithic obsidian tools were discovered mainly in three regions: north-west - Oradea (near the border with Hungary, Slovakia and Ukraine), centre - Cluj, and south-west - Banat (near the border with Serbia). A special case is the Iron Gates - Mesolithic and Early Neolithic sites directly related to the appearance of agriculture, replacing the Mesolithic economy based on hunting and fishing. Three long-distance trade routes could be considered: from the Caucasus Mountains via the north of the Black Sea, from the Greek islands or Asia Minor via the ex-Yugoslavia area or via Greece-Bulgaria, or from Central Europe - the Tokaj Mountains - in the case of obsidian. As provenance 'fingerprints', we focused on Ti to Mn and Rb-Sr-Y-Zr ratios. The measurements were performed at the external milli-PIXE beam-line of the 5 MV VdG accelerator of the Wigner RCP, with a proton energy of 3 MeV and beam currents in the range of 1 ±1 D
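
    As a purely illustrative aside (not taken from the paper), provenance 'fingerprinting' of this kind can be thought of as nearest-neighbour matching in a space of elemental ratios; all numbers below are invented.

        # Toy provenance matching: compare an artefact's elemental ratio
        # "fingerprint" with candidate geological sources.
        import math

        sources = {                       # hypothetical (Rb/Sr, Y/Zr) ratios
            "Tokaj":   (1.8, 0.45),
            "Lipari":  (0.9, 0.30),
            "Armenia": (1.2, 0.60),
        }
        artefact = (1.7, 0.47)

        def distance(a, b):
            return math.dist(a, b)        # Euclidean distance in ratio space

        best = min(sources, key=lambda s: distance(artefact, sources[s]))
        print("closest source:", best)    # -> Tokaj for these made-up values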

  6. Assessment procedures and analytical tools for leak-before-break applications

    International Nuclear Information System (INIS)

    Leak-before-break assessment as part of power plant pipeline strength analysis uses either the yield stress criterion or fracture-mechanics methods based on the FAD concept. In the latter case, fracture-mechanics and strength data of the material are required, as well as analytical equations for calculating the stress intensity factor KI and the plastic limiting load Lr. The application of verified and generally valid KI and Lr solutions is of great importance. The contribution compares selected advanced stress intensity factor solutions for cylinders with surface cracks and through-wall cracks. Apart from the limits of application of the solutions with respect to geometry and load parameters, their accuracy is also assessed. For this, a method for estimating the numerical errors of KI solutions is presented and applied to a series of solutions. Equations are presented for the plastic limiting load and the parameter Lr, respectively. The application of the calculation methods is demonstrated for a pipeline using the current version of the failure assessment programme VERB. (orig.)

  7. Designing a tool for curriculum leadership development in postgraduate programs

    Directory of Open Access Journals (Sweden)

    M Avizhgan

    2016-07-01

    Introduction: Leadership in the area of curriculum development is increasingly important as we look for ways to improve our programmes and practices. In curriculum studies, leadership has received little attention. Considering the lack of an evaluation tool with objective criteria for the postgraduate curriculum leadership process, this study aimed to design such a tool and determine its validity and reliability. Method: This is a methodological study. First, the domains and items of the tool were determined through expert interviews and a literature review. Then, using the Delphi technique, 54 important criteria were developed. A panel of experts was used to confirm content and face validity. Reliability was determined in a descriptive study of 30 faculty members from two Isfahan universities and was estimated by internal consistency. The data were analyzed with SPSS software, using the Pearson correlation coefficient and reliability analysis. Results: Based on the definition of curriculum leadership, the domains and items of the tool were determined and a primary tool was developed. Expert faculty views were used in the different stages of development and psychometric testing. The tool's internal consistency, as measured by Cronbach's alpha coefficient, was 96.5%; this was also determined for each domain separately. Conclusion: Applying this instrument can improve the effectiveness of curriculum leadership. Identifying the characteristics of successful and effective leaders, and utilizing this knowledge in developing and implementing curricula, might help us respond better to the changing needs of our students, teachers and schools of tomorrow.
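
    For reference, the internal-consistency statistic reported above can be computed as follows; the formula is the standard Cronbach's alpha, and the response matrix below is invented (rows = raters, columns = items).

        # Cronbach's alpha = k/(k-1) * (1 - sum(item variances)/variance of totals)
        import numpy as np

        X = np.array([[4, 5, 4, 5],
                      [3, 4, 4, 4],
                      [5, 5, 5, 4],
                      [2, 3, 3, 3],
                      [4, 4, 5, 5]], dtype=float)   # synthetic ratings

        k = X.shape[1]
        item_var = X.var(axis=0, ddof=1).sum()      # sum of item variances
        total_var = X.sum(axis=1).var(ddof=1)       # variance of total scores
        alpha = k / (k - 1) * (1 - item_var / total_var)
        print(f"Cronbach's alpha = {alpha:.3f}")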

  8. Selecting Appropriate Requirements Management Tool for Developing Secure Enterprises Software

    Directory of Open Access Journals (Sweden)

    Daniyal M Alghazzawi

    2014-03-01

    This paper discusses the significance of selecting the right requirements management tool. It is no secret that poorly understood user requirements and uncontrolled scope creep lead to many software project failures. Many application development professionals buy the wrong tools for the wrong reasons. To avoid purchasing an overly complex and expensive tool, an organization needs to be realistic about the particular problem it wants to solve. Software development organizations are improving the methods they use to gather, analyze, trace, document, prioritize and manage their requirements. This paper considers four leading requirements management tools - Analyst Pro, CORE, Cradle and Caliber RM - with the focus on selecting the appropriate tool according to its capabilities and the customer's needs.

  9. A graphical tool for an analytical approach of scattering photons by the Compton effect

    Energy Technology Data Exchange (ETDEWEB)

    Scannavino, Francisco A., E-mail: scannavino@usp.br [Embrapa Agricultural Instrumentation Center, Sao Carlos (Brazil); Physics Institute of Sao Carlos-IFSC, University of Sao Paulo, Sao Carlos (Brazil); Cruvinel, Paulo E., E-mail: cruvinel@cnpdia.embrapa.br [Embrapa Agricultural Instrumentation Center, Sao Carlos (Brazil); Physics Institute of Sao Carlos-IFSC, University of Sao Paulo, Sao Carlos (Brazil)

    2012-05-11

    The photons scattered by the Compton effect can be used to characterize the physical properties of a given sample due to the influence that the electron density exerts on the number of scattered photons. However, scattering measurements involve experimental and physical factors that must be carefully analyzed to predict uncertainty in the detection of Compton photons. This paper presents a method for the optimization of the geometrical parameters of an experimental arrangement for Compton scattering analysis, based on their relation to the energy and incident flux of the X-ray photons. In addition, the tool enables the statistical analysis of the information displayed and includes the coefficient of variation (CV) measurement for a comparative evaluation of the physical parameters of the model established for the simulation.
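
    As background to why the detection geometry matters, the sketch below evaluates the standard Compton formula for the scattered photon energy as a function of scattering angle, together with a coefficient-of-variation helper of the kind mentioned in the abstract; the numbers are illustrative only.

        # E' = E / (1 + (E/m_e c^2) * (1 - cos(theta))); m_e c^2 = 511 keV.
        import math
        import statistics

        MEC2 = 511.0  # electron rest energy, keV

        def scattered_energy(E_keV: float, theta_deg: float) -> float:
            theta = math.radians(theta_deg)
            return E_keV / (1.0 + (E_keV / MEC2) * (1.0 - math.cos(theta)))

        def cv(samples):
            """Coefficient of variation = standard deviation / mean."""
            return statistics.stdev(samples) / statistics.mean(samples)

        for angle in (30, 60, 90, 120):
            print(f"theta = {angle:3d} deg  E' = {scattered_energy(59.5, angle):5.1f} keV")

        counts = [980, 1012, 995, 1003, 990]   # hypothetical detector counts
        print(f"CV = {cv(counts):.3%}")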

  10. A graphical tool for an analytical approach of scattering photons by the Compton effect

    Science.gov (United States)

    Scannavino, Francisco A.; Cruvinel, Paulo E.

    2012-05-01

    The photons scattered by the Compton effect can be used to characterize the physical properties of a given sample due to the influence that the electron density exerts on the number of scattered photons. However, scattering measurements involve experimental and physical factors that must be carefully analyzed to predict uncertainty in the detection of Compton photons. This paper presents a method for the optimization of the geometrical parameters of an experimental arrangement for Compton scattering analysis, based on their relation to the energy and incident flux of the X-ray photons. In addition, the tool enables the statistical analysis of the information displayed and includes the coefficient of variation (CV) measurement for a comparative evaluation of the physical parameters of the model established for the simulation.

  11. Psychometric properties of a Mental Health Team Development Audit Tool.

    LENUS (Irish Health Repository)

    Roncalli, Silvia

    2013-02-01

    To assist in improving team working in Community Mental Health Teams (CMHTs), the Mental Health Commission formulated a user-friendly but yet-to-be validated 25-item Mental Health Team Development Audit Tool (MHDAT).

  12. Challenges in the development of analytical soil compaction models

    DEFF Research Database (Denmark)

    Keller, Thomas; Lamandé, Mathieu

    2010-01-01

    Soil compaction can cause a number of environmental and agronomic problems (e.g. flooding, erosion, leaching of agrochemicals to recipient waters, emission of greenhouse gases to the atmosphere, crop yield losses), resulting in significant economic damage to society and agriculture. Strategies … data and model simulations. The upper model boundary condition (i.e. contact area and stresses at the tyre-soil interface) is highly influential in stress propagation, but knowledge on the effects of loading and soil conditions on the upper model boundary condition is inadequate. The accuracy of stress transducers, and therefore of stress measurements, is not well known, despite numerous studies on stress in the soil profile below agricultural tyres. Although arable soils are characterised by distinct soil layers with different mechanical properties, analytical models rely on a one-layer approach with regard …

  13. Development of a Public Health Assessment Tool to Prevent Lyme Disease: Tool Construction and Validation

    OpenAIRE

    Garvin, Jennifer Hornung; Gordon, Thomas F.; Haignere, Clara; DuCette, Joseph P

    2005-01-01

    This study involved the design and validation of a new Lyme disease risk assessment instrument. The study was funded in part by a research grant from the American Health Information Management Association (AHIMA) Foundation on Research and Education (FORE). The resulting instrument measured theoretical constructs such as attitudes, behaviors, beliefs, skills, and knowledge relative to Lyme disease. The survey assessment tool is described here, and the tool development process, the validation ...

  14. Disaster Risk Finance as a Tool for Development

    OpenAIRE

    World Bank Group

    2016-01-01

    Since 2013 The World Bank Group has partnered with the Global Facility for Disaster Reduction and Recovery and the U.K. Department for International Development to address some of these gaps in evidence and methodologies. The Disaster Risk Finance Impact Analytics Project has made significant contributions to the understanding of how to monitor and evaluate existing or potential investment...

  15. Advanced organic analysis and analytical methods development: FY 1995 progress report. Waste Tank Organic Safety Program

    Energy Technology Data Exchange (ETDEWEB)

    Wahl, K.L.; Campbell, J.A.; Clauss, S.A. [and others]

    1995-09-01

    This report describes the work performed during FY 1995 by Pacific Northwest Laboratory in developing and optimizing analysis techniques for identifying organics present in Hanford waste tanks. The main focus was to provide a means for rapidly obtaining the most useful information concerning the organics present in tank waste, with minimal sample handling and with minimal waste generation. One major focus has been to optimize analytical methods for organic speciation. Select methods, such as atmospheric pressure chemical ionization mass spectrometry and matrix-assisted laser desorption/ionization mass spectrometry, were developed to increase the speciation capabilities, while minimizing sample handling. A capillary electrophoresis method was developed to improve separation capabilities while minimizing additional waste generation. In addition, considerable emphasis has been placed on developing a rapid screening tool, based on Raman and infrared spectroscopy, for determining organic functional group content when complete organic speciation is not required. This capability would allow for a cost-effective means to screen the waste tanks to identify tanks that require more specialized and complete organic speciation to determine tank safety.

  16. Analytical tool for the periodic safety analysis of NPP according to the PSA guideline. Vol. 1

    International Nuclear Information System (INIS)

    The SAIS (Safety Analysis and Information System) programme system is based on an integrated data base, which consists of a plant-data part and a PSA-related data part. Using SAIS, analyses can be performed by special tools, which are connected directly to the data base. Two main editors, RISA+ and DEDIT, are used for data base management. Access to the data base is via different types of pages, which are displayed on a computer screen. The pages are called data sheets. Sets of input and output data sheets were implemented, such as system or component data sheets, fault trees or event trees. All input information, models and results needed for updated PSA results (Living PSA) can be stored in the SAIS. The programme system contains the editor KVIEW, which guarantees consistency of the stored data, e.g. with respect to names and codes of components and events. The information contained in the data base is called up by a standardized user guide programme, called the Page Editor. (Brunsbuettel as reference NPP). (orig./HP)

  17. gOntt: a Tool for Scheduling Ontology Development Projects

    OpenAIRE

    A. GÓMEZ-PÉREZ; Suárez-Figueroa, Mari Carmen; Vigo, Martin

    2009-01-01

    The Ontology Engineering field lacks tools that guide ontology developers to plan and schedule their ontology development projects. gOntt helps ontology developers in two ways: (a) to schedule ontology projects; and (b) to execute such projects based on the schedule and using the NeOn Methodology.

  18. DEVELOPMENT OF REMOTE HANFORD CONNECTOR GASKET REPLACEMENT TOOLING FOR DWPF

    Energy Technology Data Exchange (ETDEWEB)

    Krementz, D.; Coughlin, Jeffrey

    2009-05-05

    The Defense Waste Processing Facility (DWPF) requested the Savannah River National Laboratory (SRNL) to develop tooling and equipment to remotely replace gaskets in mechanical Hanford connectors, to reduce personnel radiation exposure compared to the current hands-on method. It is also expected that radiation levels will continually increase with future waste streams. The equipment is operated in the Remote Equipment Decontamination Cell (REDC), which is equipped with compressed air, two master-slave manipulators (MSMs) and an electro-mechanical manipulator (EMM) arm for operation of the remote tools. The REDC does not provide access to electrical power, so the equipment must be manually or pneumatically operated. The MSMs have a load limit of ten pounds at full extension, which limited the weight of the installation tool. In order to remotely replace Hanford connector gaskets, several operations must be performed remotely; these include: removal of the spent gasket and retaining ring (the retaining ring is also called a snap ring), loading the new snap ring and gasket into the installation tool, and installation of the new gasket into the Hanford connector. SRNL developed and tested tools that successfully perform all of the necessary tasks. Removal of snap rings from horizontal and vertical connectors is performed by separate air-actuated retaining ring removal tools manipulated in the cell by the MSM. To install a new gasket, the snap ring loader is used to load a new snap ring into a groove in the gasket installation tool. A new gasket is placed on the installation tool and retained by custom springs. An MSM lifts the installation tool and presses the mounted gasket against the connector block. Once the installation tool is in position, the gasket and snap ring are installed onto the connector by pneumatic actuation. All of the tools are located on a custom work table with a pneumatic valve station that directs compressed air to the desired

  19. Comprehensive analytical strategy for biomarker identification based on liquid chromatography coupled to mass spectrometry and new candidate confirmation tools.

    Science.gov (United States)

    Mohamed, Rayane; Varesio, Emmanuel; Ivosev, Gordana; Burton, Lyle; Bonner, Ron; Hopfgartner, Gérard

    2009-09-15

    A comprehensive analytical LC-MS(/MS) platform for low-molecular-weight biomarkers in biological fluids is described. Two complementary retention mechanisms were used in HPLC by optimizing the chromatographic conditions for a reversed-phase column and a hydrophilic interaction chromatography column. LC separation was coupled to mass spectrometry using electrospray ionization operating in positive polarity mode. This strategy enables hydrophobic as well as polar analytes to be correctly retained and separated. For that purpose, artificial model study samples were generated with a mixture of 38 well-characterized compounds likely to be present in biofluids. The set of compounds was used as a standard aqueous mixture or was spiked into urine at different concentration levels to investigate the capability of the LC-MS(/MS) platform to detect variations across biological samples. Unsupervised data analysis by principal component analysis was performed, followed by principal component variable grouping to find correlated variables. This tool allows us to distinguish three main groups whose variables belong to (a) background ions (found in all types of samples), (b) ions distinguishing urine samples from aqueous standard and blank samples, and (c) ions related to the spiked compounds. Interpretation of these groups allows us to identify and eliminate isotopes, adducts, fragments, etc., and to generate a reduced list of m/z candidates. This list is then submitted to the prototype MZSearcher software tool, which simultaneously searches several lists of potential metabolites extracted from metabolomics databases (e.g., KEGG, HMDB) to propose biomarker candidates. Structural confirmation of these candidates was done off-line by fraction collection followed by nanoelectrospray infusion to provide high-quality MS/MS data for spectral database queries. PMID:19702294
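
    A rough sketch of the unsupervised step described above: PCA on a feature table (rows = samples, columns = m/z features) followed by grouping of correlated features, since isotopes and adducts of one compound tend to co-vary across samples. The data are synthetic and the 0.8 correlation threshold is an assumption; the paper itself used dedicated software.

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(12, 50))              # 12 samples x 50 m/z features
        X[:, 1] = 0.5 * X[:, 0] + rng.normal(scale=0.05, size=12)  # a correlated "adduct"
        Xc = X - X.mean(axis=0)                    # mean-centre before PCA

        # PCA via singular value decomposition of the centred matrix
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        print("variance explained by PC1:", round(float(s[0]**2 / (s**2).sum()), 3))

        # group features whose profiles correlate strongly with feature 0
        corr = np.corrcoef(X.T)
        group = np.where(np.abs(corr[0]) > 0.8)[0]
        print("features grouped with feature 0:", group)   # -> [0 1]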

  20. Electrochemical treatment of olive mill wastewater: Treatment extent and effluent phenolic compounds monitoring using some uncommon analytical tools

    Institute of Scientific and Technical Information of China (English)

    Chokri Belaid; Moncef Khadraoui; Salma Mseddi; Monem Kallel; Boubaker Elleuch; Jean Francois Fauvarque

    2013-01-01

    Problems related to industrial effluents can be divided into two parts: (1) their toxicity, associated with their chemical content, which should be removed before discharging the wastewater into the receiving media; and (2) the difficulties of pollution characterisation and monitoring caused by the complexity of these matrices. This investigation deals with both aspects: an electrochemical treatment method for olive mill wastewater (OMW) under platinized expanded titanium electrodes, using a modified Grignard reactor for toxicity removal, as well as the exploration of some specific analytical tools to monitor the elimination of phenolic compounds from the effluent. The results showed that electrochemical oxidation is able to remove/mitigate OMW pollution. Indeed, 87% of OMW color was removed and all aromatic compounds disappeared from the solution by anodic oxidation. Moreover, the chemical oxygen demand (COD) and the total organic carbon (TOC) were reduced by 55%. On the other hand, UV-visible spectrophotometry, gas chromatography/mass spectrometry, cyclic voltammetry and 13C nuclear magnetic resonance (NMR) showed that the treatment seems to efficiently eliminate phenolic compounds from OMW. It was concluded that electrochemical oxidation in a modified Grignard reactor is a promising process for the destruction of all phenolic compounds present in OMW. Among the monitoring analytical tools applied, cyclic voltammetry and 13C NMR are introduced for the first time to follow the progress of OMW treatment, and they gave a close insight into polyphenol disappearance.

  1. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    Science.gov (United States)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool in the analytical laboratory, since they allow sensitive measurements of pollutants to be carried out with easily available instrumentation. There is a large number of such procedures involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very low amounts (only a few microliters) of organic solvents, or even none at all. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful both for inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since this instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology, and gives examples of the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.

  2. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by an Butyrylcholinesterase Activity Assay.

    Science.gov (United States)

    Pohanka, Miroslav

    2015-06-11

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated against Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.). It can be concluded that the assay is expected to be practically applicable because of the results' relevance.
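
    A minimal sketch of the colorimetric readout described above: average the red, green, and blue channels over a region of the photographed test strip. Pillow and NumPy are assumed; the file name and region of interest are placeholders, and the calibration against known BChE activities is left out.

        import numpy as np
        from PIL import Image

        # placeholder file name; region-of-interest bounds are assumed
        img = np.asarray(Image.open("strip_photo.jpg").convert("RGB"), dtype=float)
        roi = img[100:200, 100:200]
        r, g, b = roi[..., 0].mean(), roi[..., 1].mean(), roi[..., 2].mean()
        print(f"mean channel values  R: {r:.1f}  G: {g:.1f}  B: {b:.1f}")
        # Enzyme activity would then be read off a calibration curve relating
        # a channel value (e.g. the red channel for indigo-blue coloration)
        # to known BChE activities.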

  3. The management and exploitation of naturally light-emitting bacteria as a flexible analytical tool: A tutorial.

    Science.gov (United States)

    Bolelli, L; Ferri, E N; Girotti, S

    2016-08-31

    Conventional detection of toxic contaminants on surfaces, in food, and in the environment takes time. Current analytical approaches to chemical detection can be of limited utility due to long detection times, high costs, and the need for a laboratory and trained personnel. A non-specific but easy, rapid, and inexpensive screening test can be useful to quickly classify a specimen as toxic or non-toxic, so that prompt appropriate measures can be taken exactly where required. Bioluminescent bacteria-based tests meet all these characteristics. Bioluminescence methods are extremely attractive because of their high sensitivity, speed, ease of implementation, and statistical significance. They are usually sensitive enough to detect the majority of pollutants toxic to humans and mammals. This tutorial provides practical guidelines for isolating, cultivating, and exploiting marine bioluminescent bacteria as a simple and versatile analytical tool. Although mostly applied to aqueous-phase samples and organic extracts, the test can also be conducted directly on soil and sediment samples so as to reflect the true toxicity due to the bioavailable fraction. Because tests can be performed with freeze-dried cell preparations, they could make a major contribution to field screening activity. They can be easily conducted in a mobile environmental laboratory and may be adaptable to miniaturized field instruments and field test kits. PMID:27506340
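
    One common way to express such a screen (assumed here, not quoted from the tutorial) is the percent inhibition of light output relative to an untreated control:

        def inhibition_percent(lum_sample: float, lum_control: float) -> float:
            """Percent loss of luminescence relative to the control."""
            return 100.0 * (1.0 - lum_sample / lum_control)

        control = 120_000   # hypothetical luminescence counts, untreated bacteria
        sample = 43_000     # counts after exposure to the test extract
        print(f"inhibition = {inhibition_percent(sample, control):.1f} %")  # 64.2 %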

  4. Atomic force microscopy as analytical tool to study physico-mechanical properties of intestinal cells

    Directory of Open Access Journals (Sweden)

    Christa Schimpel

    2015-07-01

    The small intestine is a complex system that carries out various functions. The main function of enterocytes is absorption of nutrients, whereas membranous cells (M cells) are responsible for delivering antigens/foreign substances to the mucosal lymphoid tissues. However, to get a fundamental understanding of how cellular structures contribute to physiological processes, precise knowledge about surface morphologies, cytoskeleton organization and biomechanical properties is necessary. Atomic force microscopy (AFM) was used here as a powerful tool to study the surface topographies of Caco-2 cells and M cells. Furthermore, cell elasticity (i.e., the mechanical response of a cell to a tip indentation) was elucidated by force curve measurements. Besides elasticity, adhesion was evaluated by recording the attraction and repulsion forces between the tip and the cell surface. The organization of F-actin networks was investigated via phalloidin labeling, and visualization was performed with confocal laser scanning fluorescence microscopy (CLSM) and scanning electron microscopy (SEM). The results of these various experimental techniques revealed significant differences in the cytoskeleton/microvilli arrangements and F-actin organization. Caco-2 cells displayed densely packed F-actin bundles covering the entire cell surface, indicating the formation of a well-differentiated brush border. In contrast, in M cells actin was arranged as short and/or truncated thin villi, present only at the cell edge. The elasticity of M cells was 1.7-fold higher compared to Caco-2 cells and increased significantly from the cell periphery to the nuclear region. Since elasticity can be directly linked to cell adhesion, M cells showed higher adhesion forces than Caco-2 cells. The combination of distinct experimental techniques shows that morphological differences between Caco-2 cells and M cells correlate with mechanical cell properties and provide useful information to understand
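
    The record does not spell out its fitting model, but AFM elasticity is commonly extracted by fitting the Hertz contact model F = (4/3)·E/(1-ν²)·√R·δ^(3/2) to force-indentation data; the sketch below does this on synthetic data with assumed tip parameters.

        import numpy as np

        R, nu = 20e-9, 0.5                 # tip radius (m) and Poisson ratio, assumed
        d = np.linspace(0, 200e-9, 50)     # indentation depth (m)
        E_true = 5e3                       # "true" Young's modulus, 5 kPa
        F = 4/3 * E_true/(1 - nu**2) * np.sqrt(R) * d**1.5
        F += np.random.default_rng(1).normal(0, 2e-12, F.size)   # add noise

        # linear least squares on F vs d**1.5 recovers the prefactor, hence E
        slope = np.linalg.lstsq(d[:, None]**1.5, F, rcond=None)[0][0]
        E_fit = slope * 3/4 * (1 - nu**2) / np.sqrt(R)
        print(f"fitted Young's modulus: {E_fit/1e3:.2f} kPa")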

  5. Incremental visual text analytics of news story development

    Science.gov (United States)

    Krstajic, Milos; Najm-Araghi, Mohammad; Mansmann, Florian; Keim, Daniel A.

    2012-01-01

    Online news sources produce thousands of news articles every day, reporting on local and global real-world events. New information quickly replaces the old, making it difficult for readers to put current events in the context of the past. Additionally, the stories have very complex relationships and characteristics that are difficult to model: they can be weakly or strongly connected, or they can merge or split over time. In this paper, we present a visual analytics system for exploration of news topics in dynamic information streams, which combines interactive visualization and text mining techniques to facilitate the analysis of similar topics that split and merge over time. We employ text clustering techniques to automatically extract stories from online news streams and present a visualization that: 1) shows temporal characteristics of stories in different time frames with different level of detail; 2) allows incremental updates of the display without recalculating the visual features of the past data; 3) sorts the stories by minimizing clutter and overlap from edge crossings. By using interaction, stories can be filtered based on their duration and characteristics in order to be explored in full detail with details on demand. To demonstrate the usefulness of our system, case studies with real news data are presented and show the capabilities for detailed dynamic text stream exploration.

  6. Development and Validation of a Learning Analytics Framework: Two Case Studies Using Support Vector Machines

    Science.gov (United States)

    Ifenthaler, Dirk; Widanapathirana, Chathuranga

    2014-01-01

    Interest in collecting and mining large sets of educational data on student background and performance to conduct research on learning and instruction has developed as an area generally referred to as learning analytics. Higher education leaders are recognizing the value of learning analytics for improving not only learning and teaching but also…

  7. New research directions in the development of analytical chemistry

    OpenAIRE

    Rema Matakova

    2016-01-01

    The article shows that the discovery of nanoscale elements made it possible to synthesize new chemical compounds without chemical reactions and laid the basis for the effective development of nanoanalytical chemistry over the past two decades. The article focuses on the prospective development of bioanalytical chemistry, based on reagentless sensor methods for the analysis of biochemical processes, to rapidly cure dangerous infections of the century. Unusual opportunity of development of «green» chemistr...

  8. Knowledge-based geographic information systems (KBGIS): New analytic and data management tools

    Science.gov (United States)

    Albert, T.M.

    1988-01-01

    In its simplest form, a geographic information system (GIS) may be viewed as a data base management system in which most of the data are spatially indexed, and upon which sets of procedures operate to answer queries about spatial entities represented in the data base. Utilization of artificial intelligence (AI) techniques can enhance greatly the capabilities of a GIS, particularly in handling very large, diverse data bases involved in the earth sciences. A KBGIS has been developed by the U.S. Geological Survey which incorporates AI techniques such as learning, expert systems, new data representation, and more. The system, which will be developed further and applied, is a prototype of the next generation of GIS's, an intelligent GIS, as well as an example of a general-purpose intelligent data handling system. The paper provides a description of KBGIS and its application, as well as the AI techniques involved. © 1988 International Association for Mathematical Geology.
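
    A toy illustration (not from the paper) of the core idea above: a data base of spatially indexed entities answering a spatial query, here a simple bounding-box search over point records.

        # Minimal spatial query over point records; names and coordinates
        # are invented for demonstration only.
        records = [
            {"name": "well A", "x": 3.0, "y": 4.0},
            {"name": "well B", "x": 9.5, "y": 1.2},
            {"name": "outcrop C", "x": 2.2, "y": 8.7},
        ]

        def in_bbox(rec, xmin, ymin, xmax, ymax):
            return xmin <= rec["x"] <= xmax and ymin <= rec["y"] <= ymax

        hits = [r["name"] for r in records if in_bbox(r, 0, 0, 5, 10)]
        print(hits)   # ['well A', 'outcrop C']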

  9. Knowledge-based geographic information systems (KBGIS): new analytic and data management tools

    Energy Technology Data Exchange (ETDEWEB)

    Albert, T.M.

    1988-11-01

    In its simplest form, a geographic information system (GIS) may be viewed as a data base management system in which most of the data are spatially indexed, and upon which sets of procedures operate to answer queries about spatial entities represented in the data base. Utilization of artificial intelligence (AI) techniques can enhance greatly the capabilities of a GIS, particularly in handling very large, diverse data bases involved in the earth sciences. A KBGIS has been developed by the US Geological Survey which incorporates AI techniques such as learning, expert systems, new data representation, and more. The system, which will be developed further and applied, is a prototype of the next generation of GIS's, an intelligent GIS, as well as an example of a general-purpose intelligent data handling system. The paper provides a description of KBGIS and its application, as well as the AI techniques involved.

  10. ISS Biotechnology Facility - Overview of Analytical Tools for Cellular Biotechnology Investigations

    Science.gov (United States)

    Jeevarajan, A. S.; Towe, B. C.; Anderson, M. M.; Gonda, S. R.; Pellis, N. R.

    2001-01-01

    The ISS Biotechnology Facility (BTF) platform provides scientists with a unique opportunity to carry out diverse experiments in a microgravity environment for an extended period of time. Although considerable progress has been made in preserving cells on the ISS for long periods of time for later return to Earth, future biotechnology experiments would desirably monitor, process, and analyze cells in a timely way on-orbit. One aspect of our work has been directed towards developing biochemical sensors for pH, glucose, oxygen, and carbon dioxide for a perfused bioreactor system developed at Johnson Space Center. Another aspect is the examination and identification of new and advanced commercial biotechnologies that may have applications to on-orbit experiments.

  11. Computational Tools for Accelerating Carbon Capture Process Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  12. The environment power system analysis tool development program

    Science.gov (United States)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information; a data dictionary interpreter to coordinate analysis; and a data base for storing system designs and results of analysis.

  13. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper;

    2003-01-01

    Experimental modelling is an important tool for the study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models, and experiments are important for documentation of the validity of numerical tools. In other cases numerical tools can be applied … for improvement of the reliability of physical model results. This paper demonstrates by examples that numerical modelling benefits in various ways from experimental studies (in large and small laboratory facilities). The examples range from very general hydrodynamic descriptions of wave phenomena to specific … hydrodynamic interaction with structures. The examples also show that numerical model development benefits from international co-operation and sharing of high quality results.

  14. Development of a data capture tool for researching tech entrepreneurship

    DEFF Research Database (Denmark)

    Andersen, Jakob Axel Bejbro; Howard, Thomas J.; McAloone, Tim C.

    2014-01-01

    Startups play a crucial role in exploiting the commercial advantages created by new, advanced technologies. Surprisingly, the processes by which the entrepreneur commercialises these technologies are largely undescribed - partly due to the absence of appropriate process data capture tools. This paper elucidates the requirements for such tools by drawing on knowledge of the entrepreneurial phenomenon and by building on the existing research tools used in design research. On this basis, the development of a capture method for tech startup processes is described and its potential discussed.

  15. Developing e-marketing tools : Case company: CASTA Ltd.

    OpenAIRE

    Nguyen, Chi

    2014-01-01

    This Bachelor’s thesis develops e-marketing tools for the B2C sector of CASTA Ltd. The final outcome is a set of online marketing tool guidelines that can improve business activities, especially marketing effectiveness. Given the company’s status as a novice in the online marketing field, the thesis focuses on the basic level of three specific online marketing tools, instead of covering the whole e-marketing subject. The theoretical framework first describes the concept of e...

  16. Integrated modelling as an analytical and optimisation tool for urban watershed management.

    Science.gov (United States)

    Erbe, V; Frehmann, T; Geiger, W F; Krebs, P; Londong, J; Rosenwinkel, K H; Seggelke, K

    2002-01-01

    In recent years numerical modelling has become a standard procedure for optimising the design and operation of urban wastewater systems. Since the models were developed for the subsystems independently, they did not support an integrated view of the operation of the sewer system, the wastewater treatment plant (WWTP) and the receiving water. After pointing out the benefits of an integrated approach and the possible synergy effects that may arise from analysing the interactions across the interfaces, three examples of modelling case studies carried out in Germany are introduced. With these examples we intend to demonstrate the potential of integrated models, though their development cannot be considered complete. They are set up with different combinations of self-developed and commercially available software. The aim is to analyse fluxes through the total wastewater system or to integrate pollution-based control in the upstream direction, e.g. managing the combined water retention tanks as a function of state variables in the WWTP or the receiving water. Furthermore, the interface between the sewer and the WWTP can be optimised by predictive simulations such that the combined water flow can be maximised according to the time- and dynamics-dependent state of the treatment processes. PMID:12380985

  17. Developing an Analytical Framework for Argumentation on Energy Consumption Issues

    Science.gov (United States)

    Jin, Hui; Mehl, Cathy E.; Lan, Deborah H.

    2015-01-01

    In this study, we aimed to develop a framework for analyzing the argumentation practice of high school students and high school graduates. We developed the framework in a specific context--how energy consumption activities such as changing diet, converting forests into farmlands, and choosing transportation modes affect the carbon cycle. The…

  18. Developing Tool Support for Problem Diagrams with CPN and VDM++

    DEFF Research Database (Denmark)

    Tjell, Simon; Lassen, Kristian Bisgaard

    2008-01-01

    In this paper, we describe ongoing work on the development of tool support for the formal description of domains found in Problem Diagrams. The purpose of the tool is to handle the generation of a CPN model based on a collection of Problem Diagrams. The Problem Diagrams are used for representing the … validated against structural constraints found in the Problem Diagrams. The generation and validation algorithms as well as the definitions of the two modeling formalisms are specified using VDM++.

  19. China adopts rural tourism as a development tool

    OpenAIRE

    Wo, Zhuo

    2006-01-01

    In recent years, rural tourism has become ever more prominent as a tool to increase visitors' awareness of and attraction to a destination, as well as a tool for economic development in the countryside of China. Rural tourism is a new type of tourism industry, which takes rural communities as its sites and distinctive rural production, living styles and idyllic landscapes as its objects. The writer aims to analyze the theory of the tourism life cycle proposed by Butler, current problems, types, mod...

  20. A Decision-Analytic Feasibility Study of Upgrading Machinery at a Tools Workshop

    Directory of Open Access Journals (Sweden)

    M. L. Chew Hernandez

    2012-04-01

    This paper presents the evaluation, from a Decision Analysis point of view, of the feasibility of upgrading machinery at an existing metal-forming workshop. The Integral Decision Analysis (IDA) methodology is applied to clarify the decision and develop a decision model. One of the key advantages of IDA is its careful selection of the problem frame, allowing a correct problem definition. While following most of the original IDA methodology, an addition to it is proposed in this work: using the strategic Means-Ends Objective Network as a backbone for the development of the decision model. The constructed decision model uses influence diagrams to include factual operator and vendor expertise, simulation to evaluate the alternatives, and a utility function to take into account the risk attitude of the decision maker. Three alternatives are considered: Base (no modification), CNC (installing an automatic lathe) and CF (installation of an automatic milling machine). The results are presented as a graph showing zones in which a particular alternative should be selected. The results show the potential of IDA to tackle technical decisions that are otherwise approached without due care.
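
    A hedged sketch of the final modelling step described above: comparing the alternatives by expected utility over simulated monetary outcomes, with a constant-risk-aversion (exponential) utility function. The outcome distributions and the risk tolerance are invented, not taken from the paper.

        import math
        import random

        random.seed(42)
        RISK_TOL = 50_000.0     # risk tolerance in currency units, assumed

        def utility(x: float) -> float:
            """Exponential (constant risk aversion) utility."""
            return 1.0 - math.exp(-x / RISK_TOL)

        def expected_utility(mean: float, sd: float, n: int = 10_000) -> float:
            draws = (random.gauss(mean, sd) for _ in range(n))
            return sum(utility(x) for x in draws) / n

        alternatives = {            # hypothetical profit distributions (mean, sd)
            "Base": (20_000, 5_000),
            "CNC":  (35_000, 20_000),
            "CF":   (30_000, 12_000),
        }
        for name, (mu, sd) in alternatives.items():
            print(f"{name:4s} E[U] = {expected_utility(mu, sd):.3f}")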

  1. DEVELOPMENT OF A WIRELINE CPT SYSTEM FOR MULTIPLE TOOL USAGE

    Energy Technology Data Exchange (ETDEWEB)

    Stephen P. Farrington; Martin L. Gildea; J. Christopher Bianchi

    1999-08-01

    The first phase of development of a wireline cone penetrometer system for multiple tool usage was completed under DOE award number DE-AR26-98FT40366. Cone penetrometer technology (CPT) has received widespread interest and is becoming more commonplace as a tool for environmental site characterization activities at several Department of Energy (DOE) facilities. Although CPT already offers many benefits for site characterization, the wireline system can improve CPT technology by offering greater utility and increased cost savings. Currently the use of multiple CPT tools during a site characterization (i.e. piezometric cone, chemical sensors, core sampler, grouting tool) must be accomplished by withdrawing the entire penetrometer rod string to change tools. This results in multiple penetrations being required to collect the data and samples that may be required during characterization of a site, and to subsequently seal the resulting holes with grout. The wireline CPT system allows multiple CPT tools to be interchanged during a single penetration, without withdrawing the CPT rod string from the ground. The goal of the project is to develop and demonstrate a system by which various tools can be placed at the tip of the rod string depending on the type of information or sample desired. Under the base contract, an interchangeable piezocone and grouting tool was designed, fabricated, and evaluated. The results of the evaluation indicate that success criteria for the base contract were achieved. In addition, the wireline piezocone tool was validated against ASTM standard cones, the depth capability of the system was found to compare favorably with that of conventional CPT, and the reliability and survivability of the system were demonstrated.

  2. Twenty-one years of microemulsion electrokinetic chromatography (1991-2012): a powerful analytical tool.

    Science.gov (United States)

    Yang, Hua; Ding, Yao; Cao, Jun; Li, Ping

    2013-05-01

    Microemulsion electrokinetic chromatography (MEEKC) is a CE separation technique which utilizes buffered microemulsions as the separation media. In the past two decades, MEEKC has blossomed into a powerful separation technique for the analysis of a wide range of compounds. Pseudostationary phase composition is critical to successful resolution in EKC, and several variables can be optimized, including surfactant/co-surfactant/oil type and concentration, buffer content, and pH value. Additionally, MEEKC coupled with online sample preconcentration approaches can significantly improve detection sensitivity. This review comprehensively describes the development of MEEKC over the period 1991 to 2012. Areas covered include basic theory, microemulsion composition, methods for improving resolution and enhancing sensitivity, detection techniques, and applications of MEEKC. PMID:23463608

  3. Analytical tools for the study of cellular glycosylation in the immune system

    Directory of Open Access Journals (Sweden)

    Yvette eVan Kooyk

    2013-12-01

    Full Text Available It is becoming increasingly clear that glycosylation plays important role in intercellular communication within the immune system. Glycosylation-dependent interactions are crucial for the innate and adaptive immune system and regulate immune cell trafficking, synapse formation, activation, and survival. These functions take place by the cis or trans interaction of lectins with glycans. Classical immunological and biochemical methods have been used for the study of lectin function; however, the investigation of their counterparts, glycans, requires very specialized methodologies that have been extensively developed in the past decade within the Glycobiology scientific community. This Mini-Review intends to summarize the available technology for the study of glycan biosynthesis, its regulation and characterization for their application to the study of glycans in Immunology.

  4. TENTube: A video-based connection tool supporting competence development

    NARCIS (Netherlands)

    Angehrn, Albert; Maxwell, Katrina

    2008-01-01

    Angehrn, A. A., & Maxwell, K. (2008). TENTube: A video-based connection tool supporting competence development. In H. W. Sligte & R. Koper (Eds.), Proceedings of the 4th TENCompetence Open Workshop. Empowering Learners for Lifelong Competence Development: pedagogical, organisational and technologica

  5. Development of parallel/serial program analyzing tool

    International Nuclear Information System (INIS)

    Japan Atomic Energy Research Institute has been developing 'KMtool', a parallel/serial program analyzing tool, in order to promote the parallelization of science and engineering computation programs. KMtool analyzes the performance of programs written in FORTRAN77 and MPI, and it reduces the effort required for parallelization. This paper describes the development purpose, design, utilization and evaluation of KMtool. (author)

  6. Computer-based tools to support curriculum developers

    NARCIS (Netherlands)

    Nieveen, Nienke; Gustafson, Kent

    2000-01-01

    Since the early 1990s, an increasing number of people have been interested in supporting the complex tasks of the curriculum development process with computer-based tools. ‘Curriculum development’ refers to an intentional process or activity directed at (re)designing, developing and implementing curricula.

  7. Analytical developments for definition and prediction of USB noise

    Science.gov (United States)

    Reddy, N. N.; Tam, C. K. W.

    1976-01-01

    A systematic acoustic data base and associated flow data are used to identify the noise-generating mechanisms of upper surface blown flap configurations of short takeoff and landing aircraft. A theory is developed for the radiated sound field of the highly sheared flow of the trailing edge wake. An empirical method for predicting the noise levels is also developed using extensive experimental data and physical reasoning.

  8. Integration of Environmental Analytical Chemistry with Environmental Law: The Development of a Problem-Based Laboratory.

    Science.gov (United States)

    Cancilla, Devon A.

    2001-01-01

    Introduces an undergraduate-level problem-based analytical chemistry laboratory course integrated with an environmental law course. Aims to develop students' understanding of the use of environmental indicators for environmental evaluation. (Contains 30 references.) (YDS)

  9. Filmes de metal-hexacianoferrato: uma ferramenta em química analítica Metal-hexacyanoferrate films: a tool in analytical Chemistry

    Directory of Open Access Journals (Sweden)

    Ivanildo Luiz de Mattos

    2001-04-01

    Chemically modified electrodes based on hexacyanometalate films are presented as a tool in analytical chemistry. The use of amperometric sensors and/or biosensors based on metal-hexacyanoferrate films is a growing trend. This article reviews applications of these films for the analytical determination of both inorganic (e.g. As3+, S2O3(2-)) and organic (e.g. cysteine, hydrazine, ascorbic acid, glutathione, glucose) compounds.

  10. Implementing WAI Authoring Tool Accessibility Guidelines in Developing Adaptive Elearning

    Directory of Open Access Journals (Sweden)

    Mahieddine Djoudi

    2012-09-01

    Adaptive learning technology allows for the development of more personalized online learning experiences, with materials that adapt to student performance and skill level. The term “adaptive” is also used to describe assistive technologies that make online courses usable for learners with disabilities and special needs. Authoring tools can enable, encourage, and assist authors in the creation of elearning content. Because most Web-based adaptive learning content is created using authoring tools, those tools should be accessible to authors regardless of disability, and they should support and encourage authors in creating accessible elearning content. This paper presents an authoring tool designed for developing accessible adaptive elearning. The authoring tool, dedicated to Algerian universities, is designed to satisfy the W3C/WAI Authoring Tool Accessibility Guidelines (ATAG) and to support collaboration among teachers when building elearning courses. After presenting the W3C/WAI accessibility guidelines, the collaborative authoring tool is outlined.

  11. Work and Learner Identity - Developing an analytical framework

    DEFF Research Database (Denmark)

    Kondrup, Sissel

    The paper addresses the need to develop a theoretical framework able to grasp how engagement in work forms certain conditions for workers to meet the obligation to form a pro-active learner identity, position themselves as educable subjects and engage in lifelong learning. An obligation that has bec...

  12. Temperature-controlled micro-TLC: a versatile green chemistry and fast analytical tool for separation and preliminary screening of steroids fraction from biological and environmental samples.

    Science.gov (United States)

    Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J

    2011-11-01

    This paper is a continuation of our previous research focusing on the development of micro-TLC methodology under temperature-controlled conditions. The main goal of the present paper is to demonstrate the separation and detection capability of the micro-TLC technique using simple analytical protocols without multi-step sample pre-purification. One of the advantages of planar chromatography over its column counterpart is that each TLC run can be performed on previously unused stationary phase. Therefore, it is possible to fractionate or separate complex samples characterized by heavy biological matrix loading. In the present studies, components of interest, mainly steroids, were isolated from biological samples such as fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by a freeze-drying method. Low-molecular-mass compounds with polarity ranging from estetrol to progesterone derived from environmental samples (lake water, untreated and treated sewage waters) were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from surface water of the Middle Pomerania region in northern Poland can easily be observed on the obtained micro-TLC chromatograms. This approach can be useful as a simple and inexpensive complementary method for fast control and screening of treated sewage water discharged by municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimization of SPE or HPLC systems working under RP conditions. Furthermore, we demonstrated that a micro-TLC-based analytical approach can be applied as an effective method for the internal standard (IS) substance search. Generally, the described methodology can be applied for fast fractionation or screening of the

  13. Development of a Safety Management Web Tool for Horse Stables.

    Science.gov (United States)

    Leppälä, Jarkko; Kolstrup, Christina Lunner; Pinzke, Stefan; Rautiainen, Risto; Saastamoinen, Markku; Särkijärvi, Susanna

    2015-01-01

    Managing a horse stable involves risks, which can have serious consequences for the stable, employees, clients, visitors and horses. Existing industrial or farm production risk management tools are not directly applicable to horse stables and they need to be adapted for use by managers of different types of stables. As a part of the InnoEquine project, an innovative web tool, InnoHorse, was developed to support horse stable managers in business, safety, pasture and manure management. A literature review, empirical horse stable case studies, expert panel workshops and stakeholder interviews were carried out to support the design. The InnoHorse web tool includes a safety section containing a horse stable safety map, stable safety checklists, and examples of good practices in stable safety, horse handling and rescue planning. This new horse stable safety management tool can also help in organizing work processes in horse stables in general.

  14. Development of a Safety Management Web Tool for Horse Stables

    Directory of Open Access Journals (Sweden)

    Jarkko Leppälä

    2015-11-01

    Managing a horse stable involves risks, which can have serious consequences for the stable, employees, clients, visitors and horses. Existing industrial or farm production risk management tools are not directly applicable to horse stables and they need to be adapted for use by managers of different types of stables. As a part of the InnoEquine project, an innovative web tool, InnoHorse, was developed to support horse stable managers in business, safety, pasture and manure management. A literature review, empirical horse stable case studies, expert panel workshops and stakeholder interviews were carried out to support the design. The InnoHorse web tool includes a safety section containing a horse stable safety map, stable safety checklists, and examples of good practices in stable safety, horse handling and rescue planning. This new horse stable safety management tool can also help in organizing work processes in horse stables in general.

  15. Analytical tools for assessing land degradation and its impact on soil quality

    Science.gov (United States)

    Bindraban, P. S.; Mantel, S.; Bai, Z.; de Jong, R.

    2010-05-01

    affects nutrient availability; in 20% of the potential maize growing areas productivity declined by more than 50%. Overall, hydraulic soil functions were less affected by erosion in Kenya, although rain-fed yield declines exceeded 50% on very steep lands. The simulated loss of topsoil in the Uruguay case mostly affected soil physical properties, causing a reduction in rainfed wheat yields; soil fertility status was little affected. In this paper we reflect on the use and effectiveness of these two approaches and discuss options for their (partial) integration as a means to better quantify the extent and degree of degradation and the effects on soil quality. References: Bai ZG, Dent DL, Olsson L and Schaepman ME 2008. Proxy global assessment of land degradation. Soil Use and Management 24, 223-234. Bindraban PS, Stoorvogel JJ, Jansen DM, Vlaming J and Groot JJR 2000. Land quality indicators for sustainable land management: proposed method for yield gap and soil nutrient balance. Agriculture, Ecosystems and Environment 81, 103-112. Mantel S and van Engelen VWP 1999. Assessment of the impact of water erosion on productivity of maize in Kenya: an integrated modelling approach. Land Degradation & Development 10, 577-592. Mantel S, van Engelen VWP, Molfino JH and Resink JW 2000. Exploring biophysical potential and sustainability of wheat cultivation in Uruguay at the national level. Soil Use and Management 16, 270-278.

  16. On the Development of Parameterized Linear Analytical Longitudinal Airship Models

    Science.gov (United States)

    Kulczycki, Eric A.; Johnson, Joseph R.; Bayard, David S.; Elfes, Alberto; Quadrelli, Marco B.

    2008-01-01

    In order to explore Titan, a moon of Saturn, airships must be able to traverse the atmosphere autonomously. To achieve this, an accurate model and accurate control of the vehicle must be developed so that it is understood how the airship will react to specific sets of control inputs. This paper explains how longitudinal aircraft stability derivatives can be used with airship parameters to create a linear model of the airship solely by combining geometric and aerodynamic airship data. This method does not require system identification of the vehicle. All of the required data can be derived from computational fluid dynamics and wind tunnel testing. This alternate method of developing dynamic airship models will reduce time and cost. Results are compared to other stable airship dynamic models to validate the methods. Future work will address a lateral airship model using the same methods.
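
    As an illustrative sketch only (the state variables and derivative names below follow standard longitudinal conventions and are not reproduced from the paper), such a linear model can be written in LaTeX as

      \dot{\mathbf{x}} = A\,\mathbf{x} + B\,\mathbf{u}, \qquad \mathbf{x} = [\,u \;\; w \;\; q \;\; \theta\,]^{\mathsf{T}},

    where the entries of $A$ are dimensional stability derivatives assembled from geometric and aerodynamic data (e.g. $A_{11} = X_u/m$ in the axial-velocity row, $M_q/I_{yy}$ in the pitch-rate row), so no system identification of the flying vehicle is required.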

  17. The simple analytics of oligopoly banking in developing economies

    OpenAIRE

    Khemraj, Tarron

    2010-01-01

    Previous studies have documented the tendency for the commercial banking sector of many developing economies to be highly liquid and characterised by a persistently high interest rate spread. This paper embeds these stylised facts in an oligopoly model of the banking firm. The paper derives both the loan and deposit rates as mark-ups over a relatively safe foreign interest rate. Then, using a diagrammatic framework, the paper provides an analysis of: (i) the distribution of financ...
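
    As an illustration only (the paper's exact functional forms are not reproduced here), a mark-up structure of the kind described can be written in LaTeX as

      r_L = r^{*} + \mu_L(n), \qquad r_D = r^{*} - \mu_D(n),

    where $r^{*}$ is the safe foreign interest rate and the mark-ups $\mu_L$, $\mu_D$ shrink as the number of banks $n$ grows, so the spread $r_L - r_D = \mu_L + \mu_D$ remains persistently high under oligopoly.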

  18. Using competences and competence tools in workforce development.

    Science.gov (United States)

    Green, Tess; Dickerson, Claire; Blass, Eddie

    The NHS Knowledge and Skills Framework (KSF) has been a driving force in the move to competence-based workforce development in the NHS. Skills for Health has developed national workforce competences that aim to improve behavioural performance, and in turn increase productivity. This article describes five projects established to test Skills for Health national workforce competences, electronic tools and products in different settings in the NHS. Competences and competence tools were used to redesign services, develop job roles, identify skills gaps and develop learning programmes. Reported benefits of the projects included increased clarity and a structured, consistent and standardized approach to workforce development. Findings from the evaluation of the tools were positive in terms of their overall usefulness and provision of related training/support. Reported constraints of using the competences and tools included issues relating to their availability, content and organization. It is recognized that a highly skilled and flexible workforce is important to the delivery of high-quality health care. These projects suggest that Skills for Health competences can be used as a 'common currency' in workforce development in the UK health sector. This would support the need to adapt rapidly to changing service needs. PMID:21072016

  19. Human Rights as a Tool for Sustainable Development

    OpenAIRE

    Manuel Couret Branco; Pedro Damião Henriques

    2009-01-01

    In poor as well as in rich countries there is a fear that environmentally sustainable development might be contradictory to development in general, and to equitable development in particular. There could indeed be a contradiction between environmental and social sustainability, with too much care for the environment eventually leading to forgetting about the people. The purpose of this paper is to explore institutional principles and tools that allow the conciliation between environmental and social sustainability.

  20. Phonological assessment and analysis tools for Tagalog: Preliminary development.

    Science.gov (United States)

    Chen, Rachelle Kay; Bernhardt, B May; Stemberger, Joseph P

    2016-01-01

    Information and assessment tools concerning Tagalog phonological development are minimally available. The current study thus sets out to develop elicitation and analysis tools for Tagalog. A picture elicitation task was designed with a warm-up, screener and two extension lists, one with more complex and one with simpler words. A nonlinear phonological analysis form was adapted from English (Bernhardt & Stemberger, 2000) to capture key characteristics of Tagalog. The tools were piloted on a primarily Tagalog-speaking 4-year-old boy living in a Canadian-English-speaking environment. The data provided initial guidance for revision of the elicitation tool (available at phonodevelopment.sites.olt.ubc.ca). The analysis provides preliminary observations about possible expectations for primarily Tagalog-speaking 4-year-olds in English-speaking environments: Lack of mastery for tap/trill 'r', and minor mismatches for vowels, /l/, /h/ and word stress. Further research is required in order to develop the tool into a norm-referenced instrument for Tagalog in both monolingual and multilingual environments. PMID:27096390

  1. The South African dysphagia screening tool (SADS): A screening tool for a developing context

    Directory of Open Access Journals (Sweden)

    Calli Ostrofsky

    2016-02-01

    Background: Notwithstanding its value, there are challenges and limitations to implementing a dysphagia screening tool from developed contexts in a developing context. The need for a reliable and valid screening tool for dysphagia that considers context, systemic rules and resources was identified, to prevent further medical compromise, optimise dysphagia prognosis and ultimately hasten patients’ return to home or work. Methodology: To establish the validity and reliability of the South African dysphagia screening tool (SADS) for acute stroke patients accessing government hospital services. The study was a quantitative, non-experimental, correlational cross-sectional design with a retrospective component. Convenience sampling was used to recruit 18 speech-language therapists and 63 acute stroke patients from three South African government hospitals. The SADS consists of 20 test items and was administered by speech-language therapists. Screening was followed by a diagnostic dysphagia assessment. The administrator of the tool was not involved in completing the diagnostic assessment, to eliminate bias and prevent contamination of results from screener to diagnostic assessment. Sensitivity, validity and efficacy of the screening tool were evaluated against the results of the diagnostic dysphagia assessment. Cohen’s kappa measured inter-rater agreement between the results of the SADS and the diagnostic assessment. Results and conclusion: The SADS was shown to be valid and reliable. Cohen’s kappa indicated high inter-rater reliability, and the tool showed high sensitivity and adequate specificity in detecting dysphagia amongst acute stroke patients who were at risk for dysphagia. The SADS was characterised by concurrent, content and face validity. As a first step in establishing contextual appropriateness, the SADS is a valid and reliable screening tool that is sensitive in identifying stroke patients at risk for dysphagia within government
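
    A minimal sketch of the agreement statistics named above (sensitivity, specificity and Cohen's kappa), computed from hypothetical screening and diagnostic outcomes; this is illustrative only and not the SADS scoring procedure:

      # 1 = dysphagia flagged/present, 0 = absent (hypothetical data)
      screen    = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
      diagnosis = [1, 1, 0, 0, 0, 0, 1, 1, 0, 1]

      tp = sum(s == 1 and d == 1 for s, d in zip(screen, diagnosis))
      tn = sum(s == 0 and d == 0 for s, d in zip(screen, diagnosis))
      fp = sum(s == 1 and d == 0 for s, d in zip(screen, diagnosis))
      fn = sum(s == 0 and d == 1 for s, d in zip(screen, diagnosis))

      sensitivity = tp / (tp + fn)                 # true-positive rate
      specificity = tn / (tn + fp)                 # true-negative rate

      n = len(screen)
      po = (tp + tn) / n                           # observed agreement
      pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2  # chance agreement
      kappa = (po - pe) / (1 - pe)                 # Cohen's kappa
      print(sensitivity, specificity, kappa)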

  2. The Development of a Tool for Sustainable Building Design:

    DEFF Research Database (Denmark)

    Tine Ring Hansen, Hanne; Knudstrup, Mary-Ann

    2009-01-01

    The understanding of sustainable building has changed over time, along with the architectural interpretation of sustainability. The paper presents the results of a comparative analysis of the indicators found in different internationally acclaimed and Danish certification schemes and standards for sustainable buildings, as well as an analysis of the relationship between the different approaches to sustainable building design (e.g. low-energy, environmental, green building, solar architecture, bio-climatic architecture) and these indicators. The paper furthermore discusses how sustainable architecture will gain more focus in the coming years, establishing the need for the development of a new tool and methodology, and describes the background and considerations involved in the development of a design support tool for sustainable building design. A tool which considers

  3. An assessment tool for developing healthcare managerial skills and roles.

    Science.gov (United States)

    Guo, Kristina L

    2003-01-01

    This article is based on a study to identify, and by doing so help develop, the skills and roles of senior-level healthcare managers related to the needs of the current healthcare environment. To classify these roles and skills, a qualitative study was conducted to examine the literature on forces in the healthcare environment and their impact on managers. Ten senior managers were interviewed, revealing six roles as the most crucial to their positions along with the skills necessary to perform those roles. A pilot study was conducted with these senior managers to produce a final assessment tool. This assessment tool helps managers to identify strengths and weaknesses, develop in deficient areas, and promote competence in all areas as demanded by the market and organization. This tool can be used by organizations in the recruitment process and in the training process.

  4. Development of culturally sensitive dialog tools in diabetes education

    Directory of Open Access Journals (Sweden)

    Nana Folmann Hempler

    2015-01-01

    Person-centeredness is a goal in diabetes education, and cultural influences are important to consider in this regard. This report describes the use of a design-based research approach to develop culturally sensitive dialog tools to support person-centered dietary education targeting Pakistani immigrants in Denmark with type 2 diabetes. The approach appears to be a promising method to develop dialog tools for patient education that are culturally sensitive, thereby increasing their acceptability among ethnic minority groups. The process also emphasizes the importance of adequate training and competencies in the application of dialog tools and of alignment between researchers and health care professionals with regards to the educational philosophy underlying their use.

  5. Development of a Framework for Sustainable Outsourcing: Analytic Balanced Scorecard Method (A-BSC)

    OpenAIRE

    Fabio De Felice; Antonella Petrillo; Claudio Autorino

    2015-01-01

    Nowadays, many enterprises choose to outsource their non-core business to other enterprises to reduce costs and increase efficiency. Many enterprises choose to outsource their supply chain management (SCM) to a third-party organization in order to improve their services. The paper proposes an integrated, multicriteria tool useful for monitoring and improving performance in an outsourced supply chain. The Analytic Balanced Scorecard method (A-BSC) is proposed as an effective method...
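
    The abstract does not spell out the aggregation mathematics; assuming the "Analytic" component follows the Analytic Hierarchy Process (AHP), a minimal sketch of deriving weights for the scorecard perspectives from a pairwise-comparison matrix might look like this (matrix values hypothetical):

      import numpy as np

      # Hypothetical pairwise-comparison matrix (Saaty 1-9 scale) for three
      # balanced-scorecard perspectives; reciprocal by construction.
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                            # criteria weights (sum to 1)

      n = A.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)    # consistency index
      cr = ci / 0.58                          # 0.58 = random index for n = 3
      print("weights:", w, "consistency ratio:", cr)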

  6. Development of a green remediation tool in Japan.

    Science.gov (United States)

    Yasutaka, Tetsuo; Zhang, Hong; Murayama, Koki; Hama, Yoshihito; Tsukada, Yasuhisa; Furukawa, Yasuhide

    2016-09-01

    The green remediation assessment tool for Japan (GRATJ) presented in this study is a spreadsheet-based software package developed to facilitate comparisons of the environmental impacts associated with various countermeasures against contaminated soil in Japan. This tool uses a life-cycle assessment-based model to calculate inventory inputs/outputs throughout the activity life cycle during remediation. Processes of 14 remediation methods for heavy metal contamination and 12 for volatile organic compound contamination are built into the tool. This tool can evaluate 130 inventory inputs/outputs and easily integrate those inputs/outputs into 9 impact categories, 4 integrated endpoints, and 1 index. Comparative studies can be performed by entering basic data associated with a target site. The integrated results can be presented in a simpler and clearer manner than the results of an inventory analysis. As a case study, an arsenic-contaminated soil remediation site was examined using this tool. Results showed that the integrated environmental impacts were greater with onsite remediation methods than with offsite ones. Furthermore, the contributions of CO2 to global warming, SO2 to urban air pollution, and crude oil to resource consumption were greater than other inventory inputs/outputs. The GRATJ has the potential to improve green remediation and can serve as a valuable tool for decision makers and practitioners in selecting countermeasures in Japan. PMID:26803220
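
    A minimal sketch of the aggregation chain the abstract describes (inventory flows rolled up into impact categories and a single index); all flow names, characterization factors and weights below are hypothetical placeholders, and the real tool's normalization steps are omitted:

      # inventory flows -> impact categories -> weighted single index
      inventory = {"CO2_kg": 1200.0, "SO2_kg": 3.5, "crude_oil_kg": 150.0}

      characterization = {                    # category: [(flow, factor), ...]
          "global_warming": [("CO2_kg", 1.0)],
          "urban_air_pollution": [("SO2_kg", 1.0)],
          "resource_consumption": [("crude_oil_kg", 1.0)],
      }
      weights = {"global_warming": 0.5,
                 "urban_air_pollution": 0.3,
                 "resource_consumption": 0.2}

      impacts = {cat: sum(inventory[flow] * f for flow, f in flows)
                 for cat, flows in characterization.items()}
      index = sum(weights[c] * impacts[c] for c in impacts)
      print(impacts, index)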

  7. ProteoLens: a visual analytic tool for multi-scale database-driven biological network data mining

    OpenAIRE

    Sivachenko Andrey Y; Huan Tianxiao; Harrison Scott H; Chen Jake Y

    2008-01-01

    Background: New systems biology studies require researchers to understand how interplay among myriads of biomolecular entities is orchestrated in order to achieve high-level cellular and physiological functions. Many software tools have been developed in the past decade to help researchers visually navigate large networks of biomolecular interactions with built-in template-based query capabilities. To further advance researchers' ability to interrogate global physiological states of c...

  8. THz spectroscopy: An emerging technology for pharmaceutical development and pharmaceutical Process Analytical Technology (PAT) applications

    Science.gov (United States)

    Wu, Huiquan; Khan, Mansoor

    2012-08-01

    As an emerging technology, THz spectroscopy has gained increasing attention in the pharmaceutical area during the last decade. This attention is due to the fact that (1) it provides a promising alternative approach for in-depth understanding of both intermolecular interactions among pharmaceutical molecules and pharmaceutical product quality attributes; (2) it provides a promising alternative approach for enhanced process understanding of certain pharmaceutical manufacturing processes; and (3) it aligns with the FDA pharmaceutical quality initiatives, most notably the Process Analytical Technology (PAT) initiative. In this work, the current status of and progress made in using THz spectroscopy for pharmaceutical development and pharmaceutical PAT applications are reviewed. In the spirit of demonstrating the utility of a first-principles modeling approach for addressing the model validation challenge and reducing unnecessary model validation "burden" in facilitating THz pharmaceutical PAT applications, two scientific case studies based on published THz spectroscopy measurement results are created and discussed. Furthermore, other technical challenges and opportunities associated with adopting THz spectroscopy as a pharmaceutical PAT tool are highlighted.

  9. Quality management in development of hard coatings on cutting tools

    Directory of Open Access Journals (Sweden)

    M. Soković

    2007-09-01

    Purpose: In this paper, an attempt is made to extend the general model of quality management to the field of developing and introducing hard coatings on cutting tools. Design/methodology/approach: The conventional PVD and CVD methods have their limitations, and innovative processes are essential within the framework of an environmentally oriented quality management system. Meeting the requirements of the ISO 9000 and ISO 14000 standards, the proposed model ensures the fulfilment of the basic requirements leading to the required quality of preparation processes and the quality of end products (hard coatings). Findings: One of the main pre-requisites for successful industrial production is the use of quality coated cutting tools with defined mechanical and technological properties. Therefore, for the development and introduction of a new coated cutting tool (a new combination of cutting material and hard coating), it is necessary to carry out a number of studies in order to optimize the coating composition and processing procedures, and also to test the new tools in working conditions. Research limitations/implications: The requirements from industry (produce faster, better, more safely and more ecologically) force us to develop new effective tools and innovative technologies. This provides a technological challenge to scientists and engineers and increases the importance of knowing several scientific disciplines. Practical implications: The quality of a company’s product directly affects its competitive position, profitability and credibility in the market. A quality management system must undergo a process of continuous improvement, which extends from the deployment of preventive quality assurance methods to the application of closed-loop quality circuits. Originality/value: Design of an original and structured model of a quality management system for the successful development, production and introduction of new coated tools in practice.

  10. Developing shape analysis tools to assist complex spatial decision making

    International Nuclear Information System (INIS)

    The objective of this research was to develop and implement a shape identification measure within a geographic information system, specifically one that incorporates analytical modeling for site location planning. The application that was developed incorporated a location model within a raster-based GIS, which helped address critical performance issues for the decision support system. Binary matrices, which approximate the object's geometrical form, are passed over the gridded data structure and allow identification of irregularly and regularly shaped objects. Lastly, the issue of shape rotation is addressed and resolved by constructing unique matrices corresponding to the object's orientation.
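
    A minimal sketch of the matrix-based shape identification described above: a binary template approximating the object's form is slid across the raster grid, and rotated copies of the template handle orientation (the grid and template here are hypothetical):

      import numpy as np

      grid = np.zeros((8, 8), dtype=int)      # hypothetical raster layer
      grid[2:4, 3:6] = 1                      # a 2x3 rectangular "object"
      template = np.ones((2, 3), dtype=int)   # binary matrix approximating it

      def matches(grid, tmpl):
          th, tw = tmpl.shape
          hits = []
          for r in range(grid.shape[0] - th + 1):
              for c in range(grid.shape[1] - tw + 1):
                  if np.array_equal(grid[r:r + th, c:c + tw], tmpl):
                      hits.append((r, c))
          return hits

      for rot in range(4):                    # one unique matrix per orientation
          print(rot * 90, "deg:", matches(grid, np.rot90(template, rot)))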

  11. Assessment Tool Development for Extracurricular SMET Programs for Girls

    Science.gov (United States)

    House, Jody; Johnson, Molly; Borthwick, Geoffrey

    Many different programs have been designed to increase girls' interest in and exposure to science, mathematics, engineering, and technology (SMET). Two of these programs are discussed and contrasted in the dimensions of length, level of science content, pedagogical approach, degree of self- vs. parent-selected participants, and amount of community-building content. Two different evaluation tools were used. For one program, a modified version of the University of Pittsburgh's undergraduate engineering attitude assessment survey was used. Program participants' responses were compared to those from a fifth grade, mixed-sex science class. The only gender difference found was in the area of parental encouragement. The girls in the special class were more encouraged to participate in SMET areas. For the second program, a new age-appropriate tool developed specifically for these types of programs was used, and the tool itself was evaluated. The results indicate that the new tool has construct validity. On the basis of these preliminary results, a long-term plan for the continued development of the assessment tool is outlined.

  12. Review of the Development of Learning Analytics Applied in College-Level Institutes

    Directory of Open Access Journals (Sweden)

    Ken-Zen Chen

    2014-07-01

    This article focuses on the recent development of Learning Analytics applied to higher education institutional big data. It addresses the current state of Learning Analytics, creates a shared understanding, and clarifies misconceptions about the field. The article also reviews prominent examples from peer institutions that are conducting analytics, identifies their data and methodological frameworks, and comments on market vendors and not-for-profit initiatives. Finally, it suggests an implementation agenda for potential institutions and their stakeholders by drafting the necessary preparations and creating iterative implementation flows.

  13. Developing mobile educational apps: development strategies, tools and business models

    Directory of Open Access Journals (Sweden)

    Serena Pastore

    The mobile world is a growing and evolving market in all its aspects, from hardware and networks to operating systems and applications. Mobile applications, or apps, are becoming the new frontier of software development, since today's digital users use mobile devices...

  14. Development of a New Measurement Tool for Individualism and Collectivism

    Science.gov (United States)

    Shulruf, Boaz; Hattie, John; Dixon, Robyn

    2007-01-01

    A new measurement tool for individualism and collectivism has been developed to address critical methodological issues in this field of social psychology. This new measure, the Auckland Individualism and Collectivism Scale (AICS), defines three dimensions of individualism: (a) responsibility (acknowledging one's responsibility for one's actions),…

  15. Educational Innovation with Learning Networks: some pertinent tools and developments

    NARCIS (Netherlands)

    Sloep, Peter; Berlanga, Adriana; Greller, Wolfgang; Stoyanov, Slavi; Retalis, Symeon; Van der Klink, Marcel; Hensgens, Jan

    2011-01-01

    Sloep, P. B., Berlanga, A. J., Greller, W., Stoyanov, S., Retalis, S., Van der Klink, M. et al. (2011). Educational Innovation with Learning Networks: some pertinent tools and developments. Paper presented at the 2nd International Conference on Technology Enhanced Learning, Quality of Teaching and R

  16. Method and Tools for Development of Advanced Instructional Systems

    NARCIS (Netherlands)

    Arend, J. van der; Riemersma, J.B.J.

    1994-01-01

    The application of advanced instructional systems (AISs), like computer-based training systems, intelligent tutoring systems and training simulators, is widespread within the Royal Netherlands Army. As a consequence, there is a growing interest in methods and tools to develop effective and efficient AISs.

  17. DEVELOPING A TOOL FOR ENVIRONMENTALLY PREFERABLE PURCHASING: JOURNAL ARTICLE

    Science.gov (United States)

    NRMRL-CIN-1246. Curran, M.A. Developing a Tool for Environmentally Preferable Purchasing. Environmental Management and Health (Filho, W.L. (Ed.), MCB University Press) 12(3):244-253 (2001). EPA/600/J-02/238, http://www.emerald-library.com/ft. 12/04/2000. LCA-based guidance wa...

  18. Developing a Decision Support System: The Software and Hardware Tools.

    Science.gov (United States)

    Clark, Phillip M.

    1989-01-01

    Describes some of the available software and hardware tools that can be used to develop a decision support system implemented on microcomputers. Activities that should be supported by software are discussed, including data entry, data coding, finding and combining data, and data compatibility. Hardware considerations include speed, storage…

  19. iMarine - Applications and tools development plan

    OpenAIRE

    Ellenbroek, Anton; Candela, Leonardo

    2012-01-01

    This report documents the strategy and plan leading to the development of specific applications and tools that in tandem with the rest of gCube technology will be used to realize the Virtual Research Environments that are expected to serve the needs of the Ecosystem Approach Community of Practice.

  1. Interactive test tool for interoperable C-ITS development

    NARCIS (Netherlands)

    Voronov, A.; Englund, C.; Bengtsson, H.H.; Chen, L.; Ploeg, J.; Jongh, J.F.C.M. de; Sluis, H.J.D. van de

    2015-01-01

    This paper presents the architecture of an Interactive Test Tool (ITT) for interoperability testing of Cooperative Intelligent Transport Systems (C-ITS). Cooperative systems are developed by different manufacturers at different locations, which makes interoperability testing a tedious task. Up until

  2. ENVIRONMENTAL ACCOUNTING: A MANAGEMENT TOOL FOR SUSTAINABLE DEVELOPMENT

    OpenAIRE

    Nicolae Virag; Dorel Mates; Doru Ioan Ardelean; Claudiu Gheorghe Feies

    2014-01-01

    The paper aims to analyze the ways in which accounting, as a social science and a management information tool, can contribute to sustainable development. The paper highlights the emergence of the environmental accounting concept, the applicability of environmental accounting, its types, and its scope and benefits.

  3. Soft x-ray microscopy - a powerful analytical tool to image magnetism down to fundamental length and time scales

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, Peter

    2008-08-01

    The magnetic properties of low-dimensional solid state matter are of the utmost interest, both scientifically and technologically. In addition to the charge of the electron, which is the basis for current electronics, taking the spin degree of freedom into account opens a new avenue toward future spintronics applications. Progress towards a better physical understanding of the mechanisms and principles involved, as well as potential applications of nanomagnetic devices, can only be achieved with advanced analytical tools. Soft X-ray microscopy, providing a spatial resolution approaching 10 nm, a time resolution currently in the sub-ns regime, and inherent elemental sensitivity, is a very promising technique for that. This article reviews recent achievements of magnetic soft X-ray microscopy through selected examples of spin torque phenomena, stochastic behavior on the nanoscale, and spin dynamics in magnetic nanopatterns. The future potential with regard to addressing fundamental magnetic length and time scales, e.g. imaging fs spin dynamics at upcoming X-ray sources, is pointed out.

  4. Solid-phase development of a 1-hydroxybenzotriazole linker for heterocycle synthesis using analytical constructs.

    Science.gov (United States)

    Scicinski, J J; Congreve, M S; Jamieson, C; Ley, S V; Newman, E S; Vinader, V M; Carr, R A

    2001-01-01

    The development of a 1-hydroxybenzotriazole linker for the synthesis of heterocyclic derivatives is described, utilizing analytical construct methodology to facilitate the analysis of resin samples. A UV-chromophore-containing analytical construct enabled the accurate determination of resin loading and the automated monitoring of key reactions using only small quantities of resin. The syntheses of an array of isoxazole derivatives are reported. PMID:11442396

  5. Developing a Conceptual Design Engineering Toolbox and its Tools

    Directory of Open Access Journals (Sweden)

    R. W. Vroom

    2004-01-01

    In order to develop a successful product, a design engineer needs to pay attention to all relevant aspects of that product. Many tools are available: software, books, websites, and commercial services. To unlock these potentially useful sources of knowledge, we are developing C-DET, a toolbox for conceptual design engineering. The idea of C-DET is that designers are supported by a system that provides them with a knowledge portal on the one hand, and a system to store their current work on the other. The knowledge portal is to help the designer find the most appropriate sites, experts, tools etc. at short notice. Such a toolbox offers opportunities to incorporate extra functionalities to support the design engineering work. One of these functionalities could be to help the designer reach a balanced comprehension in his work. Furthermore, C-DET enables researchers in the area of design engineering, and design engineers themselves, to find each other or their work earlier and more easily. Newly developed design tools that can be used by design engineers but have not yet been developed up to a commercial level could be linked to by C-DET. In this way these tools can be evaluated at an early stage by design engineers who would like to use them. This paper describes the first prototypes of C-DET, an example of the development of a design tool that enables designers to forecast the use process, and an example of the future functionalities of C-DET such as balanced comprehension.

  6. Tool for test driven development of JavaScript applications

    OpenAIRE

    Stamać, Gregor

    2015-01-01

    This thesis describes the implementation of a tool for testing JavaScript code. The tool is designed to support test-driven development of JavaScript-based applications; it is therefore important to display test results as quickly as possible. The thesis is divided into four parts. The first part describes the JavaScript environment: a brief history of the JavaScript language, its prevalence, strengths and weaknesses. This section also describes the TypeScript programming language, which is a superset of JavaScript.

  7. Designing the user experience of game development tools

    CERN Document Server

    Lightbown, David

    2015-01-01

    Contents include: The Big Green Button; My Story; Who Should Read this Book?; Companion Website and Twitter Account; Before we Begin; Welcome to Designing the User Experience of Game Development Tools; What Will We Learn in This Chapter?; What Is This Book About?; Defining User Experience; The Value of Improving the User Experience of Our Tools; Parallels Between User Experience and Game Design; How Do People Benefit From an Improved User Experience?; Finding the Right Balance; Wrapping Up; The User-Centered Design Process; What Will We

  8. Developing a financial simulation tool as a web application

    OpenAIRE

    Neupane, Suraj

    2015-01-01

    “Kunnan Taitoa Oy”, a Finnish municipal financial expert, commissioned an upgrade of its financial simulation tool from its current spreadsheet form to a web application. The principles of open source served as the foundation of software development for a team of Haaga-Helia students who participated in the project ‘Taitoa’. The project aimed to deliver a working version of the web application. This thesis documents the process of application development, and the thesis itself is a project-based...

  9. Analytical Validation of Accelerator Mass Spectrometry for Pharmaceutical Development: the Measurement of Carbon-14 Isotope Ratio.

    Energy Technology Data Exchange (ETDEWEB)

    Keck, B D; Ognibene, T; Vogel, J S

    2010-02-05

    Accelerator mass spectrometry (AMS) is an isotope-based measurement technology that utilizes carbon-14 labeled compounds in the pharmaceutical development process to measure compounds at very low concentrations, empowers microdosing as an investigational tool, and extends the utility of 14C-labeled compounds to dramatically lower levels. It is a form of isotope ratio mass spectrometry that can provide either measurements of total compound equivalents or, when coupled to separation technology such as chromatography, quantitation of specific compounds. The properties of AMS as a measurement technique are investigated here, and the parameters of method validation are shown. AMS, independent of any separation technique to which it may be coupled, is shown to be accurate, linear, precise, and robust. As the sensitivity and universality of AMS are constantly being explored and expanded, this work underpins many areas of pharmaceutical development, including drug metabolism as well as absorption, distribution and excretion of pharmaceutical compounds, as a fundamental step in drug development. The validation parameters for pharmaceutical analyses were examined for the accelerator mass spectrometry measurement of the 14C/C ratio, independent of chemical separation procedures. The isotope ratio measurement was specific (owing to the 14C label), stable across sample storage conditions for at least one year, and linear over 4 orders of magnitude, with an analytical range from one tenth Modern to at least 2000 Modern (instrument specific). Further, accuracy was excellent (between 1 and 3 percent), while precision expressed as coefficient of variation was between 1 and 6%, determined primarily by radiocarbon content and the time spent analyzing a sample. Sensitivity, expressed as LOD and LLOQ, was 1 and 10 attomoles of carbon-14 (which can be expressed as compound equivalents); for a typical small molecule labeled at 10% incorporation with 14C this corresponds to 30 fg
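
    A minimal sketch of the precision and accuracy figures quoted above, computed in the usual way from replicate isotope-ratio measurements (the replicate values are hypothetical):

      import statistics as st

      replicates = [1.02, 0.99, 1.01, 1.00, 0.98]    # measured 14C/C, in Modern
      nominal = 1.00                                 # prepared standard value

      mean = st.mean(replicates)
      cv = 100 * st.stdev(replicates) / mean         # precision as % CV
      bias = 100 * (mean - nominal) / nominal        # accuracy as % bias
      print(f"CV = {cv:.2f}%, bias = {bias:.2f}%")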

  10. Health Heritage© a web-based tool for the collection and assessment of family health history: initial user experience and analytic validity.

    Science.gov (United States)

    Cohn, W F; Ropka, M E; Pelletier, S L; Barrett, J R; Kinzie, M B; Harrison, M B; Liu, Z; Miesfeldt, S; Tucker, A L; Worrall, B B; Gibson, J; Mullins, I M; Elward, K S; Franko, J; Guterbock, T M; Knaus, W A

    2010-01-01

    A detailed family health history is currently the most potentially useful tool for diagnosis and risk assessment in clinical genetics. We developed and evaluated the usability and analytic validity of a patient-driven web-based family health history collection and analysis tool. Health Heritage© guides users through the collection of their family health history by relative, generates a pedigree, and completes risk assessment, stratification, and recommendations for 89 conditions. We compared the performance of Health Heritage to that of Usual Care using a nonrandomized cohort trial of 109 volunteers. We contrasted the completeness and sensitivity of family health history collection and risk assessments derived from Health Heritage and Usual Care to those obtained by genetic counselors and genetic assessment teams. Nearly half (42%) of the Health Heritage participants reported discovery of health risks; 63% found the information easy to understand and 56% indicated it would change their health behavior. Health Heritage consistently outperformed Usual Care in the completeness and accuracy of family health history collection, identifying 60% of the elevated risk conditions specified by the genetic team versus 24% identified by Usual Care. Health Heritage also had greater sensitivity than Usual Care when comparing the identification of risks. These results suggest a strong role for automated family health history collection and risk assessment and underscore the potential of these data to serve as the foundation for comprehensive, cost-effective personalized genomic medicine.

  11. Development of Interpretive Simulation Tool for the Proton Radiography Technique

    CERN Document Server

    Levy, M C; Wilks, S C; Ross, J S; Huntington, C M; Fiuza, F; Baring, M G; Park, H- S

    2014-01-01

    Proton radiography is a useful diagnostic of high energy density (HED) plasmas under active theoretical and experimental development. In this paper we describe a new simulation tool that interacts realistic laser-driven point-like proton sources with three-dimensional electromagnetic fields of arbitrary strength and structure and synthesizes the associated high-resolution proton radiograph. The present tool's numerical approach captures all relevant physics effects, including effects related to the formation of caustics. Electromagnetic fields can be imported from PIC or hydrodynamic codes in a streamlined fashion, and a library of electromagnetic field 'primitives' is also provided. This latter capability allows users to add a primitive, modify the field strength, rotate a primitive, and so on, while quickly generating a high-resolution radiograph at each step. In this way, our tool enables the user to deconstruct features in a radiograph and interpret them in connection to specific underlying electromagnetic...
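
    A minimal sketch of the synthesis step described above, under an impulse (thin-object) approximation: point-source protons receive a transverse angular kick from a field region and are binned on a detector plane to form the radiograph. The geometry, divergence and kick function are hypothetical placeholders, not the paper's implementation:

      import numpy as np

      rng = np.random.default_rng(1)
      n = 200_000
      theta = rng.normal(0.0, 5e-3, (n, 2))        # source divergence angles (rad)

      L1, L2 = 0.01, 0.10                          # source-object, object-detector (m)
      xy_obj = theta * L1                          # proton positions at object plane

      sigma = 200e-6                               # size of a Gaussian field blob (m)
      def kick(xy):
          # transverse angular deflection from a hypothetical field structure
          r2 = (xy ** 2).sum(axis=1, keepdims=True)
          return -5e-3 * (xy / sigma) * np.exp(-r2 / (2 * sigma ** 2))

      xy_det = xy_obj + (theta + kick(xy_obj)) * L2
      img, _, _ = np.histogram2d(xy_det[:, 0], xy_det[:, 1], bins=256,
                                 range=[[-1.5e-3, 1.5e-3], [-1.5e-3, 1.5e-3]])
      # caustics appear where the kick focuses neighboring trajectories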

  12. WP3 Prototype development for operational planning tool

    DEFF Research Database (Denmark)

    Kristoffersen, Trine; Meibom, Peter; Apfelbeck, J.;

    This report documents the model development carried out in work package 3 in the SUPWIND project. It was decided to focus on the estimation of the need for reserve power, and on the reservation of reserve power by TSOs. Reserve power is needed to cover deviations from the day-ahead forecasts of electricity load and wind power production, and to cover forced outages of power plants and transmission lines. Work has been carried out to include load uncertainty and forced outages in the two main components of the Wilmar Planning tool, namely the Scenario Tree Tool and the Joint Market Model. This work is documented in chapters 1 and 2. The inclusion of load uncertainty and forced outages in the Scenario Tree Tool enables calculation of the demand for reserve power depending on the forecast horizon. The algorithm is given in Section 3.1. The design of a modified version of the Joint Market Model enabling
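
    A minimal sketch of the horizon-dependent reserve demand calculation described above: reserve is sized to cover a high quantile of simulated load forecast error, wind forecast error and forced outages at each forecast horizon. All distributions and magnitudes are hypothetical, not the Scenario Tree Tool algorithm:

      import numpy as np

      rng = np.random.default_rng(2)
      scenarios = 10_000
      for h in [1, 6, 12, 24, 36]:                  # forecast horizon (hours)
          load_err = rng.normal(0, 50 * np.sqrt(h), scenarios)   # MW
          wind_err = rng.normal(0, 80 * np.sqrt(h), scenarios)   # MW
          outages = rng.binomial(1, 0.02, scenarios) * 400       # one 400 MW unit
          deficit = load_err + wind_err + outages
          reserve = np.quantile(deficit, 0.99)       # cover 99% of deviations
          print(f"{h:2d} h ahead: reserve demand = {reserve:.0f} MW")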

  13. National Energy Audit Tool for Multifamily Buildings Development Plan

    Energy Technology Data Exchange (ETDEWEB)

    Malhotra, Mini [ORNL]; MacDonald, Michael [Sentech, Inc.]; Accawi, Gina K [ORNL]; New, Joshua Ryan [ORNL]; Im, Piljae [ORNL]

    2012-03-01

    The U.S. Department of Energy's (DOE's) Weatherization Assistance Program (WAP) enables low-income families to reduce their energy costs by providing funds to make their homes more energy efficient. In addition, the program funds Weatherization Training and Technical Assistance (T and TA) activities to support a range of program operations. These activities include measuring and documenting performance, monitoring programs, promoting advanced techniques and collaborations to further improve program effectiveness, and training, including developing tools and information resources. The T and TA plan outlines the tasks, activities, and milestones to support the weatherization network with program implementation ramp-up efforts. Weatherization of multifamily buildings has been recognized as an effective way to ramp up weatherization efforts. To support this effort, the 2009 National Weatherization T and TA plan includes the task of expanding the functionality of the Weatherization Assistant, a DOE-sponsored family of energy audit computer programs, to perform audits for large and small multifamily buildings. This report describes the planning effort for a new multifamily energy audit tool for DOE's WAP. The functionality of the Weatherization Assistant is being expanded to also perform energy audits of small and large multifamily buildings. The process covers an assessment of needs that includes input from national experts during two national Web conferences. The assessment of needs is then translated into capability and performance descriptions for the proposed new multifamily energy audit, with some description of what might or should be provided in the new tool. The assessment of needs is combined with our best judgment to lay out a strategy for development of the multifamily tool that proceeds in stages, with features of an initial tool (version 1) and a more capable version 2 handled with currently available resources. Additional

  14. Development of tools for automatic generation of PLC code

    CERN Document Server

    Koutli, Maria; Rochez, Jacques

    This Master's thesis was performed at CERN, specifically in the EN-ICE-PLC section. The thesis describes the integration of two PLC platforms, based on the CODESYS development tool, into the CERN-defined industrial framework, UNICOS. CODESYS is a development tool for PLC programming, based on the IEC 61131-3 standard, and is adopted by many PLC manufacturers. The two PLC development environments are SoMachine from Schneider and TwinCAT from Beckhoff. The two CODESYS-compatible PLCs should be controlled by the SCADA system of Siemens, WinCC OA. The framework includes a library of Function Blocks (objects) for the PLC programs and a software for automatic generation of the PLC code based on this library, called UAB. The integration aimed to provide a solution shared by both PLC platforms and was based on the PLCopen XML scheme. The developed tools were demonstrated by creating a control application for both PLC environments and by testing the behavior of the library code.
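
    A schematic sketch of what emitting a PLCopen-XML-style skeleton for one function-block POU can look like from a generator script; the element and attribute names below follow the PLCopen TC6 layout only loosely, are not schema-complete, and the POU content is hypothetical:

      import xml.etree.ElementTree as ET

      project = ET.Element("project")
      pous = ET.SubElement(ET.SubElement(project, "types"), "pous")
      pou = ET.SubElement(pous, "pou", name="FB_Valve", pouType="functionBlock")

      iface = ET.SubElement(pou, "interface")
      invars = ET.SubElement(iface, "inputVars")
      ET.SubElement(invars, "variable", name="open_cmd")   # BOOL input (hypothetical)

      body = ET.SubElement(ET.SubElement(pou, "body"), "ST")
      body.text = "IF open_cmd THEN state := 1; END_IF;"   # structured text body

      ET.ElementTree(project).write("fb_valve.xml", xml_declaration=True)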

  15. Data-infilling in daily mean river flow records: first results using a visual analytics tool (gapIT)

    Science.gov (United States)

    Giustarini, Laura; Parisot, Olivier; Ghoniem, Mohammad; Trebs, Ivonne; Médoc, Nicolas; Faber, Olivier; Hostache, Renaud; Matgen, Patrick; Otjacques, Benoît

    2015-04-01

    Missing data in river flow records represent a loss of information and a serious drawback in water management. An incomplete time series prevents the computation of hydrological statistics and indicators, and records with data gaps are not suitable as input or validation data for hydrological or hydrodynamic modelling. In this work we present a visual analytics tool (gapIT), which supports experts in finding the most adequate data-infilling technique for daily mean river flow records. The tool performs an automated calculation of river flow estimates using different data-infilling techniques. Donor station(s) are automatically selected based on Dynamic Time Warping, geographical proximity and upstream/downstream relationships. For each gap the tool computes several flow estimates through various data-infilling techniques, including interpolation, multiple regression, regression trees and neural networks. The visual application also allows the user to select donor station(s) other than those automatically selected. The gapIT software was applied to 24 daily time series of river discharge recorded in Luxembourg over the period 01/01/2007 - 31/12/2013. The method was validated by randomly creating artificial gaps of different lengths and positions along the entire records. Using the RMSE and the Nash-Sutcliffe (NS) coefficient as performance measures, the method was evaluated by comparison with the actual measured discharge values. The application of the gapIT software to artificial gaps led to satisfactory results in terms of performance indicators (NS>0.8 for more than half of the artificial gaps). A case-by-case analysis revealed that the small number of reconstructed record gaps characterized by high RMSE values (NS<0.8) was caused by the temporary unavailability of the most appropriate donor station. On the other hand, some of the gaps characterized by a high accuracy of the reconstructed record were filled by using the data from
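
    A minimal sketch of the validation procedure described above: an artificial gap is filled from a donor station by linear regression, and the reconstruction is scored with RMSE and the Nash-Sutcliffe coefficient. The series below are synthetic placeholders, not the Luxembourg records:

      import numpy as np

      rng = np.random.default_rng(3)
      donor = 10 + 5 * np.sin(np.arange(365) / 20) + rng.normal(0, 0.5, 365)
      target = 1.8 * donor + 2 + rng.normal(0, 0.8, 365)    # discharge, m3/s

      gap = slice(100, 130)                    # artificial 30-day gap
      train = np.ones(365, bool)
      train[gap] = False

      a, b = np.polyfit(donor[train], target[train], 1)     # donor regression
      estimate = a * donor[gap] + b

      obs = target[gap]
      rmse = np.sqrt(np.mean((estimate - obs) ** 2))
      ns = 1 - np.sum((estimate - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
      print(f"RMSE = {rmse:.2f} m3/s, NS = {ns:.2f}")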

  16. Flammable gas safety program. Analytical methods development: FY 1994 progress report

    International Nuclear Information System (INIS)

    This report describes the status of developing analytical methods to account for the organic components in Hanford waste tanks, with particular focus on tanks assigned to the Flammable Gas Watch List. The methods that have been developed are illustrated by their application to samples obtained from Tank 241-SY-101 (Tank 101-SY)

  17. Microsystem design framework based on tool adaptations and library developments

    Science.gov (United States)

    Karam, Jean Michel; Courtois, Bernard; Rencz, Marta; Poppe, Andras; Szekely, Vladimir

    1996-09-01

    Besides foundry facilities, Computer-Aided Design (CAD) tools are also required to move microsystems from research prototypes to an industrial market. This paper describes a Computer-Aided-Design Framework for microsystems, based on selected existing software packages adapted and extended for microsystem technology, assembled with libraries where models are available in the form of standard cells described at different levels (symbolic, system/behavioral, layout). In microelectronics, CAD has already attained a highly sophisticated and professional level, where complete fabrication sequences are simulated and the device and system operation is completely tested before manufacturing. In comparison, the art of microsystem design and modelling is still in its infancy. However, at least for the numerical simulation of the operation of single microsystem components, such as mechanical resonators, thermo-elements, elastic diaphragms, reliable simulation tools are available. For the different engineering disciplines (like electronics, mechanics, optics, etc) a lot of CAD-tools for the design, simulation and verification of specific devices are available, but there is no CAD-environment within which we could perform a (micro-)system simulation due to the different nature of the devices. In general there are two different approaches to overcome this limitation: the first possibility would be to develop a new framework tailored for microsystem-engineering. The second approach, much more realistic, would be to use the existing CAD-tools which contain the most promising features, and to extend these tools so that they can be used for the simulation and verification of microsystems and of the devices involved. These tools are assembled with libraries in a microsystem design environment allowing a continuous design flow. The approach is driven by the wish to make microsystems accessible to a large community of people, including SMEs and non-specialized academic institutions.

  18. Development of an Analytical System for Determination of Free Acid via a Joint Method Combining Density and Conductivity Measurement

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Determination of free acid plays an important role in spent nuclear fuel reprocessing. It is necessary to develop a rapid analytical device and method for measuring free acid. A novel analytical system and method was studied to monitor the acidity

  19. Structural and compositional changes of dissolved organic matter upon solid-phase extraction tracked by multiple analytical tools.

    Science.gov (United States)

    Chen, Meilian; Kim, Sunghwan; Park, Jae-Eun; Jung, Heon-Jae; Hur, Jin

    2016-09-01

    Although PPL-based solid-phase extraction (SPE) has been widely used before dissolved organic matter (DOM) analyses via advanced measurements such as ultrahigh resolution Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR-MS), much is still unknown about the structural and compositional changes in the DOM pool through SPE. In this study, selected DOM from various sources were tested to elucidate the differences between before and after SPE utilizing multiple analytical tools including fluorescence spectroscopy, FT-ICR-MS, and size exclusion chromatography with organic carbon detection (SEC-OCD). Changes in specific UV absorbance indicated a decrease in aromaticity after SPE, suggesting a preferential exclusion of aromatic DOM structures, which was also confirmed by the substantial reduction of fluorescent DOM (FDOM). Furthermore, SEC-OCD results exhibited very low recoveries (1-9%) for the biopolymer fraction, implying that PPL sorbents need to be used cautiously when treating high molecular weight compounds (i.e., polysaccharides, proteins, and amino sugars). A careful examination via FT-ICR-MS revealed that the formulas lost during SPE appear to be DOM source-dependent. Nevertheless, the dominant missing compound groups were identified to be tannins with high O/C ratios (>0.7), lignins/carboxyl-rich alicyclic molecules (CRAM), aliphatics with H/C >1.5, and heteroatomic formulas, all of which were dominated by pseudo-analogous molecular formula families differing by methylene (-CH2) units. Our findings shed new light on potential changes in the compound composition and the molecular weight of DOM upon SPE, implying that precautions are needed for data interpretation. Graphical Abstract: Tracking the characteristics of DOM from various origins upon PPL-based SPE utilizing EEM-PARAFAC, SEC-OCD, and FT-ICR-MS. PMID:27387996
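
    The compound-group assignments quoted in this abstract (tannins at O/C > 0.7, aliphatics at H/C > 1.5) are the kind usually made on van Krevelen coordinates. A simplified sketch of such a classifier follows; the thresholds are taken only from the abstract, not from the paper's full rule set, and real schemes use additional boundaries.

```python
def classify_formula(o_c, h_c):
    """Rough van Krevelen compound-group assignment from elemental ratios.
    Thresholds follow the abstract above; everything else defaults to a
    lignins/CRAM-like bin for illustration."""
    if o_c > 0.7:
        return "tannins-like"
    if h_c > 1.5:
        return "aliphatics"
    return "lignins/CRAM-like"

# Two hypothetical assigned formulas with their O/C and H/C ratios:
for formula, (o_c, h_c) in {"C9H10O5": (0.56, 1.11),
                            "C10H22O": (0.10, 2.20)}.items():
    print(f"{formula}: {classify_formula(o_c, h_c)}")
```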

  20. Oral Development for LSP via Open Source Tools

    Directory of Open Access Journals (Sweden)

    Alejandro Curado Fuentes

    2015-11-01

    Full Text Available For the development of oral abilities in LSP, few computer-based teaching and learning resources have focused intensively on web-based listening and speaking; many more address reading, writing, vocabulary and grammatical activities. Our aim in this paper is to approach oral communication in the online environment of Moodle by striving to make it suitable for a learning project which incorporates oral skills. The paper describes a blended process in which both individual and collaborative learning strategies can be combined and exploited through the implementation of specific tools and resources which may go hand in hand with traditional face-to-face conversational classes. The challenge with this new perspective is, ultimately, to provide effective tools for oral LSP development in an apparently writing-focused medium.

  1. Searching for Sentient Design Tools for Game Development

    DEFF Research Database (Denmark)

    Liapis, Antonios

    Over the last twenty years, computer games have grown from a niche market targeting young adults to an important player in the global economy, engaging millions of people from different cultural backgrounds. As both the number and the size of computer games continue to rise, game companies handle...... increasing demand by expanding their cadre, compressing development cycles and reusing code or assets. To limit development time and reduce the cost of content creation, commercial game engines and procedural content generation are popular shortcuts. Content creation tools are means to either generate......'s own preferences. While the thesis focuses on the design, performance, and human use of Sentient Sketchbook, the same algorithms and concepts can be applied to different mixed-initiative tools, a subset of which has been implemented and is presented in this thesis....

  2. MOOCs as a Professional Development Tool for Librarians

    OpenAIRE

    Meghan Ecclestone

    2013-01-01

    This article explores how reference and instructional librarians taking over new areas of subject responsibility can develop professional expertise using new eLearning tools called MOOCs. MOOCs – Massive Open Online Courses – are a new online learning model that offers free higher education courses to anyone with an Internet connection and a keen interest to learn. As MOOCs proliferate, librarians have the opportunity to leverage this technology to improve their professional skills.

  3. Facilitating management learning: Developing critical reflection through reflective tools

    OpenAIRE

    Gray, David E

    2007-01-01

    The aim of this article is to explore how the practice of critical reflection within a management learning process can be facilitated through the application of reflective processes and tools. A distinction is drawn between reflection as a form of individual development (of, say, the reflective practitioner), and critical reflection as a route to collective action and a component of organizational learning and change. Critical reflection, however, is not a process that comes naturally to many...

  4. MOOCs as a Professional Development Tool for Librarians

    Directory of Open Access Journals (Sweden)

    Meghan Ecclestone

    2013-11-01

    Full Text Available This article explores how reference and instructional librarians taking over new areas of subject responsibility can develop professional expertise using new eLearning tools called MOOCs. MOOCs – Massive Open Online Courses – are a new online learning model that offers free higher education courses to anyone with an Internet connection and a keen interest to learn. As MOOCs proliferate, librarians have the opportunity to leverage this technology to improve their professional skills.

  5. Developing security tools of WSN and WBAN networks applications

    CERN Document Server

    A M El-Bendary, Mohsen

    2015-01-01

    This book focuses on two of the most rapidly developing areas in wireless technology (WT) applications, namely, wireless sensors networks (WSNs) and wireless body area networks (WBANs). These networks can be considered smart applications of the recent WT revolutions. The book presents various security tools and scenarios for the proposed enhanced-security of WSNs, which are supplemented with numerous computer simulations. In the computer simulation section, WSN modeling is addressed using MATLAB programming language.

  6. Collaborative socioeconomic tool development to address management and planning needs

    Science.gov (United States)

    Richardson, Leslie A.; Huber, Christopher; Cullinane Thomas, Catherine; Donovan, Elizabeth; Koontz, Lynne M.

    2014-01-01

    Public lands and resources managed by the National Park Service (NPS) and other land management agencies provide a wide range of social and economic benefits to both nearby local communities and society as a whole, ranging from job creation, to access to unique recreational opportunities, to subsistence and tribal uses of the land. Over the years, there has been an increased need to identify and analyze the socioeconomic effects of the public’s use of NPS lands and resources, and the wide range of NPS land management decisions. This need stems from laws such as the National Environmental Policy Act (NEPA), increased litigation and appeals on NPS management decisions, as well as an overall need to demonstrate how parks benefit communities and the American public. To address these needs, the U.S. Geological Survey (USGS) and NPS have an ongoing partnership to collaboratively develop socioeconomic tools to support planning needs and resource management. This article discusses two such tools. The first, Assessing Socioeconomic Planning Needs (ASPN), was developed to help NPS planners and managers identify key social and economic issues that can arise as a result of land management actions. The second tool, the Visitor Spending Effects (VSE) model, provides a specific example of a type of analysis that may be recommended by ASPN. The remainder of this article discusses the development, main features, and plans for future versions and applications of both ASPN and the VSE.

  7. Support Tools in Formulation Development for Poorly Soluble Drugs.

    Science.gov (United States)

    Fridgeirsdottir, Gudrun A; Harris, Robert; Fischer, Peter M; Roberts, Clive J

    2016-08-01

    The need for solubility enhancement through formulation is a well-known but still problematic issue because of the number of poorly water-soluble drugs in development. There are several possible routes that can be taken to increase the bioavailability of drugs intended for immediate-release oral formulation. The best formulation strategy for any given drug will depend on numerous factors, including required dose, shelf life, manufacturability, and the properties of the active pharmaceutical ingredient (API). Choosing an optimal formulation and manufacturing route for a new API is therefore not a straightforward process. Currently, several approaches are used in the pharmaceutical industry to select the best formulation strategy. These differ in complexity and efficiency, but most try to predict which route will best suit the API based on selected molecular parameters such as molecular weight, lipophilicity (logP), and solubility. The approaches range from trial and error with no formal tools, through small in vitro or in vivo experiments, high-throughput screening, guidance maps and decision trees, to the most complex methods based on computational modelling. This review aims to list available support tools and explain how they are used. PMID:27368122
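
    Guidance maps and decision trees of the kind surveyed here reduce, at their simplest, to threshold rules over molecular parameters. The toy rule below is only a sketch of that idea; the thresholds and route names are invented for illustration and are not from any published tool.

```python
def suggest_formulation_route(logp, solubility_mg_ml, mw):
    """Toy decision rule over molecular weight, lipophilicity (logP) and
    solubility -- illustrative thresholds, not a validated scheme."""
    if solubility_mg_ml > 0.1:
        return "conventional immediate-release formulation"
    if logp > 5 and mw > 500:
        return "lipid-based formulation"
    return "amorphous solid dispersion"

# A hypothetical poorly soluble, highly lipophilic API:
print(suggest_formulation_route(logp=6.2, solubility_mg_ml=0.01, mw=540))
```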

  8. Feasibility assessment tool for urban anaerobic digestion in developing countries.

    Science.gov (United States)

    Lohri, Christian Riuji; Rodić, Ljiljana; Zurbrügg, Christian

    2013-09-15

    This paper describes a method developed to support feasibility assessments of urban anaerobic digestion (AD). The method not only uses technical assessment criteria but takes a broader sustainability perspective, integrating technical-operational, environmental, financial-economic, socio-cultural, institutional, and policy and legal criteria into the assessment tool. Use of the tool can support decision-makers in selecting the most suitable set-up for a given context. The tool consists of a comprehensive set of questions, structured along four distinct yet interrelated dimensions of sustainability factors, which all influence the success of any urban AD project. Each dimension answers a specific question: I) WHY? What are the driving forces and motivations behind the initiation of the AD project? II) WHO? Who are the stakeholders and what are their roles, power, interests and means of intervention? III) WHAT? What are the physical components of the proposed AD chain and the respective mass and resource flows? IV) HOW? What are the key features of the enabling or disabling environment (sustainability aspects) affecting the proposed AD system? Disruptive conditions within these four dimensions are detected. Multi Criteria Decision Analysis is used to guide the process of translating the answers from six sustainability categories into scores, combining them with the relative importance (weights) attributed by the stakeholders. Risk assessment further evaluates the probability that certain aspects develop differently than originally planned and assesses the data reliability (uncertainty factors). The use of the tool is demonstrated with its application in a case study for Bahir Dar in Ethiopia. PMID:23722149
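
    The MCDA step described above — translating answers from six sustainability categories into scores and combining them with stakeholder weights — commonly reduces to a weighted sum. A schematic sketch, with category names, scores and weights invented for illustration:

```python
# Weighted-sum MCDA over six sustainability categories (hypothetical data).
scores  = {"technical": 7, "environmental": 8, "financial": 5,
           "socio-cultural": 6, "institutional": 4, "policy-legal": 5}
weights = {"technical": 0.25, "environmental": 0.20, "financial": 0.20,
           "socio-cultural": 0.15, "institutional": 0.10, "policy-legal": 0.10}

assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to one
overall = sum(scores[c] * weights[c] for c in scores)
print(f"Overall feasibility score: {overall:.2f} / 10")
```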

  9. VisTool: A user interface and visualization development system

    DEFF Research Database (Denmark)

    Xu, Shangjin

    programming. However, in Software Engineering, software engineers who develop user interfaces do not follow it. In many cases, it is desirable to use graphical presentations, because a graphical presentation gives a better overview than text forms, and can improve task efficiency and user satisfaction....... However, it is more difficult to follow the classical usability approach for graphical presentation development. These difficulties result from the fact that designers cannot implement user interface with interactions and real data. We developed VisTool – a user interface and visualization development...... that compute appearance values, access records from the database, etc. This is a new way of development different from programming. So the designer does not program an object-relational mapping layer, which requires in-depth knowledge about programming and database. He directly maps relational data to user...

  10. Developing Tools and Technologies to Meet MSR Planetary Protection Requirements

    Science.gov (United States)

    Lin, Ying

    2013-01-01

    This paper describes the tools and technologies that need to be developed for a Caching Rover mission in order to meet the overall Planetary Protection requirements for future Mars Sample Return (MSR) campaign. This is the result of an eight-month study sponsored by the Mars Exploration Program Office. The goal of this study is to provide a future MSR project with a focused technology development plan for achieving the necessary planetary protection and sample integrity capabilities for a Mars Caching Rover mission.

  11. Experimental design and multiple response optimization. Using the desirability function in analytical methods development.

    Science.gov (United States)

    Candioti, Luciana Vera; De Zan, María M; Cámara, María S; Goicoechea, Héctor C

    2014-06-01

    A review about the application of response surface methodology (RSM) when several responses have to be simultaneously optimized in the field of analytical methods development is presented. Several critical issues like response transformation, multiple response optimization and modeling with least squares and artificial neural networks are discussed. Most recent analytical applications are presented in the context of analytical methods development, especially in multiple response optimization procedures using the desirability function.
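
    The desirability function named in this abstract (in the Derringer–Suich sense) maps each response onto [0, 1] and combines the individual desirabilities by a geometric mean. A minimal sketch for "larger-is-better" responses, with illustrative numbers:

```python
import numpy as np

def desirability_max(y, low, high, s=1.0):
    """Derringer-Suich 'larger-is-better' desirability: 0 below `low`,
    1 above `high`, with exponent s shaping the ramp in between."""
    d = (float(y) - low) / (high - low)
    return min(max(d, 0.0), 1.0) ** s

def overall_desirability(*ds):
    """Geometric mean of the individual desirabilities."""
    return float(np.prod(ds) ** (1.0 / len(ds)))

# Two responses at one design point, e.g. resolution and peak area:
d1 = desirability_max(1.8, low=1.0, high=2.0)     # -> 0.8
d2 = desirability_max(9e4, low=5e4, high=1e5)     # -> 0.8
print(f"D = {overall_desirability(d1, d2):.2f}")  # -> 0.80
```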

  12. Developing an Analytical Framework: Incorporating Ecosystem Services into Decision Making - Proceedings of a Workshop

    Science.gov (United States)

    Hogan, Dianna; Arthaud, Greg; Pattison, Malka; Sayre, Roger G.; Shapiro, Carl

    2010-01-01

    The analytical framework for understanding ecosystem services in conservation, resource management, and development decisions is multidisciplinary, encompassing a combination of the natural and social sciences. This report summarizes a workshop on 'Developing an Analytical Framework: Incorporating Ecosystem Services into Decision Making,' which focused on the analytical process and on identifying research priorities for assessing ecosystem services, their production and use, their spatial and temporal characteristics, their relationship with natural systems, and their interdependencies. Attendees discussed research directions and solutions to key challenges in developing the analytical framework. The discussion was divided into two sessions: (1) the measurement framework: quantities and values, and (2) the spatial framework: mapping and spatial relationships. This workshop was the second of three preconference workshops associated with ACES 2008 (A Conference on Ecosystem Services): Using Science for Decision Making in Dynamic Systems. These three workshops were designed to explore the ACES 2008 theme on decision making and how the concept of ecosystem services can be more effectively incorporated into conservation, restoration, resource management, and development decisions. Preconference workshop 1, 'Developing a Vision: Incorporating Ecosystem Services into Decision Making,' was held on April 15, 2008, in Cambridge, MA. In preconference workshop 1, participants addressed what would have to happen to make ecosystem services be used more routinely and effectively in conservation, restoration, resource management, and development decisions, and they identified some key challenges in developing the analytical framework. Preconference workshop 3, 'Developing an Institutional Framework: Incorporating Ecosystem Services into Decision Making,' was held on October 30, 2008, in Albuquerque, NM; participants examined the relationship between the institutional framework and

  13. Development of a burn prevention teaching tool for Amish children.

    Science.gov (United States)

    Rieman, Mary T; Kagan, Richard J

    2012-01-01

    Although there are inherent risks for burn injury associated with the Amish lifestyle, burn prevention is not taught in Amish schools. The purpose of this study was to develop a burn prevention teaching tool for Amish children. An anonymous parental survey was designed to explore the content and acceptability of a teaching tool within an Old Order Amish community. After institutional review board approval, the Amish teacher distributed surveys to 16 families of the 30 children attending the one-room school. Fourteen (88%) of the families responded to identify these burn risks in and around their homes, barns, and shops: lighters, wood and coal stoves, kerosene heaters, gasoline-powered engines, and hot liquids used for canning, butchering, mopping, washing clothes, and making lye soap. All respondents were in favor of teaching familiar safety precautions, fire escape plans, burn first aid, and emergency care to the children. There was some minor objection to more modern devices such as bath tub thermometers (25%), fire extinguishers (19%), and smoke detectors (6%). The teacher was interested in a magnetic teaching board depicting Amish children and typical objects in their home environment. Movable pieces could afford the opportunity to identify hazards and to rearrange them for a safer situation. This survey served to introduce burn prevention to one Amish community and to develop an appropriate teaching tool for the school. It is anticipated that community participation would support its acceptance and eventual utilization within this tenaciously traditional culture.

  14. Developments in the tools and methodologies of synthetic biology

    Directory of Open Access Journals (Sweden)

    Richard eKelwick

    2014-11-01

    Full Text Available Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices or systems. However, biological systems are generally complex and unpredictable and are therefore intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a ‘body of knowledge’ from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled and its functionality tested. At each stage of the design cycle an expanding repertoire of tools is being developed. In this review we highlight several of these tools in terms of their applications and benefits to the synthetic biology community.

  15. CRAB: the CMS distributed analysis tool development and design

    Energy Technology Data Exchange (ETDEWEB)

    Spiga, D. [University and INFN Perugia (Italy); Lacaprara, S. [INFN Legnaro (Italy); Bacchi, W. [University and INFN Bologna (Italy); Cinquilli, M. [University and INFN Perugia (Italy); Codispoti, G. [University and INFN Bologna (Italy); Corvo, M. [CERN (Switzerland); Dorigo, A. [INFN Padova (Italy); Fanfani, A. [University and INFN Bologna (Italy); Fanzago, F. [CERN (Switzerland); Farina, F. [INFN Milano-Bicocca (Italy); Gutsche, O. [FNAL (United States); Kavka, C. [INFN Trieste (Italy); Merlo, M. [INFN Milano-Bicocca (Italy); Servoli, L. [University and INFN Perugia (Italy)

    2008-03-15

    Starting from 2007 the CMS experiment will produce several Pbytes of data each year, to be distributed over many computing centers located in many different countries. The CMS computing model defines how the data are to be distributed such that CMS physicists can access them in an efficient manner in order to perform their physics analysis. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that facilitates access to the distributed data in a very transparent way. The tool's main feature is the possibility of distributing and parallelizing the local CMS batch data analysis processes over different Grid environments without any specific knowledge of the underlying computational infrastructures. More specifically CRAB allows the transparent usage of WLCG, gLite and OSG middleware. CRAB interacts with both the local user environment, with CMS Data Management services and with the Grid middleware.

  16. CRAB: the CMS distributed analysis tool development and design

    CERN Document Server

    Spiga, D; Bacchi, W; Cinquilli, M; Codispoti, G; Corvo, M; Dorigo, A; Fanfani, A; Fanzago, F; Farina, F; Gutsche, O; Kavka, C; Merlo, M; Servoli, L

    2008-01-01

    Starting from 2007 the CMS experiment will produce several Pbytes of data each year, to be distributed over many computing centers located in many different countries. The CMS computing model defines how the data are to be distributed such that CMS physicists can access them in an efficient manner in order to perform their physics analysis. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that facilitates access to the distributed data in a very transparent way. The tool's main feature is the possibility of distributing and parallelizing the local CMS batch data analysis processes over different Grid environments without any specific knowledge of the underlying computational infrastructures. More specifically CRAB allows the transparent usage of WLCG, gLite and OSG middleware. CRAB interacts with both the local user environment, with CMS Data Management services and with the Grid middleware.

  17. Continued development of modeling tools and theory for RF heating

    International Nuclear Information System (INIS)

    Mission Research Corporation (MRC) is pleased to present the Department of Energy (DOE) with its renewal proposal to the Continued Development of Modeling Tools and Theory for RF Heating program. The objective of the program is to continue and extend the earlier work done by the proposed principal investigator in the field of modeling radio frequency (RF) heating experiments in the large tokamak fusion experiments, particularly the Tokamak Fusion Test Reactor (TFTR) device located at Princeton Plasma Physics Laboratory (PPPL). An integral part of this work is the investigation and, in some cases, resolution of theoretical issues which pertain to accurate modeling. MRC is nearing the successful completion of the specified tasks of the Continued Development of Modeling Tools and Theory for RF Heating project. The following tasks are either completed or nearing completion: (1) anisotropic temperature and rotation upgrades; (2) modeling for relativistic ECRH; (3) further documentation of SHOOT and SPRUCE. As a result of the progress achieved under this project, MRC has been urged to continue this effort. Specifically, during the performance of this project two topics were identified by PPPL personnel as new applications of the existing RF modeling tools. These two topics concern (a) future fast-wave current drive experiments on the large tokamaks including TFTR and (b) the interpretation of existing and future RF probe data from TFTR. To address each of these topics requires some modification or enhancement of the existing modeling tools, and the first topic requires resolution of certain theoretical issues to produce self-consistent results. This work falls within the scope of the original project and is more suited to the project's renewal than to the initiation of a new project

  18. Polar Bears or People?: How Framing Can Provide a Useful Analytic Tool to Understand & Improve Climate Change Communication in Classrooms

    Science.gov (United States)

    Busch, K. C.

    2014-12-01

    Not only will young adults bear the brunt of climate change's effects, they are also the ones who will be required to take action - to mitigate and to adapt. The Next Generation Science Standards include climate change, ensuring the topic will be covered in U.S. science classrooms in the near future. Additionally, school is a primary source of information about climate change for young adults. The larger question, though, is how can the teaching of climate change be done in such a way as to ascribe agency - a willingness to act - to students? Framing - as both a theory and an analytic method - has been used to understand how language in the media can affect the audience's intention to act. Frames function as a two-way filter, affecting both the message sent and the message received. This study adapted both the theory and the analytic methods of framing, applying them to teachers in the classroom to answer the research question: How do teachers frame climate change in the classroom? To answer this question, twenty-five lessons from seven teachers were analyzed using semiotic discourse analysis methods. It was found that the teachers' frames overlapped to form two distinct discourses: a Science Discourse and a Social Discourse. The Science Discourse, which was dominant, can be summarized as: Climate change is a current scientific problem that will have profound global effects on the Earth's physical systems. The Social Discourse, used much less often, can be summarized as: Climate change is a future social issue because it will have negative impacts at the local level on people. While it may not be surprising that the Science Discourse was most often heard in these science classrooms, it is possibly problematic if it were the only discourse used. The research literature on framing indicates that the frames found in the Science Discourse - global scale, scientific statistics and facts, and impact on the Earth's systems - are not likely to inspire action-taking. This

  19. Web-based information systems development and dynamic organisational change: the need for emergent development tools

    OpenAIRE

    Ramrattan, M; Patel, NV

    2009-01-01

    This paper considers contextual issues relating to the problem of developing web-based information systems in and for emergent organisations. It postulates that the methods available suffer because of sudden and unexpected changing characteristics within the organisation. The Theory of Deferred Action is used as the basis for the development of an emergent development tool. Many tools for managing change in a continuously changing organisation are susceptible to inadequacy. The in...

  20. Development of analytical methodologies to assess recalcitrant pesticide bioremediation in biobeds at laboratory scale.

    Science.gov (United States)

    Rivero, Anisleidy; Niell, Silvina; Cerdeiras, M Pía; Heinzen, Horacio; Cesio, María Verónica

    2016-06-01

    To assess recalcitrant pesticide bioremediation it is necessary to gradually increase the complexity of the biological system used in order to design an effective biobed assembly. Each step towards this effective biobed design needs a suitable, validated analytical methodology that allows a correct evaluation of the dissipation and bioconversion. Methods with low recoveries could give a false impression of a successful biodegradation process. To address this situation, different methods were developed and validated for the simultaneous determination of endosulfan, its three main metabolites, and chlorpyrifos in increasingly complex matrices where the bioconvertor basidiomycete Abortiporus biennis could grow. The matrices were culture media, bran, and finally a laboratory biomix composed of bran, peat and soil. The methodology for the analysis of the first evaluated matrix has already been reported. The methodologies developed for the other two systems are presented in this work. The targeted analytes were extracted from fungi growing over bran in semisolid YNB (Yeast Nitrogen Base) medium with acetonitrile using shaker-assisted extraction; the salting-out step was performed with MgSO4 and NaCl, and the extracts were analyzed by GC-ECD. The best methodology was fully validated for all the evaluated analytes at 1 and 25 mg kg(-1), yielding recoveries between 72% and 109% and RSDs […] pesticides, the next step faced was the development and validation of an analytical procedure to evaluate the analytes in a laboratory-scale biobed composed of 50% bran, 25% peat and 25% soil together with fungal mycelium. Of the different procedures assayed, only ultrasound-assisted extraction with ethyl acetate allowed recoveries between 80% and 110% with RSDs <18%. Linearity, recovery, precision, matrix effect and LODs/LOQs of each method were studied for all the analytes: endosulfan isomers (α & β) and its metabolites (endosulfan sulfate, ether and diol) as well as for chlorpyrifos. In
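
    Recovery and RSD, the two validation figures of merit quoted above, are computed from replicate spiked samples in a standard way; a small sketch with made-up replicate data:

```python
import statistics

def recovery_pct(measured_mean, spiked_level):
    """Mean measured concentration as a percentage of the spiked level."""
    return 100.0 * measured_mean / spiked_level

def rsd_pct(replicates):
    """Relative standard deviation of replicate measurements, in percent."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Five hypothetical replicate extractions of a 1 mg/kg endosulfan spike:
replicates = [0.91, 0.88, 0.95, 0.84, 0.90]
print(f"Recovery: {recovery_pct(statistics.mean(replicates), 1.0):.0f}%")  # ~90%
print(f"RSD:      {rsd_pct(replicates):.1f}%")                             # ~4.5%
```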

  1. Demonstration of Decision Support Tools for Sustainable Development

    Energy Technology Data Exchange (ETDEWEB)

    Shropshire, David Earl; Jacobson, Jacob Jordan; Berrett, Sharon; Cobb, D. A.; Worhach, P.

    2000-11-01

    The Demonstration of Decision Support Tools for Sustainable Development project integrated the Bechtel/Nexant Industrial Materials Exchange Planner and the Idaho National Engineering and Environmental Laboratory System Dynamic models, demonstrating their capabilities on alternative fuel applications in the Greater Yellowstone-Teton Park system. The combined model, called the Dynamic Industrial Material Exchange, was used on selected test cases in the Greater Yellowstone-Teton Parks region to evaluate economic, environmental, and social implications of alternative fuel applications, and to identify primary and secondary industries. The test cases included looking at compressed natural gas applications in Teton National Park and Jackson, Wyoming, and studying ethanol use in Yellowstone National Park and gateway cities in Montana. With further development, the system could be used to assist decision-makers (local government, planners, vehicle purchasers, and fuel suppliers) in selecting alternative fuels, vehicles, and developing AF infrastructures. The system could become a regional AF market assessment tool that could help decision-makers understand the behavior of the AF market and conditions in which the market would grow. Based on this high level market assessment, investors and decision-makers would become more knowledgeable of the AF market opportunity before developing detailed plans and preparing financial analysis.

  2. Assessment of COTS IR image simulation tools for ATR development

    Science.gov (United States)

    Seidel, Heiko; Stahl, Christoph; Bjerkeli, Frode; Skaaren-Fystro, Paal

    2005-05-01

    Following the tendency of increased use of imaging sensors in military aircraft, future fighter pilots will need onboard artificial intelligence e.g. ATR for aiding them in image interpretation and target designation. The European Aeronautic Defence and Space Company (EADS) in Germany has developed an advanced method for automatic target recognition (ATR) which is based on adaptive neural networks. This ATR method can assist the crew of military aircraft like the Eurofighter in sensor image monitoring and thereby reduce the workload in the cockpit and increase the mission efficiency. The EADS ATR approach can be adapted for imagery of visual, infrared and SAR sensors because of the training-based classifiers of the ATR method. For the optimal adaptation of these classifiers they have to be trained with appropriate and sufficient image data. The training images must show the target objects from different aspect angles, ranges, environmental conditions, etc. Incomplete training sets lead to a degradation of classifier performance. Additionally, ground truth information i.e. scenario conditions like class type and position of targets is necessary for the optimal adaptation of the ATR method. In Summer 2003, EADS started a cooperation with Kongsberg Defence & Aerospace (KDA) from Norway. The EADS/KDA approach is to provide additional image data sets for training-based ATR through IR image simulation. The joint study aims to investigate the benefits of enhancing incomplete training sets for classifier adaptation by simulated synthetic imagery. EADS/KDA identified the requirements of a commercial-off-the-shelf IR simulation tool capable of delivering appropriate synthetic imagery for ATR development. A market study of available IR simulation tools and suppliers was performed. After that the most promising tool was benchmarked according to several criteria e.g. thermal emission model, sensor model, targets model, non-radiometric image features etc., resulting in a

  3. Evaluation of Cross-Platform Mobile Development ToolsDevelopment of an Evaluation Framework

    OpenAIRE

    Öberg, Linus

    2016-01-01

    The aim of this thesis is to determine which cross-platform mobile development tool is best suited for Vitec and their mobile application "Teknisk Förvaltning". More importantly, this thesis develops a generic evaluation framework for assessing cross-platform mobile development tools, with the purpose of making it easy to select the most appropriate tool for a specific mobile application. This was achieved by first, in consideration with Vitec, selecting Cordova + Ionic and X...

  4. Combining Multiple Measures of Students' Opportunities to Develop Analytic, Text-Based Writing Skills

    Science.gov (United States)

    Correnti, Richard; Matsumura, Lindsay Clare; Hamilton, Laura S.; Wang, Elaine

    2012-01-01

    Guided by evidence that teachers contribute to student achievement outcomes, researchers have been reexamining how to study instruction and the classroom opportunities teachers create for students. We describe our experience measuring students' opportunities to develop analytic, text-based writing skills. Utilizing multiple methods of data…

  5. Analytical approach to developing the transport threshold models of neoclassical tearing modes in tokamaks

    International Nuclear Information System (INIS)

    Analytical solutions of the stationary conduction equation are obtained. The solutions are used for developing the transport threshold models (TTM) of the neoclassical tearing modes (NTM) in tokamaks. The following TTM are considered: collisional, convective, inertial and rotational. These TTM may be the fragments of the more general models of NTM

  6. Development of an Analytical System for Rapid, Remote Determining Concentration and Valence of Uranium and Plutonium

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Concentrations and valence of U and Pu directly show whether the Purex process is under normal conditions or not. It is necessary to monitor concentrations and valence of U and Pu in real time. The purpose of this work is to develop an analytical

  7. Development of Distributed Cache Strategy for Analytic Cluster in an Internet of Things System

    OpenAIRE

    Yang ZHOU

    2016-01-01

    This thesis discusses the development of a distributed cache strategy for an analytic cluster in an IoT system. LRU and proactive caching, together with essential distributed-system concepts, are discussed, as are approaches to performance optimization and to distributing nodes and data in the IoT system. In the IoT system, the cluster for data analysis involves large volumes of data, and specific processes such as stream processing raise a need ...
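
    An LRU cache of the kind named in this abstract evicts the entry that has gone untouched the longest once capacity is exceeded, and can be sketched with an ordered map. A minimal illustration, not the thesis implementation:

```python
from collections import OrderedDict

class LRUCache:
    """Least-recently-used cache: evicts the entry untouched the longest."""
    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)         # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict least recently used

cache = LRUCache(2)
cache.put("sensor-1", 21.5)
cache.put("sensor-2", 19.8)
cache.get("sensor-1")                       # touch sensor-1
cache.put("sensor-3", 22.1)                 # evicts sensor-2
print(cache.get("sensor-2"))                # -> None
```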

  8. Selenium isotope studies in plants. Development and validation of a novel geochemical tool and its application to organic samples

    Energy Technology Data Exchange (ETDEWEB)

    Banning, Helena

    2016-03-12

    Selenium (Se), being both an essential nutrient and a toxin, enters the food chain mainly via plants. Selenium isotope signatures have proved to be an excellent redox tracer, making them a promising tool for exploring the Se cycle in plants. The analytical method is sensitive to organic sample matrices and requires dedicated preparation methods, which were developed and validated in this study. Plant cultivation setups revealed the applicability of these methods for tracing plant-internal processes.

  9. Analytical quality assurance procedures developed for the IAEA's Reference Asian Man Project (Phase 2)

    International Nuclear Information System (INIS)

    Analytical quality assurance procedures adopted for use in the IAEA Co-ordinated Research Project on Ingestion and Organ Content of Trace Elements of Importance in Radiological Protection are designed to ensure comparability of the analytical results for Cs, I, Sr, Th, U and other elements in human tissues and diets collected and analysed in nine participating countries. The main analytical techniques are NAA and ICP-MS. For sample preparation, all participants are using identical food blenders which have been centrally supplied after testing for contamination. For quality control of the analyses, six NIST SRMs covering a range of matrices with certified and reference values for the elements of interest have been distributed. A new Japanese reference diet material has also been developed. These quality assurance procedures are summarized here and new data are presented for Cs, I, Sr, Th and U in the NIST SRMs. (author)
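
    Quality control against certified reference materials of the kind described here is often summarized with a z-score; a hedged sketch with hypothetical numbers (the certified value and uncertainty below are invented, not NIST data):

```python
def z_score(measured, certified, uncertainty):
    """Standard QC score: |z| <= 2 is conventionally satisfactory,
    |z| >= 3 unsatisfactory."""
    return (measured - certified) / uncertainty

# Hypothetical Cs result for an SRM certified at 1.20 +/- 0.05 mg/kg:
z = z_score(measured=1.28, certified=1.20, uncertainty=0.05)
print(f"z = {z:.1f} -> {'OK' if abs(z) <= 2 else 'investigate'}")  # z = 1.6
```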

  10. An Analytic Approach to Developing Transport Threshold Models of Neoclassical Tearing Modes in Tokamaks

    International Nuclear Information System (INIS)

    Transport threshold models of neoclassical tearing modes in tokamaks are investigated analytically. An analysis is made of the competition between strong transverse heat transport, on the one hand, and longitudinal heat transport, longitudinal heat convection, longitudinal inertial transport, and rotational transport, on the other hand, which leads to the establishment of the perturbed temperature profile in magnetic islands. It is shown that, in all these cases, the temperature profile can be found analytically by using rigorous solutions to the heat conduction equation in the near and far regions of a chain of magnetic islands and then by matching these solutions. Analytic expressions for the temperature profile are used to calculate the contribution of the bootstrap current to the generalized Rutherford equation for the island width evolution with the aim of constructing particular transport threshold models of neoclassical tearing modes. Four transport threshold models, differing in the underlying competing mechanisms, are analyzed: collisional, convective, inertial, and rotational models. The collisional model constructed analytically is shown to coincide exactly with that calculated numerically; the reason is that the analytical temperature profile turns out to be the same as the numerical profile. The results obtained can be useful in developing the next generation of general threshold models. The first steps toward such models have already been made
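
    The generalized Rutherford equation referred to above is commonly written, in schematic form, as a balance between the classical tearing drive and a bootstrap term with a transport-threshold island width. A hedged sketch of that typical structure follows; the coefficients and the exact threshold form vary between the four models and are not taken from this paper:

```latex
% Schematic modified Rutherford equation for island width w(t):
%   tau_R : resistive diffusion time,   r_s : rational-surface radius,
%   Delta': classical tearing stability index,
%   beta_p: poloidal beta,              w_d : transport threshold width,
%   L_q, L_p: magnetic-shear and pressure-gradient scale lengths,
%   c_bs  : order-unity bootstrap coefficient (model dependent).
\frac{\tau_R}{r_s}\frac{dw}{dt}
  \;=\; r_s\,\Delta'
  \;+\; c_{\mathrm{bs}}\,\sqrt{\varepsilon}\,\beta_p\,
        \frac{L_q}{L_p}\,\frac{r_s\,w}{\,w^2 + w_d^2\,}
```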

  11. NEEMO 20: Science Training, Operations, and Tool Development

    Science.gov (United States)

    Graff, T.; Miller, M.; Rodriguez-Lanetty, M.; Chappell, S.; Naids, A.; Hood, A.; Coan, D.; Abell, P.; Reagan, M.; Janoiko, B.

    2016-01-01

    The 20th mission of the National Aeronautics and Space Administration (NASA) Extreme Environment Mission Operations (NEEMO) was a highly integrated evaluation of operational protocols and tools designed to enable future exploration beyond low-Earth orbit. NEEMO 20 was conducted from the Aquarius habitat off the coast of Key Largo, FL in July 2015. The habitat and its surroundings provide a convincing analog for space exploration. A crew of six (comprised of astronauts, engineers, and habitat technicians) lived and worked in and around the unique underwater laboratory over a mission duration of 14 days. Incorporated into NEEMO 20 was a diverse Science Team (ST) comprised of geoscientists from the Astromaterials Research and Exploration Science (ARES/XI) Division of the Johnson Space Center (JSC), as well as marine scientists from the Department of Biological Sciences at Florida International University (FIU). This team trained the crew on the science to be conducted, defined sampling techniques and operational procedures, and planned and coordinated the science-focused Extra Vehicular Activities (EVAs). The primary science objective of NEEMO 20 was to study planetary sampling techniques and tools in partial gravity environments under realistic mission communication time delays and operational pressures. To address this objective, two types of science sites were employed: 1) geoscience sites with available rocks and regolith for testing sampling procedures and tools, and 2) marine science sites dedicated to specific research focused on assessing the photosynthetic capability of corals and their genetic connectivity between deep and shallow reefs. These marine sites and associated research objectives included deployment of handheld instrumentation, context descriptions, imaging, and sampling, and thus acted as a suitable proxy for planetary surface exploration activities. This abstract briefly summarizes the scientific training, scientific operations, and tool

  12. Development and Evaluation of a Riparian Buffer Mapping Tool

    Science.gov (United States)

    Milheim, Lesley E.; Claggett, Peter R.

    2008-01-01

    Land use and land cover within riparian areas greatly affect the conditions of adjacent water features. In particular, riparian forests provide many environmental benefits, including nutrient uptake, bank stabilization, stream shading, sediment trapping, aquatic and terrestrial habitat, and stream organic matter. In contrast, residential and commercial development and associated transportation infrastructure increase pollutant and nutrient loading and change the hydrologic characteristics of the landscape, thereby affecting both water quality and habitat. Restoring riparian areas is a popular and cost-effective restoration technique to improve and protect water quality. Recognizing this, the Chesapeake Executive Council committed to restoring 10,000 miles of riparian forest buffers throughout the Chesapeake Bay watershed by the year 2010. In 2006, the Chesapeake Executive Council further committed to 'using the best available...tools to identify areas where retention and expansion of forests is most needed to protect water quality'. The Chesapeake Bay watershed encompasses 64,000 square miles, including portions of six States and Washington, D.C. Therefore, the interpretation of remotely sensed imagery provides the only effective technique for comprehensively evaluating riparian forest protection and restoration opportunities throughout the watershed. Although 30-meter-resolution land use and land cover data have proved useful on a regional scale, they have not been equally successful at providing the detail required for local-scale assessment of riparian area characteristics. Use of high-resolution imagery (HRI) provides sufficient detail for local-scale assessments, although at greater cost owing to the cost of the imagery and the skill and time required to process the data. To facilitate the use of HRI for monitoring the extent of riparian forest buffers, the U.S. Forest Service and the U.S. Geological Survey Eastern Geographic Science Center funded the
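
    The core geoprocessing step behind a riparian buffer mapping tool — buffering a stream centerline and overlaying land cover — can be sketched in a few lines with Shapely. A toy illustration with invented geometries, not the USGS/Forest Service tool itself:

```python
from shapely.geometry import LineString, box

# Buffer a stream centerline by 30 m (both banks) and measure how much
# of the buffer falls inside a hypothetical forest patch.
stream = LineString([(0, 0), (500, 200), (900, 250)])  # coords in meters
riparian_zone = stream.buffer(30)                      # 30 m buffer
forest = box(100, -50, 600, 300)                       # forest polygon
forested_share = riparian_zone.intersection(forest).area / riparian_zone.area
print(f"Riparian zone forested: {forested_share:.0%}")
```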

  13. Effective Management Tools in Implementing Operational Programme Administrative Capacity Development

    Directory of Open Access Journals (Sweden)

    Carmen – Elena DOBROTĂ

    2015-12-01

    Full Text Available Public administration in Romania and the administrative capacity of the central and local government have undergone significant progress since 2007. The development of administrative capacity deals with a set of structural and process changes that allow governments to improve the formulation and implementation of policies in order to achieve enhanced results. Identifying, developing and using management tools for the proper implementation of an operational programme dedicated to consolidating a performing public administration was a challenging task, taking into account the types of interventions within Operational Programme Administrative Capacity Development 2007 – 2013 and the continuous changes in the economic and social environment in Romania and Europe. The aim of this article is to provide a short description of the approach used by the Managing Authority for OPACD within the performance management of the structural funds in Romania between 2008 and 2014. The paper offers a broad image of the way in which evaluations (ad-hoc, intermediate and performance) were used in different stages of OP implementation as a tool of management.

  14. Development of Desktop Computing Applications and Engineering Tools on GPUs

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik Brandenborg; Glimberg, Stefan Lemvig; Hansen, Toke Jansen;

    GPULab - a competence center and laboratory for research and collaboration within academia and partners in industry - was established in 2008 at the Section for Scientific Computing, DTU Informatics, Technical University of Denmark. In GPULab we focus on the utilization of Graphics Processing Units...... (GPUs) for high-performance computing applications and software tools in science and engineering, inverse problems, visualization, imaging, dynamic optimization. The goals are to contribute to the development of new state-of-the-art mathematical models and algorithms for maximum throughput performance...

  15. Development of a biogas planning tool for project owners

    DEFF Research Database (Denmark)

    Fredenslund, Anders Michael; Kjær, Tyge

    A spreadsheet model was developed, which can be used as a tool in the initial phases of planning a centralized biogas plant in Denmark. The model assesses energy production, total plant costs, operational costs and revenues, and the effect on greenhouse gas emissions. Two energy utilization alternatives are considered: combined heat and power and natural gas grid injection. The main input to the model is the amount and types of substrates available for anaerobic digestion. By substituting the model's default values with more project-specific information, the model can be used in a biogas project's later phases
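
    A back-of-the-envelope version of the energy-production step such a planning model automates; the substrate amounts and methane yield factors below are illustrative placeholders, not the model's defaults:

```python
# Annual methane and energy estimate from substrate inputs.
substrates_t_per_year = {"cattle manure": 20000, "food waste": 5000}
ch4_yield_m3_per_t    = {"cattle manure": 14.0,  "food waste": 60.0}  # assumed
LHV_CH4_KWH_PER_M3 = 9.97  # lower heating value of methane, ~9.97 kWh/m3

ch4_m3 = sum(substrates_t_per_year[s] * ch4_yield_m3_per_t[s]
             for s in substrates_t_per_year)
energy_gwh = ch4_m3 * LHV_CH4_KWH_PER_M3 / 1e6
print(f"Methane: {ch4_m3:,.0f} m3/yr -> {energy_gwh:.1f} GWh/yr")
```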

  16. Recent developments in analytical techniques for characterization of ultra pure materials—An overview

    Indian Academy of Sciences (India)

    V Balaram

    2005-07-01

    With the continual decrease of geometries used in modern IC devices, the trace metal impurities of process materials and chemicals used in their manufacture are moving to increasingly lower levels, i.e. ng/g and pg/g levels. An attempt is made to give a brief overview of the use of different analytical techniques in the analysis of trace metal impurities in ultrapure materials, such as high-purity tellurium (7N), high-purity quartz, high-purity copper (6N), and high-purity water and mineral acids. In recent times mass spectrometric techniques such as ICP–MS, GD–MS and HR–ICP–MS, with their characteristic high sensitivity and low interference effects, have proved to be extremely useful in this field. A few examples of such application studies using these techniques are outlined. The usefulness of other analytical techniques such as F–AAS, GF–AAS, XRF, ICP–AES and INAA is also described. Specific advantages of ICP–MS and HR–ICP–MS such as high sensitivity, limited interference effects, element coverage and speed make them powerful analytical tools for the characterization of ultrapure materials in the future.

  17. Electrospray ionization and matrix assisted laser desorption/ionization mass spectrometry: powerful analytical tools in recombinant protein chemistry

    DEFF Research Database (Denmark)

    Andersen, Jens S.; Svensson, B; Roepstorff, P

    1996-01-01

    Electrospray ionization and matrix assisted laser desorption/ionization are effective ionization methods for mass spectrometry of biomolecules. Here we describe the capabilities of these methods for peptide and protein characterization in biotechnology. An integrated analytical strategy is presented...

  18. Development of analytical methods for multiplex bio-assay with inductively coupled plasma mass spectrometry

    OpenAIRE

    Ornatsky, Olga I.; Kinach, Robert; Bandura, Dmitry R.; Lou, Xudong; Tanner, Scott D; Baranov, Vladimir I.; Nitz, Mark; Mitchell A. Winnik

    2008-01-01

    Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. Use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for...

  19. Examination of the capability of the laser-induced breakdown spectroscopy (LIBS) technique as the emerging laser-based analytical tool for analyzing trace elements in coal

    Science.gov (United States)

    Idris, N.; Ramli, M.; Mahidin, Hedwig, R.; Lie, Z. S.; Kurniawan, K. H.

    2014-09-01

    Due to its advantages over conventional analytical tools, the laser-induced breakdown spectroscopy (LIBS) technique is becoming an emerging analytical tool and is expected to play a major role in future analysis. The technique is based on the use of optical emission from a laser-induced plasma for spectrochemical analysis of the constituents and content of the sampled object. The capability of this technique is examined on the analysis of trace elements in coal samples. Coal is a difficult sample to analyze due to its complex chemical composition and physical properties. Coal inherently contains trace elements including heavy metals; thus its mining, beneficiation and utilization pose hazards to the environment and to human beings. The LIBS apparatus used was composed of a laser system (Nd-YAG: Quanta Ray; LAB SERIES; 1,064 nm; 500 mJ; 8 ns) and an optical detector (McPherson model 2061; 1,000 mm focal length; f/8.6 Czerny-Turner) equipped with an Andor iStar intensified CCD of 1024×256 pixels. The emitted laser was focused onto the coal sample with a focusing lens of +250 mm. The plasma emission was collected by a fiber optic and sent to the spectrograph. The coal samples were taken from the Province of Aceh. As a result, several trace elements including heavy metals (As, Mn, Pb) could clearly be observed, demonstrating the potential of the LIBS technique for analyzing trace elements in coal.

  20. Analytical approaches for assaying metallodrugs in biological samples: recent methodological developments and future trends.

    Science.gov (United States)

    Timerbaev, Andrei; Sturup, Stefan

    2012-03-01

    Contemporary medicine increasingly relies on metal-based drugs, and correspondingly growing in importance is the monitoring of the drugs and their metabolites in biological samples. Over the last decade, a range of analytical techniques have been developed in order to improve administration strategies for clinically approved compounds and to understand the pharmacokinetics, pharmacodynamics, and metabolism of new drugs, so as ultimately to make their clinical development more effective. This paper gives an overview of the various separation and detection methods, as well as common sample preparation strategies, currently in use to achieve these goals. The critical discussion of existing analytical technologies notably encompasses their detection capability, ability to handle biological matrices with minimum pretreatment, sample throughput, and cost efficiency. The main attention is devoted to those applications that have progressed to real-world biosamples, and selected examples are given to illustrate the overall performance and applicability of advanced analytical systems. Also emphasized is the emerging role of inductively coupled plasma mass spectrometry (ICP-MS), both as a standalone instrument (for determination of metals originating from drug compounds) and as an element-specific detector in combination with liquid chromatography or capillary electrophoresis (for drug metabolism studies). An increasing number of academic laboratories are using ICP-MS technology today, and this review focuses on the analytical possibilities of ICP-MS that may before long give the method the greatest impact on the clinical laboratory. PMID:21838702

  1. Development of tools and models for computational fracture assessment

    International Nuclear Information System (INIS)

    The aim of the work presented in this paper has been to develop and test new computational tools and theoretically more sound methods for fracture mechanics analysis. The applicability of the engineering integrity assessment system MASI for evaluation of piping components has been extended. The most important motivation for the theoretical development has been the well-known fundamental limitations in the validity of the J-integral, which limit its applicability in many important practical safety assessment cases. Examples are extensive plastic deformation, multimaterial structures and ascending loading paths (especially warm prestress, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to the prediction of ductile failure behaviour of cracked structures. (author)

  2. TENTube: A Video-based Connection Tool Supporting Competence Development

    Directory of Open Access Journals (Sweden)

    Albert A Angehrn

    2008-07-01

    Full Text Available The vast majority of knowledge management initiatives fail because they do not take sufficiently into account the emotional, psychological and social needs of individuals. Only if users see real value for themselves will they actively use and contribute their own knowledge to the system, and engage with other users. Connection dynamics can make this easier, and even enjoyable, by connecting people and bringing them closer through shared experiences such as playing a game together. A higher connectedness of people to other people, and to relevant knowledge assets, will motivate them to participate more actively and increase system usage. In this paper, we describe the design of TENTube, a video-based connection tool we are developing to support competence development. TENTube integrates rich profiling and network visualization and navigation with agent-enhanced game-like connection dynamics.

  3. Tools for tracking progress. Indicators for sustainable energy development

    International Nuclear Information System (INIS)

    A project on 'Indicators for Sustainable Energy Development (ISED)' was introduced by the IAEA as a part of its work programme on Comparative Assessment of Energy Sources for the biennium 1999-2000. It is being pursued by the Planning and Economic Studies Section of the Department of Nuclear Energy. The envisaged tasks are to: (1) identify the main components of sustainable energy development and derive a consistent set of appropriate indicators, keeping in view the indicators of Agenda 21, (2) establish the relationship of the ISED to the indicators of Agenda 21, and (3) review the Agency's databases and tools to determine the modifications required to apply the ISED. The first two tasks are being pursued with the help of experts from various international organizations and Member States. In this connection two expert group meetings were held, one in May 1999 and the other in November 1999. The following nine topics were identified as the key issues: social development; economic development; environmental congeniality and waste management; resource depletion; adequate provision of energy and disparities; energy efficiency; energy security; energy supply options; and energy pricing. A new conceptual framework model specifically tuned to the energy sector was developed, drawing upon work by other organizations in the environmental area. Within the framework of this conceptual model, two provisional lists of ISED - a full list and a core list - have been prepared. They cover indicators for the following energy-related themes and sub-themes under the economic, social and environmental dimensions of sustainable energy development: Economic dimension: Economic activity levels; End-use energy intensities of selected sectors and different manufacturing industries; energy supply efficiency; energy security; and energy pricing. Social dimension: Energy accessibility and disparities. Environmental dimension: Air pollution (urban air quality; global climate change concern); water

  4. On the development of the METAR family of inspection tools

    International Nuclear Information System (INIS)

    Since 1998, Hydro Quebec Research Centre (IREQ), in collaboration with Gentilly-2, has been working on the development of inspection devices for the feeder tubes of CANDU power plants. The first tool to come out of this work was the METAR bracelet, now used throughout the CANDU utilities; it consists of 14 ultrasonic probes held in a rigid bracelet to measure the thickness of the pipes, and it is moved manually along the pipe. Following the success of the METAR, a motorized version, the Crawler, was developed to inspect beyond the operator's arm's reach and access hard-to-reach places further down the pipes in the reactor. This new system has been tested at 3 different stations and will be commercially available soon. Finally, the same technology was used to develop a motorized 2-axis crack detection device to answer new concerns about the feeders. Other configurations could also be developed for specific inspection needs, depending on demand from the industry; for example, inspection of the Grayloc welds, 360° inspection of feeders, or multitasking inspection on a single frame. Most of the designs shown in this article have been or will be patented and are, or will be, licensed to a partner company to make them commercially available to the industry. This paper gives a brief history of the project and a description of the technologies developed in the last 5 years concerning feeder inspection. (author)

  5. Development of Nylon Based FDM Filament for Rapid Tooling Application

    Science.gov (United States)

    Singh, R.; Singh, S.

    2014-04-01

    There has been a critical need for the development of a cost-effective nylon-based wire to be used as feedstock filament for fused deposition modelling (FDM) machines. Hitherto, however, very little work has been reported on developing an alternative to the acrylonitrile butadiene styrene (ABS) wire presently used in most FDM machines. The present research work is focused on the development of nylon-based wire as an alternative to ABS wire, to be used as feedstock filament on FDM without changing any hardware or software of the machine. For the present study, aluminium oxide (Al2O3) was used as an additive in different proportions with nylon fibre. A single-screw extruder was used for wire preparation, and the wire thus produced was tested on FDM. The mechanical properties, i.e. tensile strength and percentage elongation, of the finally developed wire were optimized by the Taguchi L9 technique. The work represents a major development in reducing cost and time in rapid tooling applications.
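
    As a rough illustration of the Taguchi L9 analysis mentioned above (the factor names and response values below are invented, not the paper's data), a larger-is-better signal-to-noise ratio can be computed per run and averaged per factor level:

```python
# Illustrative Taguchi L9 analysis with made-up tensile-strength responses;
# the three factors (e.g. Al2O3 %, barrel temperature, screw speed) are hypothetical.
import numpy as np

# L9 orthogonal array: 3 factors at 3 levels each, 9 runs
l9 = np.array([[1, 1, 1], [1, 2, 2], [1, 3, 3],
               [2, 1, 2], [2, 2, 3], [2, 3, 1],
               [3, 1, 3], [3, 2, 1], [3, 3, 2]])
y = np.array([38.1, 40.2, 41.0, 39.5, 42.3, 40.8, 41.9, 43.0, 42.1])  # MPa, fake

sn = -10 * np.log10(1.0 / y**2)  # larger-is-better S/N (one response per run)

for factor in range(3):
    means = [sn[l9[:, factor] == level].mean() for level in (1, 2, 3)]
    best = int(np.argmax(means)) + 1
    print(f"factor {factor + 1}: mean S/N per level {np.round(means, 2)}, best level {best}")
```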

  6. Simulation of Territorial Development Based on Fiscal Policy Tools

    Directory of Open Access Journals (Sweden)

    Robert Brumnik

    2014-01-01

    Full Text Available Modern approaches to the development of a national economy are often characterized by the unbalanced expansion of some economic branches, leading to disproportional socioeconomic territorial development (SETD). Such disproportions, together with other similar factors, frequently result in a lack of economic integrity, various regional crises, and a low rate of economic and territorial growth. These disproportions may also lead to an inadequate degree of interregional collaboration. This paper proposes ways of regulating imbalances in territorial development based upon fiscal policy tools. Such tools can immediately reduce the amplitude of economic-cycle fluctuations and provide for stable development of the state economic system. The same approach is applied to control the processes of transformation of tax legislation and tax relations, as well as the levying and redistribution of the collected taxes among the territories' budgets (an approach also known as tax policy). To summarize, this paper describes comprehensive models of financial regulation of socioeconomic territorial development that can help in estimating and choosing the right financial policy parameters. These provide stable rates of growth of national economies along with a simultaneous decrease in interregional socioeconomic disproportions.

  7. Crosscutting Development- EVA Tools and Geology Sample Acquisition

    Science.gov (United States)

    2011-01-01

    Exploration to all destinations has at one time or another involved the acquisition and return of samples and context data. Gathered at the summit of the highest mountain, the floor of the deepest sea, or the ice of a polar surface, samples and their value (both scientific and symbolic) have been a mainstay of Earthly exploration. In manned spaceflight exploration, the gathering of samples and their contextual information has continued. With the extension of collecting activities to spaceflight destinations comes the need for geology tools and equipment uniquely designed for use by suited crew members in radically different environments from conventional field geology. Beginning with the first Apollo Lunar Surface Extravehicular Activity (EVA), EVA Geology Tools were successfully used to enable the exploration and scientific sample gathering objectives of the lunar crew members. These early designs were a step in the evolution of Field Geology equipment, and the evolution continues today. Contemporary efforts seek to build upon and extend the knowledge gained in not only the Apollo program but a wealth of terrestrial field geology methods and hardware that have continued to evolve since the last lunar surface EVA. This paper is presented with intentional focus on documenting the continuing evolution and growing body of knowledge for both engineering and science team members seeking to further the development of EVA Geology. Recent engineering development and field testing efforts of EVA Geology equipment for surface EVA applications are presented, including the 2010 Desert Research and Technology Studies (Desert RATs) field trial. An executive summary of findings will also be presented, detailing efforts recommended for exotic sample acquisition and pre-return curation development regardless of planetary or microgravity destination.

  8. Development of reactor design aid tool using virtual reality technology

    International Nuclear Information System (INIS)

    A new type of aid system for fusion reactor design, to which virtual reality (VR) visualization and sonification techniques are applied, has been developed. This system provides an intuitive interaction environment in the VR space between the observer and the designed objects constructed with a conventional 3D computer-aided design (CAD) system. We have applied the design aid tool to the heliotron-type fusion reactor design activity FFHR2m [A. Sagara, S. Imagawa, O. Mitarai, T. Dolan, T. Tanaka, Y. Kubota, et al., Improved structure and long-life blanket concepts for heliotron reactors, Nucl. Fusion 45 (2005) 258-263] on the virtual reality system CompleXcope [Y. Tamura, A. Kageyama, T. Sato, S. Fujiwara, H. Nakamura, Virtual reality system to visualize and auralize numerical simulation data, Comp. Phys. Comm. 142 (2001) 227-230] of the National Institute for Fusion Science, Japan, and have evaluated its performance. The tool includes functions for transfer of the observer, translation and scaling of the objects, recording of the operations, and checking of interference.

  9. Near infra red spectroscopy as a multivariate process analytical tool for predicting pharmaceutical co-crystal concentration.

    Science.gov (United States)

    Wood, Clive; Alwati, Abdolati; Halsey, Sheelagh; Gough, Tim; Brown, Elaine; Kelly, Adrian; Paradkar, Anant

    2016-09-10

    The use of near infra red spectroscopy to predict the concentration of two pharmaceutical co-crystals, 1:1 ibuprofen-nicotinamide (IBU-NIC) and 1:1 carbamazepine-nicotinamide (CBZ-NIC), has been evaluated. A partial least squares (PLS) regression model was developed for both co-crystal pairs using sets of standard samples to create calibration and validation data sets with which to build and validate the models. Parameters such as the root mean square error of calibration (RMSEC), root mean square error of prediction (RMSEP) and correlation coefficient were used to assess the accuracy and linearity of the models. Accurate PLS regression models were created for both co-crystal pairs which can be used to predict the co-crystal concentration in a powder mixture of the co-crystal and the active pharmaceutical ingredient (API). The IBU-NIC model had smaller errors than the CBZ-NIC model, possibly due to the complex CBZ-NIC spectra, which could reflect the different arrangement of hydrogen bonding associated with the co-crystal compared to the IBU-NIC co-crystal. These results suggest that NIR spectroscopy can be used as a PAT tool during a variety of pharmaceutical co-crystal manufacturing methods, and the presented data will facilitate future offline and in-line NIR studies involving pharmaceutical co-crystals.
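
    A minimal sketch of the PLS calibration workflow described above, using scikit-learn; the spectra and concentrations below are simulated stand-ins, not the published data:

```python
# PLS calibration/validation sketch with simulated NIR-like spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
concentration = rng.uniform(0, 100, size=60)        # % co-crystal in mixture
wavelengths = np.linspace(1100, 2500, 200)          # nm, typical NIR range
pure = np.exp(-((wavelengths - 1700) / 150) ** 2)   # fake co-crystal band
spectra = np.outer(concentration, pure) + rng.normal(0, 0.01, (60, 200))

X_cal, X_val, y_cal, y_val = train_test_split(spectra, concentration, random_state=0)
pls = PLSRegression(n_components=3).fit(X_cal, y_cal)

rmsec = np.sqrt(np.mean((pls.predict(X_cal).ravel() - y_cal) ** 2))
rmsep = np.sqrt(np.mean((pls.predict(X_val).ravel() - y_val) ** 2))
print(f"RMSEC={rmsec:.2f}  RMSEP={rmsep:.2f}")
```

    In the study's terms, RMSEC and RMSEP quantify calibration and prediction error respectively; the concentration of an unknown mixture is then read off the model's prediction for its spectrum.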

  10. Requirements for Product Development Self-Assessment Tools

    OpenAIRE

    Knoblinger, Christoph; Oehmen, Josef; Rebentisch, Eric; Seering, Warren; Helten, Katharina

    2011-01-01

    The successful execution of complex PD projects still poses major challenges for companies. One approach companies can use to improve their performance is self-assessment tools to optimize their organization and processes. This paper investigates the requirements regarding self-assessment tools for PD organizations. It summarizes the current literature on PD-related self-assessment tools and derives tool requirements from an industry focus group (US aerospace and defense industry) as well as ...

  11. Development of analytical techniques for the characterization of natural and anthropogenic compounds in fine particulate matter

    OpenAIRE

    Piazzalunga,

    2007-01-01

    Aerosol is of central importance for atmospheric chemistry and physics, for the biosphere, the climate and public health. The primary parameters that determine the environmental and health effects of aerosol particles are their concentration and chemical composition. In this work we have developed the analytical techniques to study particulate matter composition. The knowledge of PM composition can be useful to identify the main PM sources, the health risk and the formation or depositio...

  12. DEVELOPMENT OF ANALYTICAL METHODS IN METABOLOMICS FOR THE STUDY OF HEREDITARY AND ACQUIRED GENETIC DISEASE

    OpenAIRE

    Arvonio, Raffaele

    2011-01-01

    METABOLOMICS AND MASS SPECTROMETRY. The research project takes place in the branch of metabolomics, which involves the systematic study of the metabolites present in a cell; in this area MS, thanks to its ability to carry out controlled fragmentation experiments, plays a key role as a methodology for the identification of various metabolites. The thesis work is focused on the development of analytical methods for the diagnosis of metabolic diseases and is divided as follows: ...

  13. The Development of Bio-Analytical Techniques for the Treatment of Psoriasis and Related Skin Disorders.

    OpenAIRE

    Hollywood, Katherine

    2010-01-01

    Abstract: The University of Manchester, Katherine Anne Hollywood, June 2010, Degree of Doctor of Philosophy in the Faculty of Engineering and Physical Sciences. The Development of Bio-Analytical Techniques for the Treatment of Psoriasis and Related Skin Disorders. In this investigation a number of post-genomic technologies have been applied to study the dermatological disorders of psoriasis and keloid disease. In spite of considerable research focus on these diseases the pathogenesis remains unclear an...

  14. Programming of development of commune with utilization of analytic hierarchic process

    Directory of Open Access Journals (Sweden)

    Aleksandra Łuczak

    2010-01-01

    Full Text Available The paper is a trial application of the analytic hierarchy process to work out development scenarios for a rural commune of Wielkopolska province. The proposed method consists in building a hierarchical scheme. The scheme covers the general goal, which is best ensured by multi-functional development of the administrative district, the specific (basic) goals and, within each goal, a package of activities that formed the basis for working out development scenarios for the commune. This made it possible to choose the best scenario for the administrative district.
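
    The weighting step of the analytic hierarchy process used above follows Saaty's eigenvector method; a minimal sketch with a hypothetical pairwise comparison matrix for three goals:

```python
# AHP weighting sketch: priority weights from the principal eigenvector of a
# pairwise comparison matrix, plus a consistency check (Saaty's method).
import numpy as np

A = np.array([[1, 3, 5],        # hypothetical pairwise judgments for three goals
              [1/3, 1, 2],
              [1/5, 1/2, 1]], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)   # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]    # random index (Saaty)
print("weights:", weights.round(3), "CR:", round(ci / ri, 3))  # CR < 0.1 is acceptable
```

    Scenario packages would then be scored against the weighted goals, which is the step that yields the "best scenario" referred to above.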

  15. Development of a simple estimation tool for LMFBR construction cost

    Energy Technology Data Exchange (ETDEWEB)

    Yoshida, Kazuo; Kinoshita, Izumi [Central Research Inst. of Electric Power Industry, Komae, Tokyo (Japan). Komae Research Lab]

    1999-05-01

    A simple tool for estimating the construction costs of liquid-metal-cooled fast breeder reactors (LMFBRs), 'Simple Cost', was developed in this study. Simple Cost is based on a new estimation formula that can reduce the amount of design data required to estimate construction costs. Consequently, Simple Cost can be used to estimate the construction costs of innovative LMFBR concepts for which detailed design has not been carried out. The results of test calculations show that Simple Cost provides cost estimations equivalent to those obtained with conventional methods within the range of plant power from 325 to 1500 MWe. Sensitivity analyses for typical design parameters were conducted using Simple Cost. The effects of four major parameters - reactor vessel diameter, core outlet temperature, sodium handling area and number of secondary loops - on the construction costs of LMFBRs were evaluated quantitatively. The results show that the reduction of sodium handling area is particularly effective in reducing construction costs. (author)
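
    The actual Simple Cost formula is not reproduced in the abstract; purely as an illustration of parametric cost estimation of this general kind, a power-law scaling on plant power with multiplicative adjustment factors might look like:

```python
# Generic parametric cost-scaling sketch (NOT the Simple Cost formula, which
# is not given in the abstract); reference values and exponents are invented.
def construction_cost(power_mwe, vessel_diam_m, n_secondary_loops,
                      ref_cost=2500.0, ref_power=1000.0, scale_exp=0.6):
    """Estimate relative construction cost in arbitrary units."""
    base = ref_cost * (power_mwe / ref_power) ** scale_exp  # economy of scale
    base *= 1.0 + 0.05 * (vessel_diam_m - 10.0)             # vessel-size penalty
    base *= 1.0 + 0.08 * (n_secondary_loops - 2)            # loop-count penalty
    return base

for p in (325, 660, 1500):
    print(p, "MWe:", round(construction_cost(p, 10.0, 2), 1))
```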

  16. Developing a Grid-based search and categorization tool

    CERN Document Server

    Haya, Glenn; Vigen, Jens

    2003-01-01

    Grid technology has the potential to improve the accessibility of digital libraries. The participants in Project GRACE (Grid Search And Categorization Engine) are in the process of developing a search engine that will allow users to search through heterogeneous resources stored in geographically distributed digital collections. What differentiates this project from current search tools is that GRACE will be run on the European Data Grid, a large distributed network, and will not have a single centralized index as current web search engines do. In some cases, the distributed approach offers advantages over the centralized approach since it is more scalable, can be used on otherwise inaccessible material, and can provide advanced search options customized for each data source.

  17. Developing an integration tool for soil contamination assessment

    Science.gov (United States)

    Anaya-Romero, Maria; Zingg, Felix; Pérez-Álvarez, José Miguel; Madejón, Paula; Kotb Abd-Elmabod, Sameh

    2015-04-01

    In the last decades, huge soil areas have been negatively influenced or altered in multiple forms. Soils and, consequently, underground water have been contaminated by the accumulation of contaminants from agricultural activities (fertilizers and pesticides), industrial activities (harmful material dumping, sludge, fly ash) and urban activities (hydrocarbons, metals from vehicle traffic, urban waste dumping). In the framework of the RECARE project, local partners across Europe are focusing on a wide range of soil threats, such as soil contamination, and aiming to develop effective prevention, remediation and restoration measures by designing and applying targeted land management strategies (van Lynden et al., 2013). In this context, the Guadiamar Green Corridor (Southern Spain) was used as a case study, aiming to obtain soil data and new information in order to assess soil contamination. The main threat in the Guadiamar valley is soil contamination after a mine spill that occurred in April 1998. About four hm3 of acid waters and two hm3 of mud, rich in heavy metals, were released into the Agrio and Guadiamar rivers, affecting more than 4,600 ha of agricultural and pasture land. The main trace elements contaminating soil and water were As, Cd, Cu, Pb, Tl and Zn. The objective of the present research is to develop informatics tools that integrate soil databases, models and interactive platforms for soil contamination assessment. Preliminary results were obtained relating to the compilation of harmonized databases including geographical, hydro-meteorological, soil and socio-economic variables based on spatial analysis and stakeholder consultation. Further research will address modelling and upscaling at the European level, in order to obtain a scientific-technical predictive tool for the assessment of soil contamination.

  18. Development of analytical methodologies to assess recalcitrant pesticide bioremediation in biobeds at laboratory scale.

    Science.gov (United States)

    Rivero, Anisleidy; Niell, Silvina; Cerdeiras, M Pía; Heinzen, Horacio; Cesio, María Verónica

    2016-06-01

    To assess recalcitrant pesticide bioremediation it is necessary to gradually increase the complexity of the biological system used in order to design an effective biobed assembly. Each step towards this effective biobed design needs a suitable, validated analytical methodology that allows a correct evaluation of the dissipation and bioconversion. Low-recovery-yielding methods could give a false idea of a successful biodegradation process. To address this situation, different methods were developed and validated for the simultaneous determination of endosulfan, its main three metabolites, and chlorpyrifos in increasingly complex matrices where the bioconvertor basidiomycete Abortiporus biennis could grow. The matrices were culture media, bran, and finally a laboratory biomix composed of bran, peat and soil. The methodology for the analysis of the first evaluated matrix has already been reported. The methodologies developed for the other two systems are presented in this work. The targeted analytes were extracted from fungi growing over bran in semisolid YNB (Yeast Nitrogen Base) media with acetonitrile using shaker-assisted extraction. The salting-out step was performed with MgSO4 and NaCl, and the extracts were analyzed by GC-ECD. The best methodology was fully validated for all the evaluated analytes at 1 and 25 mg kg(-1), yielding recoveries between 72% and 109% with acceptable RSDs. The methodology proved that A. biennis is able to dissipate 94% of endosulfan and 87% of chlorpyrifos after 90 days. Having assessed that A. biennis growing over bran can metabolize the studied pesticides, the next step was the development and validation of an analytical procedure to evaluate the analytes in a laboratory-scale biobed composed of 50% bran, 25% peat and 25% soil together with fungal mycelium. Of the different procedures assayed, only ultrasound-assisted extraction with ethyl acetate allowed recoveries between 80% and 110% with RSDs <18%. Linearity, recovery, precision and matrix effects were also evaluated.
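
    The recovery and RSD figures quoted above are standard method-validation statistics; for hypothetical spiked-replicate data they are computed as:

```python
# Recovery (%) and relative standard deviation (RSD, %) for a spiked matrix,
# as used in the validation described above; the numbers here are invented.
import numpy as np

spiked_level = 1.0                                    # mg/kg
measured = np.array([0.91, 0.88, 0.95, 0.86, 0.90])  # mg/kg, five replicates

recovery = measured.mean() / spiked_level * 100.0
rsd = measured.std(ddof=1) / measured.mean() * 100.0
print(f"recovery = {recovery:.0f}%  RSD = {rsd:.1f}%")  # typical limits: 70-120%, RSD < 20%
```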

  19. Development of an All-Purpose Free Photogrammetric Tool

    Science.gov (United States)

    González-Aguilera, D.; López-Fernández, L.; Rodriguez-Gonzalvez, P.; Guerrero, D.; Hernandez-Lopez, D.; Remondino, F.; Menna, F.; Nocerino, E.; Toschi, I.; Ballabeni, A.; Gaiani, M.

    2016-06-01

    Photogrammetry is currently facing some challenges and changes mainly related to automation, ubiquitous processing and variety of applications. Within an ISPRS Scientific Initiative a team of researchers from USAL, UCLM, FBK and UNIBO have developed an open photogrammetric tool, called GRAPHOS (inteGRAted PHOtogrammetric Suite). GRAPHOS allows one to obtain dense and metric 3D point clouds from terrestrial and UAV images. It encloses robust photogrammetric and computer vision algorithms with the following aims: (i) increase automation, allowing dense 3D point clouds to be obtained through a friendly and easy-to-use interface; (ii) increase flexibility, working with any type of images, scenarios and cameras; (iii) improve quality, guaranteeing high accuracy and resolution; (iv) preserve photogrammetric reliability and repeatability. Last but not least, GRAPHOS also has an educational component, reinforced with didactical explanations about the algorithms and their performance. The developments were carried out at different levels: GUI realization, image pre-processing, photogrammetric processing with weight parameters, dataset creation and system evaluation. The paper will present in detail the developments of GRAPHOS with all its photogrammetric components and the evaluation analyses based on various image datasets. GRAPHOS is distributed for free for research and educational needs.

  1. Analytical solution of electromagnetic field generated by induction logging tool in a fan-ring slot of drill collar while drilling

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Based on the structural characteristics of the metal drill collar for induction logging while drilling, we give the analytical formulae for the longitudinal fields Ez and Hz when the tool is located in a fan-ring shaped slot of the drill collar, using the boundary conditions of the electromagnetic field, and derive the other components of the electromagnetic field inside and outside the fan-ring slot from Ez and Hz. In the other intervals of the formation, where the drill collar is a solid cylinder, the analytical formulae for the field are derived through the method of variable coefficients. The complete analytical solutions of the field in the whole space have been obtained. With the help of the analytical formulae, we also give numerical examples and analyze the distribution characteristics of the electromagnetic field. From the computational results we find that the secondary scattering field Hz is in a linear relation with the conductivity of the stratum. This characteristic of the field is very useful for induction logging while drilling, since it can be used to measure and analyze the logging responses of the stratum conductivity. This paper sets up a theoretical foundation for studying the distributions of the field and for directing the design of logging instruments.

  2. The development of a two-component force dynamometer and tool control system for dynamic machine tool research

    Science.gov (United States)

    Sutherland, I. A.

    1973-01-01

    The development of a tooling system is presented that produces a controlled sinusoidal oscillation simulating a dynamic chip removal condition. It also measures the machining forces in two mutually perpendicular directions without any cross sensitivity.

  3. Medicaid Analytic eXtract (MAX) Chartbooks

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Analytic eXtract Chartbooks are research tools and reference guides on Medicaid enrollees and their Medicaid experience in 2002 and 2004. Developed for...

  4. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    Energy Technology Data Exchange (ETDEWEB)

    Tulsa Fluid Flow

    2008-08-31

    The development of fields in deep waters (5000 ft and more) is a common occurrence. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil and water, possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery from design to operation. Recovery from deep waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of the production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). For any multiphase separation technique employed at topside, seabed or bottom-hole, it is crucial to know inlet conditions such as the flow rates, flow patterns, and volume fractions of gas, oil and water coming into the separation devices. The overall objective was to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict the flow characteristics such as flow patterns, phase distributions, and pressure gradient encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). The project was conducted in two periods. In Period 1 (four years), gas-oil-water flow in pipes was investigated to understand the fundamental physical mechanisms describing the interaction between the gas, oil and water phases under flowing conditions, and a unified model was developed utilizing a novel modeling approach. A gas-oil-water pipe flow database including field and laboratory data was formed in Period 2 (one year). The database was utilized in model performance demonstration. Period 1 primarily consisted of the development of a unified model and software to predict gas-oil-water flow, and experimental studies of the gas-oil-water project, including flow behavior description and
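
    The unified model itself is far beyond an abstract-level sketch, but the simplest baseline it improves upon, a homogeneous no-slip pressure-gradient estimate for three-phase flow, illustrates the inputs involved (all values below are hypothetical):

```python
# Homogeneous no-slip pressure-gradient baseline for three-phase pipe flow;
# a far simpler calculation than the unified model described above.
import math

def dpdx_homogeneous(q_g, q_o, q_w, rho, d, theta_deg, mu_m=5e-4, g=9.81):
    """q_*: volumetric rates (m^3/s); rho: (gas, oil, water) densities (kg/m^3)."""
    area = math.pi * d**2 / 4
    q_total = q_g + q_o + q_w
    frac = [q / q_total for q in (q_g, q_o, q_w)]     # no-slip volume fractions
    rho_m = sum(f * r for f, r in zip(frac, rho))     # mixture density
    v_m = q_total / area                              # mixture velocity
    re = rho_m * v_m * d / mu_m
    f_d = 0.3164 / re**0.25                           # Blasius friction factor (turbulent)
    friction = f_d * rho_m * v_m**2 / (2 * d)
    gravity = rho_m * g * math.sin(math.radians(theta_deg))
    return friction + gravity                         # Pa/m

print(round(dpdx_homogeneous(0.02, 0.01, 0.005, (50, 850, 1000), 0.1, 10), 1))
```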

  5. Laser metrology — a diagnostic tool in automotive development processes

    Science.gov (United States)

    Beeck, Manfred-Andreas; Hentschel, Werner

    2000-08-01

    Laser measurement techniques are widely used in automotive development processes. Applications at Volkswagen are presented where laser metrology works as a diagnostic tool for analysing and optimising complex coupled processes inside and between automotive components and structures, such as the reduction of a vehicle's interior or outer acoustic noise, including brake noise, and combustion analysis for diesel and gasoline engines to further reduce fuel consumption and pollution. Pulsed electronic speckle pattern interferometry (ESPI) and holographic interferometry are used for analysing the knocking behaviour of modern engines and for correct positioning of knocking sensors. Holographic interferometry shows up the vibrational behaviour of brake components and their interaction during braking, and allows optimisation for noise-free brake systems. Scanning laser vibrometry analyses structure-borne noise of a whole car body for the optimisation of its interior acoustical behaviour. Modern engine combustion concepts such as in direct-injection (DI) gasoline and diesel engines benefit from laser diagnostic tools which permit deeper insight into the in-cylinder processes such as flow generation, fuel injection and spray formation, atomisation and mixing, ignition and combustion, and formation and reduction of pollutants. The necessary optical access inside a cylinder is realised by so-called 'transparent engines' allowing measurements during nearly the whole engine cycle. Measurement techniques and results on double-pulse particle image velocimetry (PIV) with a frequency-doubled YAG laser for in-cylinder flow analysis are presented, as well as Mie scattering on droplets using a copper vapour laser combined with high-speed filming, and laser-induced fluorescence (LIF) with an excimer laser for spray and fuel vapour analysis.

  6. A numerical tool for the development of solar wood dryers

    Energy Technology Data Exchange (ETDEWEB)

    Oueslati, M.M. [Centre de Recherche et des Technologies de l'Energie, Hammam Lif (Tunisia). Laboratoire Energetique et procedes Thermiques]; Guellouz, M.S. [Ecole Nationale d'Ingenieurs de Monastir, Monastir (Tunisia). Laboratoire d'Etudes des Systemes Thermiques et Energetiques]

    2010-07-01

    In order to reduce the energy demand associated with wood drying, conventional fossil fuels can be substituted with renewable energy. Solar energy is an appropriate alternative in Tunisia, where the wood furniture industry is dominated by small artisans. In order to improve product quality, small scale affordable wood dryers have to be made accessible to these artisans. The overall objective of this study was to minimize energy consumption of industrial wood drying and offer small businesses access to low cost solar dryers. The amount of energy required to evaporate water from wood and the drying time length depend on the type of equipment and technology used as well as the wood type and thickness. In order to design an optimal solar dryer, a numerical tool was developed using Fortran 90 to simulate the drying of a wood stack. The numerical tool predicts the time evolution of the wood and air properties and calculates the energy needed for the process. It also describes the space-time variation of heat and mass transport in the wood, in the drying air and at their interface. A finite difference scheme was used to discretize the equations. When applied in a representative case of drying a 2 cubic metre wood stack, the model highlighted the advantages of using industry established drying schedules rather than using constant temperature drying air. It also illustrated the advantages of airflow reversals to uniformly distribute the moisture content in the wood pile. The results from this study were used to estimate the required area of solar air collectors to provide the drying thermal energy. The study showed that using a drying schedule with an hourly flow reversal improved the drying quality and shortened the drying time. 11 refs., 1 tab., 10 figs.
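
    As a much-reduced sketch of the kind of finite-difference calculation such a tool performs, 1-D moisture diffusion across the board thickness can be discretized explicitly; the diffusivity and boundary treatment below are placeholders, not the paper's model:

```python
# Minimal explicit finite-difference sketch of 1-D moisture diffusion in a
# board; D, geometry and boundary values are placeholders, not the tool's model.
import numpy as np

D = 1e-9             # m^2/s, effective moisture diffusivity (assumed constant)
L = 0.05             # m, board thickness
nx, dt = 51, 400.0   # grid points, time step (s)
dx = L / (nx - 1)
assert D * dt / dx**2 < 0.5  # explicit stability criterion

m = np.full(nx, 0.60)        # initial moisture content (kg/kg, dry basis)
m_surface = 0.12             # equilibrium moisture content at the surfaces

for _ in range(int(30 * 24 * 3600 / dt)):    # ~30 days of drying
    m[0] = m[-1] = m_surface                 # Dirichlet surfaces (simplified)
    m[1:-1] += D * dt / dx**2 * (m[2:] - 2 * m[1:-1] + m[:-2])

print(f"mean moisture after 30 days: {m.mean():.3f} kg/kg")
```

    The paper's tool goes well beyond this sketch, coupling heat and mass transfer, drying schedules, and airflow reversals across the stack.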

  7. Developing Anticipatory Life Cycle Assessment Tools to Support Responsible Innovation

    Science.gov (United States)

    Wender, Benjamin

    Several prominent research strategy organizations recommend applying life cycle assessment (LCA) early in the development of emerging technologies. For example, the US Environmental Protection Agency, the National Research Council, the Department of Energy, and the National Nanotechnology Initiative identify the potential for LCA to inform research and development (R&D) of photovoltaics and products containing engineered nanomaterials (ENMs). In this capacity, application of LCA to emerging technologies may contribute to the growing movement for responsible research and innovation (RRI). However, existing LCA practices are largely retrospective and ill-suited to support the objectives of RRI. For example, barriers related to data availability, rapid technology change, and isolation of environmental from technical research inhibit application of LCA to developing technologies. This dissertation focuses on development of anticipatory LCA tools that incorporate elements of technology forecasting, provide robust explorations of uncertainty, and engage diverse innovation actors in overcoming retrospective approaches to environmental assessment and improvement of emerging technologies. Chapter one contextualizes current LCA practices within the growing literature articulating RRI and identifies the optimal place in the stage gate innovation model to apply LCA. Chapter one concludes with a call to develop anticipatory LCA---building on the theory of anticipatory governance---as a series of methodological improvements that seek to align LCA practices with the objectives of RRI. Chapter two provides a framework for anticipatory LCA, identifies where research from multiple disciplines informs LCA practice, and builds off the recommendations presented in the preceding chapter. Chapter two focuses on crystalline and thin film photovoltaics (PV) to illustrate the novel framework, in part because PV is an environmentally motivated technology undergoing extensive R&D efforts and

  8. Development and assessment of the Alberta Context Tool

    Directory of Open Access Journals (Sweden)

    Birdsell Judy M

    2009-12-01

    Full Text Available Abstract Background The context of healthcare organizations such as hospitals is increasingly accepted as having the potential to influence the use of new knowledge. However, the mechanisms by which the organizational context influences evidence-based practices are not well understood. Current measures of organizational context lack a theory-informed approach, lack construct clarity, and generally have modest psychometric properties. This paper presents the development and initial psychometric validation of the Alberta Context Tool (ACT), an eight-dimension measure of organizational context for healthcare settings. Methods Three principles guided the development of the ACT: substantive theory, brevity, and modifiability. The Promoting Action on Research Implementation in Health Services (PARiHS) framework and related literature were used to guide the selection of items in the ACT. The ACT was required to be brief enough to be tolerated in busy and resource-stretched work settings and to assess concepts of organizational context that were potentially modifiable. The English version of the ACT was completed by 764 nurses (752 valid responses) working in seven Canadian pediatric care hospitals as part of its initial validation. Cronbach's alpha, exploratory factor analysis, analysis of variance, and tests of association were used to assess instrument reliability and validity. Results Factor analysis indicated a 13-factor solution (accounting for 59.26% of the variance in 'organizational context'). The composition of the factors was similar to those originally conceptualized. Cronbach's alpha for the 13 factors ranged from .54 to .91, with 4 factors performing below the commonly accepted alpha cut-off of .70. Bivariate associations between instrumental research utilization levels (which the ACT was developed to predict) and the ACT's 13 factors were statistically significant at the 5% level for 12 of the 13 factors. Each factor also showed a trend of
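
    Cronbach's alpha, used above to assess the reliability of each factor, is straightforward to compute from item-level responses; a sketch with simulated Likert-type data, since the survey data themselves are not available here:

```python
# Cronbach's alpha for one ACT-style factor; responses are simulated 5-point
# Likert data, since the actual survey data are not available here.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, n_items) matrix of scored responses."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(size=(500, 1))   # shared construct driving all items
responses = np.clip(np.round(3 + latent + rng.normal(0, 0.8, (500, 4))), 1, 5)
print(f"alpha = {cronbach_alpha(responses):.2f}")  # .70+ is the usual cut-off
```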

  9. Analytical Method Development and Validation of Related Substance Method for Bortezomib for Injection 3.5 mg/Vial by RP-HPLC Method

    Directory of Open Access Journals (Sweden)

    Utage M

    2013-04-01

    Full Text Available An accurate, precise, simple and economical high performance liquid chromatographic method for the related substance determination of bortezomib in its lyophilized dosage form has been developed. The method developed is a reverse phase high performance liquid chromatographic method using a Hypersil BDS C18 column (length: 150 mm, diameter: 4.6 mm, particle size: 5 μm) with a gradient programme and a simple mobile phase A of acetonitrile, water and formic acid in the ratio of 30:70:0.1 (v/v/v) and mobile phase B of acetonitrile, water and formic acid in the ratio of 80:20:0.1 (v/v/v). The method so developed was validated in compliance with the regulatory guidelines using a well-developed analytical method validation tool which comprises the analytical method validation parameters Linearity, Accuracy, Method precision, Specificity with forced degradation, System suitability, Robustness, LOD, LOQ and Ruggedness. The results obtained were well within the acceptance criteria.

  10. Analytical Quality by Design Approach in RP-HPLC Method Development for the Assay of Etofenamate in Dosage Forms

    OpenAIRE

    Peraman, R.; Bhadraya, K.; Reddy, Y. Padmanabha; Reddy, C. Surayaprakash; Lokesh, T.

    2015-01-01

    By considering the current regulatory requirements for analytical method development, a reversed phase high performance liquid chromatographic method for routine analysis of etofenamate in dosage form has been optimized using an analytical quality by design approach. Unlike the routine approach, the present study was initiated with an understanding of the quality target product profile, the analytical target profile and a risk assessment for method variables that affect the method response. A liquid chromatogr...

  11. Knowledge based process development of bobbin tool friction stir welding

    OpenAIRE

    Hilgert, Jakob

    2012-01-01

    Over the last twenty years Friction Stir Welding (FSW) has proven to be a very promising new joining technique. Especially high strength aluminium alloys can be welded with large advantages as compared to conventional fusion welding processes. For some joint configurations and desired applications bobbin tool welding is a process variant that can circumvent limitations arising from the high process forces in conventional tool FSW. As bobbin tools are highly mechanically loaded, in-depth under...

  12. The Bristol Radiology Report Assessment Tool (BRRAT): Developing a workplace-based assessment tool for radiology reporting skills

    International Nuclear Information System (INIS)

    Aim: To review the development of a workplace-based assessment tool to assess the quality of written radiology reports and assess its reliability, feasibility, and validity. Materials and methods: A comprehensive literature review and rigorous Delphi study enabled the development of the Bristol Radiology Report Assessment Tool (BRRAT), which consists of 19 questions and a global assessment score. Three assessors applied the assessment tool to 240 radiology reports provided by 24 radiology trainees. Results: The reliability coefficient for the 19 questions was 0.79 and the equivalent coefficient for the global assessment scores was 0.67. Generalizability coefficients demonstrate that higher numbers of assessors and assessments are needed to reach acceptable levels of reliability for summative assessments, owing to assessor subjectivity. Conclusion: The study methodology gives good validity and a strong foundation in best practice. The assessment tool developed for radiology reporting is reliable and most suited to formative assessments.

  13. A Participatory Approach to Develop the Power Mobility Screening Tool and the Power Mobility Clinical Driving Assessment Tool

    Directory of Open Access Journals (Sweden)

    Deepan C. Kamaraj

    2014-01-01

    Full Text Available The electric powered wheelchair (EPW is an indispensable assistive device that increases participation among individuals with disabilities. However, due to lack of standardized assessment tools, developing evidence based training protocols for EPW users to improve driving skills has been a challenge. In this study, we adopt the principles of participatory research and employ qualitative methods to develop the Power Mobility Screening Tool (PMST and Power Mobility Clinical Driving Assessment (PMCDA. Qualitative data from professional experts and expert EPW users who participated in a focus group and a discussion forum were used to establish content validity of the PMCDA and the PMST. These tools collectively could assess a user’s current level of bodily function and their current EPW driving capacity. Further multicenter studies are necessary to evaluate the psychometric properties of these tests and develop EPW driving training protocols based on these assessment tools.

  14. Analytical development and optimization of a graphene-solution interface capacitance model.

    Science.gov (United States)

    Karimi, Hediyeh; Rahmani, Rasoul; Mashayekhi, Reza; Ranjbari, Leyla; Shirdel, Amir H; Haghighian, Niloofar; Movahedi, Parisa; Hadiyan, Moein; Ismail, Razali

    2014-01-01

    Graphene, a new carbon material that shows great potential for a range of applications because of its exceptional electronic and mechanical properties, has attracted much attention in recent years. The use of graphene in nanoscale devices plays an important role in achieving more accurate and faster devices. Although there are many experimental studies in this area, there is a lack of analytical models. Quantum capacitance, one of the important properties of field effect transistors (FETs), is our focus. The quantum capacitance of electrolyte-gated transistors (EGFETs), along with a relevant equivalent circuit, is suggested in terms of Fermi velocity, carrier density, and fundamental physical quantities. The analytical model is compared with experimental data and the mean absolute percentage error (MAPE) is calculated to be 11.82. In order to decrease the error, a new function of E composed of α and β parameters is suggested. In another attempt, the ant colony optimization (ACO) algorithm is implemented for optimization and development of an analytical model to obtain a more accurate capacitance model. To further confirm this viewpoint, based on the given results, the accuracy of the optimized model is more than 97%, which is in an acceptable range of accuracy. PMID:24991496
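
    The error metric quoted above (MAPE = 11.82) is the standard mean absolute percentage error; for hypothetical measured-versus-modelled capacitance values it would be computed as:

```python
# Mean absolute percentage error (MAPE) between measured and modelled
# quantum capacitance; the values below are invented placeholders.
import numpy as np

measured = np.array([1.9, 2.4, 3.1, 3.8, 4.6])   # uF/cm^2, hypothetical
modelled = np.array([2.1, 2.2, 3.4, 4.1, 4.3])

mape = np.mean(np.abs((measured - modelled) / measured)) * 100.0
print(f"MAPE = {mape:.2f}%")   # the paper reports 11.82 before optimization
```

    The ACO step described in the abstract would then search the α and β parameter space to minimize this quantity.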

  15. Geochemical fingerprinting: 40 years of analytical development and real world applications

    International Nuclear Information System (INIS)

    Geochemical fingerprinting is a rapidly expanding discipline in the earth and environmental sciences. It is anchored in the recognition that geological processes leave behind chemical and isotopic patterns in the rock record. Many of these patterns, informally referred to as geochemical fingerprints, differ only in fine detail from each other. For this reason, the approach of fingerprinting requires analytical data of very high precision and accuracy. It is not surprising that the advancement of geochemical fingerprinting occurred alongside progress in geochemical analysis techniques. In this brief treatment, a subjective selection of drivers behind the analytical progress and its implications for geochemical fingerprinting are discussed. These include the impact of the Apollo lunar sample return program on quality of geochemical data and its push towards minimizing required sample volumes. The advancement of in situ analytical techniques is also identified as a major factor that has enabled geochemical fingerprinting to expand into a larger variety of fields. For real world applications of geochemical fingerprinting, in which large sample throughput, reasonable cost, and fast turnaround are key requirements, the improvements to inductively-coupled-plasma quadrupole mass spectrometry were paramount. The past 40 years have witnessed how geochemical fingerprinting has found its way into everyday applications. This development is cause for celebrating the 40 years of existence of the IAGC.

  16. Developing the role of big data and analytics in health professional education.

    Science.gov (United States)

    Ellaway, Rachel H; Pusic, Martin V; Galbraith, Robert M; Cameron, Terri

    2014-03-01

    As we capture more and more data about learners, their learning, and the organization of their learning, our ability to identify emerging patterns and to extract meaning grows exponentially. The insights gained from the analyses of these large amounts of data are only helpful to the extent that they can be the basis for positive action such as knowledge discovery, improved capacity for prediction, and anomaly detection. Big Data involves the aggregation and melding of large and heterogeneous datasets while education analytics involves looking for patterns in educational practice or performance in single or aggregate datasets. Although it seems likely that the use of education analytics and Big Data techniques will have a transformative impact on health professional education, there is much yet to be done before they can become part of mainstream health professional education practice. If health professional education is to be accountable for how its programs are run and developed, then health professional educators will need to be ready to deal with the complex and compelling dynamics of analytics and Big Data. This article provides an overview of these emerging techniques in the context of health professional education.

  17. CRMS vegetation analytical team framework: Methods for collection, development, and use of vegetation response variables

    Science.gov (United States)

    Cretini, Kari F.; Visser, Jenneke M.; Krauss, Ken W.; Steyer, Gregory D.

    2011-01-01

    This document identifies the main objectives of the Coastwide Reference Monitoring System (CRMS) vegetation analytical team, which are to provide (1) collection and development methods for vegetation response variables and (2) the ways in which these response variables will be used to evaluate restoration project effectiveness. The vegetation parameters (that is, response variables) collected in CRMS and other coastal restoration projects funded under the Coastal Wetlands Planning, Protection and Restoration Act (CWPPRA) are identified, and the field collection methods for these parameters are summarized. Existing knowledge on community and plant responses to changes in environmental drivers (for example, flooding and salinity) from published literature and from the CRMS and CWPPRA monitoring dataset are used to develop a suite of indices to assess wetland condition in coastal Louisiana. Two indices, the floristic quality index (FQI) and a productivity index, are described for herbaceous and forested vegetation. The FQI for herbaceous vegetation is tested with a long-term dataset from a CWPPRA marsh creation project. Example graphics for this index are provided and discussed. The other indices, an FQI for forest vegetation (that is, trees and shrubs) and productivity indices for herbaceous and forest vegetation, are proposed but not tested. New response variables may be added or current response variables removed as data become available and as our understanding of restoration success indicators develops. Once indices are fully developed, each will be used by the vegetation analytical team to assess and evaluate CRMS/CWPPRA project and program effectiveness. The vegetation analytical team plans to summarize its results in the form of written reports and/or graphics and present these items to CRMS Federal and State sponsors, restoration project managers, landowners, and other data users for their input.
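
    The floristic quality index itself is not defined in the abstract; in its widely used textbook form, which may differ from the CRMS variant, it combines species' coefficients of conservatism with richness:

```python
# Widely used floristic quality index (FQI) formula (possibly not the exact
# CRMS variant), with hypothetical coefficients of conservatism (C values).
import math

c_values = {"Spartina alterniflora": 4, "Phragmites australis": 1,
            "Juncus roemerianus": 5, "Distichlis spicata": 4}

mean_c = sum(c_values.values()) / len(c_values)
fqi = mean_c * math.sqrt(len(c_values))   # FQI = C-bar * sqrt(species richness)
print(f"mean C = {mean_c:.2f}, FQI = {fqi:.2f}")
```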

  18. The interaction of syntax, semantics & pragmatics in grammars: the development of analytic tools in modern linguistics

    Directory of Open Access Journals (Sweden)

    Robert D. Van Valin Junior

    2001-02-01

    Full Text Available

    One of the primary tasks facing a grammatical theory is to capture the interaction of syntax, semantics and pragmatics in linguistic systems. This is essential if linguistic theory is to explain the communicative functions of grammatical structures in particular languages and across languages. The questions which must be answered include: what is the appropriate universally valid representation for syntactic structure? what would be an adequate representation of crucial aspects of the semantics of propositions? how can discourse-pragmatic information be represented in a grammatically relevant way? and, most important, how do these different representations interact with each other? In this paper answers to these questions will be given in terms of Role and Reference Grammar (Van Valin, 1993; Van Valin & LaPolla, 1997).

  19. Integrating Fourth-Generation Tools Into the Applications Development Environment.

    Science.gov (United States)

    Litaker, R. G.; And Others

    1985-01-01

    Much of the power of the "information center" comes from its ability to effectively use fourth-generation productivity tools to provide information processing services. A case study of the use of these tools at Western Michigan University is presented. (Author/MLW)

  1. Towards the Development of Web-based Business intelligence Tools

    DEFF Research Database (Denmark)

    Georgiev, Lachezar; Tanev, Stoyan

    2011-01-01

    This paper focuses on using web search techniques in examining the co-creation strategies of technology driven firms. It does not focus on the co-creation results but describes the implementation of a software tool using data mining techniques to analyze the content on firms’ websites. The tool w...

  2. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Science.gov (United States)

    2010-07-01

    ... Safety, NFPA 101A) should be used to support the life safety equivalency evaluation. If fire modeling is... empirical tools should be used to support the life safety equivalency evaluation? 102-80.120 Section 102-80...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and...

  3. Development of micro machining tools for finishing weld joint

    International Nuclear Information System (INIS)

    GE, Hitachi and Toshiba are jointly constructing advanced boiling water reactor (ABWR) Units 6 and 7 at the Kashiwazaki-Kariwa Nuclear Power Station of Tokyo Electric Power Co. The ABWR features enhanced operability and safety as a whole plant through simplicity and improved performance. To achieve these improvements, as one of the key technical innovations adopted in the ABWR design, ten reactor internal pumps (RIPs) are adopted as the reactor recirculation system. The RIP casing, which holds the RIP and constitutes part of the primary pressure boundary together with the reactor pressure vessel (RPV), is welded to the nozzle on the RPV lower shell with gas tungsten arc welding (GTAW). The welding is performed on a V-groove using an automatic GTAW technique from the inside of the casing. The penetration bead (the back side of the weld) therefore needs to be finished with machining tools so that the qualification of the welding can be inspected. This paper summarizes the development of the special-purpose micro machines which are installed inside the narrow gap provided between the RIP casing and the RPV (skirt) to finish the penetration bead. (author)

  4. Analytic Materials

    CERN Document Server

    Milton, Graeme W

    2016-01-01

    The theory of inhomogeneous analytic materials is developed. These are materials where the coefficients entering the equations involve analytic functions. Three types of analytic materials are identified. The first two types involve an integer $p$. If $p$ takes its maximum value, then we have a complete analytic material; otherwise it is an incomplete analytic material of rank $p$. For two-dimensional materials, further progress can be made in the identification of analytic materials by using the well-known fact that a $90^\circ$ rotation applied to a divergence-free field in a simply connected domain yields a curl-free field, which can then be expressed as the gradient of a potential. Other exact results for the fields in inhomogeneous media are reviewed. Also reviewed is the subject of metamaterials, as these materials provide a way of realizing desirable coefficients in the equations.
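
    The two-dimensional fact invoked above can be written out explicitly; this is standard vector calculus, not material reproduced from the paper itself:

```latex
% In 2-D, a divergence-free field j rotated by 90 degrees becomes curl-free,
% hence a gradient (the domain is assumed simply connected, as stated above).
\nabla\!\cdot\mathbf{j}=0
\;\Longrightarrow\;
\mathbf{w}=R_\perp\,\mathbf{j},\quad
R_\perp=\begin{pmatrix}0 & -1\\ 1 & 0\end{pmatrix},\quad
\nabla\times\mathbf{w}=0
\;\Longrightarrow\;
\mathbf{w}=\nabla\varphi .
```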

  5. Big Data Analytics in Healthcare

    OpenAIRE

    Ashwin Belle; Raghuram Thiagarajan; S. M. Reza Soroushmehr; Fatemeh Navidi; Daniel A Beard; Kayvan Najarian

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is sti...

  6. Development of Genetic Tools for the Manipulation of the Planctomycetes

    Science.gov (United States)

    Rivas-Marín, Elena; Canosa, Inés; Santero, Eduardo; Devos, Damien P.

    2016-01-01

    Bacteria belonging to the Planctomycetes, Verrucomicrobia, Chlamydiae (PVC) superphylum are of interest for biotechnology, evolutionary cell biology, ecology, and human health. Some PVC species lack a number of typical bacterial features while others possess characteristics that are usually more associated to eukaryotes or archaea. For example, the Planctomycetes phylum is atypical for the absence of the FtsZ protein and for the presence of a developed endomembrane system. Studies of the cellular and molecular biology of these infrequent characteristics are currently limited due to the lack of genetic tools for most of the species. So far, genetic manipulation in Planctomycetes has been described in Planctopirus limnophila only. Here, we show a simple approach that allows mutagenesis by homologous recombination in three different planctomycetes species (i.e., Gemmata obscuriglobus, Gimesia maris, and Blastopirellula marina), in addition to P. limnophila, thus extending the repertoire of genetically modifiable organisms in this superphylum. Although the Planctomycetes show high resistance to most antibiotics, we have used kanamycin resistance genes in G. obscuriglobus, P. limnophila, and G. maris, and tetracycline resistance genes in B. marina, as markers for mutant selection. In all cases, plasmids were introduced in the strains by mating or electroporation, and the genetic modification was verified by Southern Blotting analysis. In addition, we show that the green fluorescent protein (gfp) is expressed in all four backgrounds from an Escherichia coli promoter. The genetic manipulation achievement in four phylogenetically diverse planctomycetes will enable molecular studies in these strains, and opens the door to developing genetic approaches not only in other planctomycetes but also other species of the superphylum, such as the Lentisphaerae. PMID:27379046

  8. Development of Tools and Techniques for Processing STORRM Flight Data

    Science.gov (United States)

    Robinson, Shane; D'Souza, Christopher

    2011-01-01

    While at JSC for the summer of 2011, I was assigned to work on the Sensor Test for Orion Relative Navigation Risk Mitigation (STORRM) development test objective (DTO). The STORRM DTO was flown on board Endeavour during STS-134. The objective of the STORRM DTO is to test the visual navigation system (VNS), which will be used as the primary relative navigation sensor for the Orion spacecraft. The VNS is a flash lidar system intended to provide both line-of-sight and range information during rendezvous and proximity operations. The STORRM DTO also serves as a testbed for the high-resolution docking camera, which will be used to provide piloting cues for the crew during proximity operations. These instruments were mounted next to the trajectory control sensor (TCS) in Endeavour's payload bay. My principal objective for the summer was to generate a best estimated trajectory (BET) for Endeavour using the flight data collected by the VNS during rendezvous and the unprecedented re-rendezvous with the ISS. I processed the raw images from the VNS to produce range and bearing measurements, aggregated these measurements, and extracted the measurements corresponding to individual reflectors. I combined the information contained in these measurements with data from Endeavour's inertial sensors using Kalman smoothing techniques to ultimately produce a BET. This work culminated with a final presentation of the results to division management. Development of this tool required that traditional linear smoothing techniques be modified in a novel fashion to permit the inclusion of nonlinear measurements. This internship has greatly helped me further my career by providing exposure to real engineering projects. I have also benefited immensely from the mentorship of the engineers working on these projects. Many of the lessons I learned and experiences I had are of particular value because they can only be found in a place like JSC.
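
    The abstract does not reproduce the smoothing equations; the sketch below is only a hedged illustration of the standard linearization that lets a nonlinear range/bearing measurement (such as a lidar return from a known reflector) enter an otherwise linear filter or smoother. The state, values, and function name are hypothetical.

    ```python
    import numpy as np

    def range_bearing_update(x, P, z, reflector, R):
        """EKF-style update of a 2D position state with a nonlinear
        range/bearing measurement to a known reflector location.

        x: state estimate [px, py]; P: 2x2 covariance;
        z: measurement [range, bearing]; R: 2x2 measurement noise.
        """
        dx, dy = reflector[0] - x[0], reflector[1] - x[1]
        r = np.hypot(dx, dy)
        h = np.array([r, np.arctan2(dy, dx)])        # predicted measurement
        # Jacobian of h with respect to the state (linearization point)
        H = np.array([[-dx / r,    -dy / r],
                      [ dy / r**2, -dx / r**2]])
        y = z - h                                    # innovation
        y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi  # wrap bearing residual
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
        return x + K @ y, (np.eye(2) - K @ H) @ P

    # Hypothetical single update: state near origin, reflector at (10, 5)
    x, P = np.array([0.0, 0.0]), np.eye(2)
    z = np.array([11.2, 0.46])                       # measured range, bearing
    x_new, P_new = range_bearing_update(x, P, z, (10.0, 5.0), 0.01 * np.eye(2))
    print(np.round(x_new, 3))
    ```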

  9. Determination of thin noble metal layers using laser ablation ICP-MS: An analytical tool for NobleChem technology

    International Nuclear Information System (INIS)

    Intergranular stress corrosion cracking (SCC) of reactor internals and recirculation piping is a matter of concern in boiling water reactors (BWR). SCC is basically an anodic dissolution of the metal grain boundaries if these are susceptible, either because of the failure to stress-relieve welds in un-stabilized steel, where the grain boundaries become depleted in chromium, or under irradiation, where migration of chromium and other impurities away from or to the grain boundaries renders them sensitive to dissolution. To mitigate SCC, the electrochemical corrosion potential (ECP) of the structural materials in the BWR environment needs to be lowered by reducing the concentration of the oxidants O2 and H2O2 through the injection of a sufficiently large amount of H2 into the feedwater (hydrogen water chemistry, HWC). This technique can be very effective, but it has the undesirable side effect of increasing the radiation level in the main steam by a factor of 4 to 5. NobleChem, developed and patented by General Electric Company, is a more effective method of achieving a low ECP value at lower hydrogen injection rates without the negative side effects of HWC. In this process noble metals (Pt, Rh) are injected into the feedwater (typically during the reactor shut-down), which then deposit on the structural component surfaces and on fuel. Noble metals are electrocatalysts that efficiently recombine O2 and H2O2 with H2 on the metal surface. With NobleChem/Low HWC, the component surface oxidant concentration becomes zero as soon as the bulk reactor water reaches a stoichiometric excess hydrogen condition. The SCC mitigation effectiveness of NobleChem is crucially dependent on achieving a sufficiently high noble metal concentration of ca. 0.1 μg/cm2 on the critical component and crack flank surfaces. In order to study and understand the transport, (re-)distribution and deposition behaviour of the noble metals in the reactor coolant circuit and to control the SCC mitigation effectiveness of NobleChem, analytical methods determining the local Pt and Rh

  10. An Approach to Building a Traceability Tool for Software Development

    Science.gov (United States)

    Delgado, Nelly; Watson, Tom

    1997-01-01

    specifications, design reports, and system code. Tracing helps 1) validate system features against the requirement specification, 2) identify error sources, and, most importantly, 3) manage change. With so many people involved in the development of a system, it becomes necessary to identify the reasons behind the design requirements or the implementation decisions. This paper is concerned with an approach that maps documents to constraints that capture properties of, and relationships between, the objects being modeled by the program. Section 2 provides the reader with a background on traceability tools. Section 3 gives a brief description of the context monitoring system on which the approach suggested in this paper is based. Section 4 presents an overview of our approach to providing traceability. The last section presents our future direction of research.

  11. Selection, Development and Results for The RESOLVE Regolith Volatiles Characterization Analytical System

    Science.gov (United States)

    Lueck, Dale E.; Captain, Janine E.; Gibson, Tracy L.; Peterson, Barbara V.; Berger, Cristina M.; Levine, Lanfang

    2008-01-01

    The RESOLVE project requires an analytical system to identify and quantitate the volatiles released from a lunar drill core sample as it is crushed and heated to 150 °C. The expected gases and their range of concentrations were used to assess gas chromatography (GC) and mass spectrometry (MS), along with specific analyzers, for use on this potential lunar lander. The ability of these systems to accurately quantitate water and hydrogen in an unknown matrix led to the selection of a small MEMS commercial process GC for use in this project. The modification, development, and testing of this instrument for the specific needs of the project are covered.

  12. Forming and actualization of cognitive motives as means for development of students' analytical thinking.

    Directory of Open Access Journals (Sweden)

    Shevchenko Svetlana Nikolaevna

    2011-10-01

    Different approaches to understanding the concepts of motivation and motive are considered, and the types of motives for educational activity are analyzed. It is established that cognitive motives are the most effective for developing students' analytical thinking. The study used test data from students in grades 1-4. An interconnection was found between students' level of academic achievement and their level of learning motivation. Directions for forming and maintaining students' cognitive motives in the learning process are identified. It is established that the formation and activation of students' cognitive motivation are affected by the content of the educational material, the organization of training activities, and the style of teaching. Each component supports the motivational engagement of students in their studies.

  13. Development of an analytical model to assess fuel property effects on combustor performance

    Science.gov (United States)

    Sutton, R. D.; Troth, D. L.; Miles, G. A.; Riddlebaugh, S. M.

    1987-01-01

    A generalized first-order computer model has been developed to analytically evaluate the potential effects of alternative fuels on gas turbine combustors. The model assesses the size, configuration, combustion reliability, and durability of the combustors required to meet performance and emission standards while operating on a broad range of fuels. Predictions predicated on combustor flow-field determinations by the model indicate that fuel chemistry, as defined by hydrogen content, exerts a significant influence on flame radiation, liner wall temperature, and smoke emission.

  14. Development of an analytical method for estimating the composition of NOx gas using ion chromatography

    International Nuclear Information System (INIS)

    The solvent of choice for reprocessing spent nuclear fuel by the aqueous route is nitric acid. Hence the presence of NOx gas in all the off-gas streams is inevitable. Estimating the composition of these gases is very important for evaluating the reaction mechanism of the dissolution step. This article briefly explains an analytical method developed for estimating the composition of NOx gas by ion chromatography during the reaction between sodium nitrate and nitric acid, which can be extended, with the necessary changes, to reprocessing applications in the PUREX dissolver system. (author)

  15. The Role of Dafachronic Acid Signaling in Development and Longevity in Caenorhabditis elegans: Digging Deeper Using Cutting-Edge Analytical Chemistry

    Directory of Open Access Journals (Sweden)

    Hugo Aguilaniu

    2016-02-01

    Steroid hormones regulate physiological processes in species ranging from plants to humans. A wide range of steroid hormones exist, and their contributions to processes such as growth, reproduction, development, and aging are almost always complex. Understanding the biosynthetic pathways that generate steroid hormones and the signaling pathways that mediate their effects is thus of fundamental importance. In this work, we review recent advances in (i) the biological role of steroid hormones in the roundworm Caenorhabditis elegans and (ii) the development of novel methods to facilitate the detection and identification of these molecules. Our current understanding of steroid signaling in this simple organism serves to illustrate the challenges we face moving forward. First, it seems clear that we have not yet identified all of the enzymes responsible for steroid biosynthesis and/or degradation. Second, perturbation of steroid signaling affects a wide range of phenotypes, and subtly different steroid molecules can have distinct effects. Finally, steroid hormone levels are critically important, and minute variations in quantity can profoundly impact a phenotype. Thus, it is imperative that we develop innovative analytical tools and combine them with cutting-edge approaches such as comprehensive and highly selective liquid chromatography coupled to mass spectrometry (LC-MS), or new methods such as supercritical fluid chromatography coupled to mass spectrometry (SFC-MS), if we are to obtain a better understanding of the biological functions of steroid signaling.

  16. Progress report on the development of remotely operated tools

    International Nuclear Information System (INIS)

    This report contains a number of individual trials reports based upon work conducted in aid of a programme of feasibility studies into the size reduction of radioactively contaminated solid waste. The work was directed towards the identification of acceptable remotely operated tools and the means of deploying them for dismantling operations in a radioactive environment. Reliability, ease of maintenance, change of tool bits and common power sources have been major considerations in the trials assessments. Alternative end-effector drive systems have also been considered when defining suitable manipulative capabilities, and attention has also been directed towards a remotely controlled tool-changing capability. (author)

  17. Doing social media analytics

    Directory of Open Access Journals (Sweden)

    Phillip Brooker

    2016-07-01

    In the few years since the advent of ‘Big Data’ research, social media analytics has begun to accumulate studies drawing on social media as a resource and tool for research work. Yet, there has been relatively little attention paid to the development of methodologies for handling this kind of data. The few works that exist in this area often reflect upon the implications of ‘grand’ social science methodological concepts for new social media research (i.e., they focus on general issues such as sampling, data validity, ethics, etc.). By contrast, we advance an abductively oriented methodological suite designed to explore the construction of phenomena played out through social media. To do this, we use a software tool, Chorus, to illustrate a visual analytic approach to data. Informed by visual analytic principles, we posit a two-by-two methodological model of social media analytics, combining two data collection strategies with two analytic modes. We go on to demonstrate each of these four approaches ‘in action’, to help clarify how and why they might be used to address various research questions.

  18. Implementation of a log file analysis tool for ATM dispenser mechanism actions

    Institute of Scientific and Technical Information of China (English)

    何惠英; 李纪红; 俞妍; 沈虹

    2013-01-01

    To reduce the skill demands placed on the general maintenance personnel of bank self-service teller machines, a Chinese-language log file analysis tool for ATM dispenser mechanism operation records was developed in the Python programming language. Analysis of roughly 500 GB of production-environment data from nearly 20,000 machines shows that a parsing tool written with the method described in this article can report a machine's operating performance quickly, accurately, and in detail, and can be used to diagnose and clear machine faults.
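
    The paper's log format and code are not given in the record; the sketch below only illustrates, under assumed field names and layout, the general shape of such a Python log parser: match timestamped mechanism-action records and aggregate per-action counts, failures, and durations.

    ```python
    import re
    from collections import defaultdict

    # Hypothetical log line layout:
    # "2013-01-05 10:32:01.250 ACTION=FEED_NOTE STATUS=OK DUR_MS=118"
    LINE_RE = re.compile(
        r"^(?P<ts>\S+ \S+) ACTION=(?P<action>\w+) STATUS=(?P<status>\w+) DUR_MS=(?P<dur>\d+)"
    )

    def summarize(log_lines, slow_ms=500):
        """Aggregate per-action counts, failures, slow events, and mean duration."""
        stats = defaultdict(lambda: {"n": 0, "fail": 0, "slow": 0, "dur_total": 0})
        for line in log_lines:
            m = LINE_RE.match(line)
            if not m:
                continue  # skip lines that are not action records
            s = stats[m["action"]]
            s["n"] += 1
            s["dur_total"] += int(m["dur"])
            if m["status"] != "OK":
                s["fail"] += 1
            if int(m["dur"]) > slow_ms:
                s["slow"] += 1
        return {a: {**s, "dur_mean": s["dur_total"] / s["n"]} for a, s in stats.items()}

    sample = [
        "2013-01-05 10:32:01.250 ACTION=FEED_NOTE STATUS=OK DUR_MS=118",
        "2013-01-05 10:32:02.000 ACTION=FEED_NOTE STATUS=JAM DUR_MS=903",
    ]
    print(summarize(sample))
    ```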

  19. Development of a climate data analysis tool (CDAT)

    Energy Technology Data Exchange (ETDEWEB)

    Marlais, S.M.

    1997-09-01

    The Climate Data Analysis Tool (CDAT) is designed to provide the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, California, with the capabilities needed to analyze model data with little effort on the part of the scientist, while performing complex mathematical calculations and graphically displaying the results. This computer software will meet the demanding needs of climate scientists by providing the necessary tools to diagnose, validate, and intercompare large observational and global climate model datasets.
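
    CDAT's own API is not shown in the record; as a hedged stand-in for the kind of diagnostic it automates, here is a minimal climatology-and-anomaly calculation in plain numpy. The array shapes and grid are invented for illustration.

    ```python
    import numpy as np

    # Hypothetical monthly-mean temperature series: (years, 12, lat, lon)
    rng = np.random.default_rng(0)
    temp = 288.0 + rng.normal(0.0, 1.5, size=(30, 12, 45, 72))

    climatology = temp.mean(axis=0)      # 12-month mean state
    anomalies = temp - climatology       # departure of each month from it

    # Area-weighted global mean anomaly (weight by cos(latitude))
    lats = np.linspace(-88.0, 88.0, 45)
    w = np.cos(np.deg2rad(lats))[None, None, :, None]
    global_anom = (anomalies * w).sum(axis=(2, 3)) / (w.sum() * 72)
    print(global_anom.shape)             # (30, 12): one value per month
    ```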

  20. Development of Time Management Tools for Project Oriented Engineering Education

    OpenAIRE

    Fabrés i Anglí, Josep

    2008-01-01

    This thesis adapts project time management tools and techniques to project-oriented engineering education. First, the importance of project time management is studied, starting from the book “A Guide to the Project Management Body of Knowledge” (PMBOK® Guide). The increasing implementation of project-oriented engineering education in universities is also investigated. To adapt the project time management tools and techniques, there are defined the s...

  1. Bayesian meta-analytical methods to incorporate multiple surrogate endpoints in drug development process.

    Science.gov (United States)

    Bujkiewicz, Sylwia; Thompson, John R; Riley, Richard D; Abrams, Keith R

    2016-03-30

    A number of meta-analytical methods have been proposed that aim to evaluate surrogate endpoints. Bivariate meta-analytical methods can be used to predict the treatment effect for the final outcome from the treatment effect estimate measured on the surrogate endpoint, while taking into account the uncertainty around the effect estimate for the surrogate endpoint. In this paper, extensions to multivariate models are developed, aiming to include multiple surrogate endpoints with the potential benefit of reducing the uncertainty when making predictions. In this Bayesian multivariate meta-analytic framework, the between-study variability is modelled as a product of normal univariate distributions. This formulation is particularly convenient for including multiple surrogate endpoints and flexible for modelling the outcomes, which can be surrogate endpoints to the final outcome and potentially to one another. Two models are proposed: first, using an unstructured between-study covariance matrix, assuming the treatment effects on all outcomes are correlated, and second, using a structured between-study covariance matrix, assuming treatment effects on some of the outcomes are conditionally independent. While the two models are developed for summary data on a study level, the individual-level association is taken into account by the use of Prentice's criteria (obtained from individual patient data) to inform the within-study correlations in the models. The modelling techniques are investigated using an example in relapsing remitting multiple sclerosis, where disability worsening is the final outcome, while relapse rate and MRI lesions are potential surrogates for disability progression. PMID:26530518
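
    A hedged sketch of the between-study formulation named in the abstract (a product of normal univariate distributions), written here for three outcomes with generic symbols; the exact parameterization in the paper may differ:

    ```latex
    % Between-study model as a product of univariate normals, three outcomes:
    % delta_1, delta_2 = effects on surrogates, delta_3 = effect on final outcome.
    \[
    f(\delta_1, \delta_2, \delta_3)
      = f(\delta_1)\, f(\delta_2 \mid \delta_1)\, f(\delta_3 \mid \delta_1, \delta_2),
    \]
    \[
    \delta_1 \sim N(\eta_1, \psi_1^2), \quad
    \delta_2 \mid \delta_1 \sim N(\eta_2 + \lambda_{21}\delta_1,\; \psi_2^2), \quad
    \delta_3 \mid \delta_1, \delta_2 \sim N(\eta_3 + \lambda_{31}\delta_1 + \lambda_{32}\delta_2,\; \psi_3^2).
    \]
    % Leaving all slopes lambda free corresponds to an unstructured between-study
    % covariance; fixing some to zero encodes the conditional independence of the
    % structured variant.
    ```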

  2. Development of a multi-residue analytical method for TBBP-A and PBDEs in various biological matrices using unique reduced size sample

    Energy Technology Data Exchange (ETDEWEB)

    Andre, F.; Cariou, R.; Antignac, J.P.; Le Bizec, B. [Ecole Nationale Veterinaire de Nantes (FR). Laboratoire d' Etudes des Residus et Contaminants dans les Aliments (LABERCA); Debrauwer, L.; Zalko, D. [Institut National de Recherches Agronomiques (INRA), 31-Toulouse (France). UMR 1089 Xenobiotiques

    2004-09-15

    The impact of brominated flame retardants on the environment and their potential risk to animal and human health are a current concern for the scientific community. Numerous studies related to the detection of tetrabromobisphenol A (TBBP-A) and polybrominated diphenylethers (PBDEs) have been developed over the last few years; they were mainly based on GC-ECD, GC-NCI-MS or GC-EI-HRMS, and recently GC-EI-MS/MS. The sample treatment is usually derived from the analytical methods used for dioxins, but recently some authors have proposed the use of solid phase extraction (SPE) cartridges. In this study, a new analytical strategy is presented for the multi-residue analysis of TBBP-A and PBDEs from a single reduced-size sample. The main objective of this analytical development is its application to the background exposure assessment of French population groups to brominated flame retardants, for which, to our knowledge, no data exist. A second objective is to provide an efficient analytical tool to study the transfer of these contaminants through the environment to living organisms, including degradation reactions and metabolic biotransformations.

  3. Modeling decision making as a support tool for policy making on renewable energy development

    International Nuclear Information System (INIS)

    This paper presents the findings of a study on decision-making models for the analysis of capital-risk investors' preferences on biomass power plant projects. The aim of the work is to improve the support tools for policy makers in the field of renewable energy development. The Analytic Network Process (ANP) helps to better understand capital-risk investors' preferences towards different kinds of biomass-fueled power plants. The results of the research allow public administration to better foresee investors' reactions to the incentive system, or to modify the incentive system to better drive investors' decisions. Changing the incentive system is seen as a major risk by investors. Therefore, public administration must design better and longer-term incentive systems, forecasting market reactions. For that, two scenarios have been designed, one showing a typical decision-making process and another proposing an improved decision-making scenario. A case study conducted in Italy has revealed that ANP allows understanding of how capital-risk investors interpret the situation and make decisions when investing in biomass power plants; the differences between the interests of the public administration and the promoters; how decision making could be influenced by adding new decision criteria; and which case would be ranked best according to the decision models. - Highlights: • We applied ANP to investors' preferences on biomass power plant projects. • The aim is to improve the advising tools for renewable energy policy making. • A case study has been carried out with the help of two experts. • We designed two scenarios: decision making as it is and how it could be improved. • Results prove ANP is a fruitful tool enhancing participation and transparency
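
    The record gives no numerical model, so the sketch below shows only the priority calculation that ANP shares with AHP: the principal eigenvector of a pairwise-comparison matrix. The criteria and judgments are hypothetical.

    ```python
    import numpy as np

    # Hypothetical pairwise comparisons (Saaty scale) for three criteria,
    # e.g. incentive stability vs. fuel supply risk vs. plant profitability.
    A = np.array([
        [1.0, 3.0, 0.5],
        [1/3, 1.0, 0.25],
        [2.0, 4.0, 1.0],
    ])

    # Priorities = principal right eigenvector, normalized to sum to 1.
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()

    # Consistency ratio guards against incoherent judgments (RI for n=3 is 0.58).
    ci = (vals.real[k] - len(A)) / (len(A) - 1)
    print("priorities:", np.round(w, 3), "CR:", round(ci / 0.58, 3))
    ```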

  4. PTaaS: Platform for Providing Software Developing Applications and Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

    technological support for it that is not limited to one specific tool and a particular phase of the software development life cycle. In this thesis, we have explored the possibility of offering software development applications and tools as services that can be acquired on demand according to the software...... with process. Information gained from the review of literature on GSD tools and processes is used to extract functional requirements for the middleware platform for provisioning of software development applications and tools as services. Findings from the review of literature on architecture solutions for cloud...... client application that acts as a bridge between software development tools and the middleware platform.

  5. Nanopeptamers for the development of small-analyte lateral flow tests with a positive readout.

    Science.gov (United States)

    Vanrell, Lucía; Gonzalez-Techera, Andrés; Hammock, Bruce D; Gonzalez-Sapienza, Gualberto

    2013-01-15

    There is a great demand for rapid tests that can be used on-site for the detection of small analytes, such as pesticides, persistent organic pollutants, explosives, toxins, medicinal and abused drugs, hormones, etc. Dipsticks and lateral flow devices, which are simple and provide a visual readout, may be the answer, but the available technology for these compounds requires a competitive format that loses sensitivity and produces readings inversely proportional to the analyte concentration, which is counterintuitive and may lead to potential misinterpretation of the result. In this work, protein-multipeptide constructs composed of anti-immunocomplex peptides selected from phage libraries and streptavidin/avidin as core protein were used for direct detection of small compounds in a noncompetitive two-site immunoassay format that performs with increased sensitivity and positive readout. These constructs that we termed "nanopeptamers" allow the development of rapid point-of-use tests with a positive visual end point of easy interpretation. As proof of concept, lateral flow assays for the herbicides molinate and clomazone were developed and their performance was characterized with field samples.

  6. Conceptual air sparging decision tool in support of the development of an air sparging optimization decision tool

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    The enclosed document describes a conceptual decision tool (hereinafter, Tool) for determining the applicability of, and for optimizing, air sparging systems. The Tool was developed by a multi-disciplinary team of internationally recognized experts in air sparging technology, led by a group of project and task managers at Parsons Engineering Science, Inc. (Parsons ES). The team included Mr. Douglas Downey and Dr. Robert Hinchee of Parsons ES, Dr. Paul Johnson of Arizona State University, Dr. Richard Johnson of Oregon Graduate Institute, and Mr. Michael Marley of Envirogen, Inc. User Community Panel Review was coordinated by Dr. Robert Siegrist of Colorado School of Mines (also of Oak Ridge National Laboratory) and Dr. Thomas Brouns of Battelle/Pacific Northwest Laboratory. The Tool is intended to provide guidance to field practitioners and environmental managers for evaluating the applicability and optimization of air sparging as a remedial action technique.

  7. Use of the analytical tree technique to develop a radiological protection program

    International Nuclear Information System (INIS)

    The results obtained by the Cuban Center for Radiological Protection and Hygiene in using an analytical tree technique to develop its general operational radiation protection program are presented. Through the application of this method, factors such as the organization of the radiation protection services, the provision of administrative requirements, the existing general laboratory requirements, the availability of resources, and the current documentation were evaluated. Main components considered included: complete normative and regulatory documentation; automated radiological protection data management; the scope of on-the-job and radiological protection training for the personnel; prior radiological appraisal of the safety performance of the work; and the application of dose constraints for the personnel and the public. The detailed development of the program allowed the identification of the basic aims to be achieved in its maintenance and improvement. (authors). 3 refs

  8. A Methodology to Develop Design Support Tools for Stand-alone Photovoltaic Systems in Developing Countries

    Directory of Open Access Journals (Sweden)

    Stefano Mandelli

    2014-08-01

    As pointed out in several analyses, Stand-Alone Photovoltaic systems may be a relevant option for rural electrification in Developing Countries. In this context, Micro and Small Enterprises which supply customized Stand-Alone Photovoltaic systems play a pivotal role in the last-mile distribution of this technology. Nevertheless, a number of issues limit the development of these enterprises, also curbing potential spin-off benefits. A common business bottleneck is the lack of technical skills, since usually few people have the expertise to design systems and formulate estimates for customers. The long-term solution to this issue implies the implementation of a capacity-building process, but this solution rarely matches the time-to-market urgency of local enterprises. Therefore, we propose in this study a simple but general methodology which can be used to set up Design Support Tools for Micro and Small Enterprises that supply Stand-Alone Photovoltaic systems in rural areas of Developing Countries. After a brief review of the techniques and commercial software available to design the targeted technology, we describe the methodology, highlighting the structure, the sizing equations, and the main features that should be considered in developing a Design Support Tool. Then, we apply the methodology to set up a tool for use in Uganda and we compare the results with two commercial codes (NSolVx and HOMER). The results show that the implemented Design Support Tool develops correct system designs and presents some advantages for dissemination in rural areas. Indeed, it supports the user in providing the input data, selecting the main system components, and delivering estimates to customers.
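
    The paper's sizing equations are not reproduced in the record; the sketch below shows the textbook first-pass sizing a Design Support Tool of this kind typically wraps (array watt-peak from daily load and peak sun hours, battery capacity from autonomy days and depth of discharge). All names and figures are illustrative assumptions.

    ```python
    def size_sapv(daily_load_wh, psh, autonomy_days,
                  system_v=24.0, derate=0.7, dod=0.5, batt_eff=0.85):
        """First-pass sizing of a stand-alone PV system.

        daily_load_wh: average daily consumption [Wh/day]
        psh: peak sun hours at the site [h/day]
        autonomy_days: days the battery must carry the load without sun
        """
        # Array size: energy demand divided by sun resource and system losses.
        array_wp = daily_load_wh / (psh * derate)
        # Battery bank: autonomy energy, corrected for usable depth of
        # discharge and round-trip efficiency, expressed in Ah at system_v.
        batt_wh = daily_load_wh * autonomy_days / (dod * batt_eff)
        return array_wp, batt_wh / system_v

    # Hypothetical rural clinic: 1.8 kWh/day, 5 peak sun hours, 2 days autonomy.
    wp, ah = size_sapv(1800, 5.0, 2)
    print(f"array ~ {wp:.0f} Wp, battery ~ {ah:.0f} Ah @ 24 V")
    ```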

  9. Developing a temperature sensitive tool for studying spin dissipation

    Science.gov (United States)

    Wickey, Kurtis Jon

    Measuring the thermodynamic properties of nanoscale structures is becoming increasingly important as heterostructures and devices shrink in size. For example, recent discoveries of spin thermal effects such as the spin Seebeck and spin Peltier effects show that thermal gradients can manipulate spin systems and vice versa. However, the relevant interactions occur within a spin diffusion length of a spin-active interface, making study of these spin thermal effects challenging. In addition, recent ferromagnetic resonance studies of spatially confined nanomagnets have shown unique magnon modes in arrays and lines which may give rise to unique magnon-phonon interactions. In this case, the small volume of magnetic material presents a challenge to measurement; as a result, the bulk of the work is done on arrays, with measurements of the magnetization of individual particles possible through various microscopies but with limited access to thermal properties. As a result, tools capable of measuring the thermal properties of nanoscale structures are required to fully explore this emerging science. One approach to addressing this challenge is the use of microscale suspended platforms that maximize their sensitivity to these spin thermal interactions through thermal isolation from their surroundings. Combining this thermal decoupling with sensitive thermometry allows for the measurement of nanojoule heat accumulations, such as those resulting from the small heat flows associated with spin transport and spin relaxation. As these heat flows may manifest themselves in a variety of spin-thermal effects, the development of measurement platforms that can be tailored to optimize their sensitivity to specific thermal measurements is essential. To address these needs, I have fabricated thermally isolated platforms using a unique focused ion beam (FIB) machining process that allows for flexible geometries as well as a wide choice of material systems. The thermal characteristics of these platforms were

  10. Development of analytical theory of the physical libration for a two-layer Moon

    Science.gov (United States)

    Petrova, Natalia; Barkin, Yurii; Gusev, Alexander; Ivanova, Tamara

    2010-05-01

    The investigation is being carried out in the framework of a Russian-Japanese grant and is directed at supporting the future observations of the ILOM project, which is planned for the end of the second decade. The analytical theory is of scientific interest in its own right and can be useful as a basis for future lunar almanacs, as a clue to the lunar interior and to processes inside the lunar body. A comparison of the analytical libration theory with new observations will allow refinement of the parameters of the lunar interior: the existence or absence of a core, its size, composition and state of aggregation, Love numbers, the quality parameter Q, etc. In contrast to the usual application of numerical libration models for the analysis of observations, the analytical model is able to predict new harmonics, previously unknown and not observed (owing to the insufficient accuracy of observations) in libration observation series. As part of the investigation the following results were achieved. The analytical theory of the Lunar Physical Libration (LPhL) was developed using the Poisson Series Processor (PSP). The base solution is realized for the 'main problem' of the LPhL in view of the 4th harmonic of the selenopotential. Data on the dynamical figure of the Moon are incorporated in the theory on the basis of new observations of the lunar gravitational field, received in the framework of the space projects Clementine (1994, NASA), Lunar Prospector (1999, NASA) and SELENE (2007-2009, Japan). On the basis of the constructed theory the following actions were done: 1) analysis of the present dynamical models; 2) modeling of star trajectories in the field of view of the future optical telescope, which is planned to be placed at one of the lunar poles in the second stage of the Japanese project SELENE-B - ILOM. Results of modeling have shown the possibility of determining LPhL parameters with the desired accuracy of 0.001 arc seconds planned in the ILOM

  11. Model Analytical Development for Physical, Chemical, and Biological Characterization of Momordica charantia Vegetable Drug.

    Science.gov (United States)

    Brandão, Deysiane Oliveira; Guimarães, Geovani Pereira; Santos, Ravely Lucena; Júnior, Fernando José de Lima Ramos; da Silva, Karla Monik Alves; de Souza, Fabio Santos; Macêdo, Rui Oliveira

    2016-01-01

    Momordica charantia is a species cultivated throughout the world and widely used in folk medicine, and its medicinal benefits are well documented, especially its pharmacological properties, including antimicrobial activities. Analytical methods have been used to aid in the characterization of compounds derived from plant drug extracts and their products. This paper developed a methodological model to evaluate the integrity of the vegetable drug M. charantia in different particle sizes, using different analytical methods. M. charantia was collected in the semiarid region of Paraíba, Brazil. The herbal medicine raw material derived from the leaves and fruits in different particle sizes was analyzed using thermoanalytical techniques such as thermogravimetry (TG) and differential thermal analysis (DTA), pyrolysis coupled to gas chromatography/mass spectrometry (PYR-GC/MS), and nuclear magnetic resonance ((1)H NMR), in addition to the determination of antimicrobial activity. The different particle surface areas among the samples were differentiated by the techniques. DTA and TG were used for assessing thermal and kinetic parameters, and PYR-GC/MS was used for chromatographic identification of degradation products through the pyrograms. The infusions obtained from the fruit and leaves of Momordica charantia presented antimicrobial activity. PMID:27579215

  12. Development of Analytical Approach to Evaluate (DiffServ-MIPv6) Scheme

    Directory of Open Access Journals (Sweden)

    Loay F. Hussien

    2014-03-01

    The aspiration of Mobile IPv6 is to provide uninterrupted network connectivity while the mobile node is moving between different access points or domains. Nonetheless, it does not provide QoS guarantees to its users, any more than the traditional Internet Protocol (IP) does; it can merely provide Best-Effort (BE) service to all its applications, regardless of the application requirements. Future wireless networks will be based on IPv6 to provide services to mobile Internet users. Hence, one of the main requirements of next-generation IP-based networks is providing QoS for the real-time traffic that will be transported through MIPv6 networks. This study presents the analytical analysis of the previously proposed scheme (DiffServ-MIPv6), which applies the DiffServ platform to a Mobile IPv6 network in order to meet the needs of both QoS guarantees and mobility in communication. The analytical evaluation is developed to assess the performance of the proposed scheme (DiffServ-MIPv6) compared to the standard MIPv6 protocol in terms of signaling cost. The signaling cost is measured against two factors: session-to-mobility ratio and packet arrival rate.
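
    The record does not reproduce the cost model, so the sketch below only illustrates the generic shape such signaling-cost analyses take: a mobility-signaling term amortized by the session-to-mobility ratio plus a delivery term growing with traffic. The formula and constants are assumptions, not the paper's.

    ```python
    def total_signaling_cost(smr, pkt_rate, c_bu=12.0, c_pkt=1.0):
        """Illustrative per-session cost: binding-update (mobility) signaling
        plus packet delivery. smr = session-to-mobility ratio (sessions per
        handover); pkt_rate = packets per session. Constants are arbitrary
        per-message costs, not values from the paper.
        """
        mobility_cost = c_bu / smr          # handovers amortized over sessions
        delivery_cost = c_pkt * pkt_rate    # grows with session traffic
        return mobility_cost + delivery_cost

    for smr in (0.5, 1.0, 2.0, 5.0):
        print(smr, total_signaling_cost(smr, pkt_rate=50))
    ```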

  13. Descriptors of Modular Formation of Accounting and Analytical Cluster in Innovation Development of Agricultural Holdings

    Directory of Open Access Journals (Sweden)

    Zhanna Degaltseva

    2015-11-01

    In the context of the division of accounting into financial, taxation, management, and statistical accounting, the problem arises of improving the relationships among them. The accounting and analytical cluster plays the role of a correlating factor in the relationship between the subdivisions of the agricultural holding's property. To improve its work, a modular principle for building it, based on information technology, was introduced. Practical implementation of the modular accounting and analytical cluster revealed its shortcomings: each type of accounting and each production unit used its own physical and cost parameters, and the number and nature of these parameters differed. To eliminate these shortcomings in the information support of managers and specialists of the agricultural holding, we attempt to develop a methodology for establishing a rational number of descriptors (binding parameters) for each module. The proposed descriptors are designed on the basis of a valid methodological approach to their calculation and on legal support. The proposed method allowed the limiting of asymmetric information in all kinds of accounting, improving its quality, and bringing a synergistic effect from the scale and structure of the use of the agricultural holding's property complex.

  15. Cognitive and Social Constructivism: Developing Tools for an Effective Classroom

    Science.gov (United States)

    Powell, Katherine C.; Kalina, Cody J.

    2009-01-01

    An effective classroom, where teachers and students are communicating optimally, is dependent on using constructivist strategies, tools and practices. There are two major types of constructivism in the classroom: (1) cognitive or individual constructivism, based on Piaget's theory, and (2) social constructivism, based on Vygotsky's theory.…

  16. INL Review of Fueling Machine Inspection Tool Development Proposal

    Energy Technology Data Exchange (ETDEWEB)

    Griffith, George [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-03-01

    A review of a technical proposal for James Fischer Nuclear. The document describes an inspection tool for examining the graphite moderator in an AGR reactor: an optical system for inspecting the graphite blocks for cracks. INL reviewed the document for technical value.

  17. The development of a partnering assessment tool for projects

    NARCIS (Netherlands)

    Holkers, A.; Voordijk, J.T.; Greenwood, D.

    2008-01-01

    Many firms in the construction industry claim to be working in a ‘partnering’ or even in an ‘integrated’ way. It is, however, very difficult to verify these claims with the tools currently available. The purpose of this study was to collect and refine existing work on integrative and collaborative w

  18. Developing a MATLAB®-Based Tool for Visualization and Transformation

    Science.gov (United States)

    Anderton, Blake J.

    2003-01-01

    An important step in the structural design and development of spacecraft is the experimental identification of a structure's modal characteristics, such as its natural frequencies and modes of vibration. These characteristics are vital to developing a representative model of any given structure and to analyzing the range of input frequencies that can be handled by a particular structure. When setting up such a representative model of a structure, careful measurements using precision equipment (such as accelerometers and instrumented hammers) must be made on many individual points of the structure in question. The coordinate location of each data point is used to construct a wireframe geometric model of the structure. Response measurements obtained from the accelerometers are used to generate the modal shapes of the particular structure. Graphically, this is displayed as a combination of the ways a structure will ideally respond to a specified force input. Two types of models of the tested structure are often used in modal analysis: an analytic model showing expected behavior of the structure, and an experimental model showing measured results due to observed phenomena. To evaluate the results from the experimental model, a comparison of analytic and experimental results must be made between the two models. However, comparisons between these two models become difficult when the two coordinate orientations differ in a manner such that results are displayed in an unclear fashion. Such a problem suggests the need for a tool that not only communicates a graphical image of a structure's wireframe geometry based on various measurement locations (called nodes), but also allows for a transformation of the image's coordinate geometry so that a model's coordinate orientation is made to match the orientation of another model. Such a tool should also be designed so that it is able to construct coordinate geometry based on many different listings of node locations and is able
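
    As a hedged illustration of the transformation step described above (realigning one model's coordinate orientation with another's), the sketch below rotates a wireframe's node coordinates with a rotation matrix; Python stands in for the MATLAB implementation, and the node list and angle are hypothetical.

    ```python
    import numpy as np

    def rotz(deg):
        """Rotation matrix about the z-axis (right-handed, degrees)."""
        c, s = np.cos(np.deg2rad(deg)), np.sin(np.deg2rad(deg))
        return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

    # Hypothetical wireframe: node id -> (x, y, z) measurement location.
    nodes = {1: (0.0, 0.0, 0.0), 2: (1.0, 0.0, 0.0), 3: (1.0, 2.0, 0.0)}
    xyz = np.array(list(nodes.values()))

    # Align the experimental model with the analytic one: here the analytic
    # model's x-axis is assumed to sit 90 degrees from the test article's.
    aligned = xyz @ rotz(90.0).T
    for nid, p in zip(nodes, aligned):
        print(nid, np.round(p, 3))
    ```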

  19. Development of MWL-AUC / CCD-C-AUC / SLS-AUC detectors for the analytical ultracentrifuge

    OpenAIRE

    Karabudak, Engin

    2009-01-01

    Analytical ultracentrifugation (AUC) has made an important contribution to polymer and particle characterization since its invention by Svedberg (Svedberg and Nichols 1923; Svedberg and Pedersen 1940) in 1923. In 1926, Svedberg won the Nobel Prize for his scientific work on disperse systems, including work with AUC. The first important discovery made with AUC was to show the existence of macromolecules. Since that time AUC has become an important tool to study polymers in biophysics and b...

  20. Residue-specific radioimmunoanalysis: a novel analytical tool. Application to the C-terminus of CCK/gastrin peptides

    Energy Technology Data Exchange (ETDEWEB)

    Rehfeld, J.F. (Rigshospitalet, Copenhagen (Denmark)); Morley, J.S. (Imperial Chemical Industries Ltd., Alderley Park (UK). Pharmaceutical Div.)

    1983-02-01

    Five antisera directed against the common bioactive C-terminal tetrapeptide sequence of cholecystokinin (CCK) and gastrin were examined with respect to the significance of each residue for antibody binding. Systematic substitutions and/or derivatizations of each of the four residues showed a unique pattern for each antiserum, although they were raised against the same antigen and have the same sequence-specificity. The pattern of reactivity towards the related cardioexcitatory FMRFamide peptide and its analogues confirmed the residue specificity of the antisera. While it is well known that even small covalent modifications of the antigen can influence antibody binding profoundly, the great variations in the significance of each residue among randomly selected antisera raised against the same antigen and specific for the same sequence have not been known so far. Hence, by appropriate combination of antisera, their different residue specificities can be used for the detection of amino acid substitutions or modifications. Such immunochemical sequence analysis requires only femto- or picomolar amounts of peptides, which need not necessarily be purified. Thus, residue-specific immunoanalysis may be a versatile tool in studies of species differences, phylogenesis and the synthesis of peptides.

  1. Rethinking the Role of Information Technology-Based Research Tools in Students' Development of Scientific Literacy

    Science.gov (United States)

    van Eijck, Michiel; Roth, Wolff-Michael

    2007-01-01

    Given the central place IT-based research tools take in scientific research, the marginal role such tools currently play in science curricula is dissatisfying from the perspective of making students scientifically literate. To appropriately frame the role of IT-based research tools in science curricula, we propose a framework that is developed to…

  2. Improving students’ understanding of quantum measurement. II. Development of research-based learning tools

    Directory of Open Access Journals (Sweden)

    Guangtian Zhu1,2

    2012-04-01

    We describe the development and implementation of research-based learning tools such as the Quantum Interactive Learning Tutorials and peer-instruction tools to reduce students' common difficulties with issues related to measurement in quantum mechanics. A preliminary evaluation shows that these learning tools are effective in improving students' understanding of concepts related to quantum measurement.

  3. Improving Students' Understanding of Quantum Measurement Part 2: Development of Research-based Learning Tools

    CERN Document Server

    Zhu, Guangtian

    2016-01-01

    We describe the development and implementation of research-based learning tools such as the Quantum Interactive Learning Tutorials (QuILTs) and peer instruction tools to reduce students' common difficulties with issues related to measurement in quantum mechanics. A preliminary evaluation shows that these learning tools are effective in improving students' understanding of concepts related to quantum measurement.

  4. An Accelerated Analytical Process for the Development of STR Profiles for Casework Samples.

    Science.gov (United States)

    Laurin, Nancy; Frégeau, Chantal J

    2015-07-01

    Significant efforts are being devoted to the development of methods enabling rapid generation of short tandem repeat (STR) profiles in order to reduce turnaround times for the delivery of human identification results from biological evidence. Some of the proposed solutions are still costly and low throughput. This study describes the optimization of an analytical process enabling the generation of complete STR profiles (single-source or mixed profiles) for human identification in approximately 5 h. This accelerated process uses currently available reagents and standard laboratory equipment. It includes a 30-min lysis step, a 27-min DNA extraction using the Promega Maxwell® 16 System, DNA quantification, and separation of profiles on the 3500-series Genetic Analyzer. This combination of fast individual steps produces high-quality profiling results and offers a cost-effective alternative approach to rapid DNA analysis. PMID:25782346

  5. Analytical developments in thermal ionization mass spectrometry for the isotopic analysis of very small amounts

    International Nuclear Information System (INIS)

    In the framework of the French transmutation project for nuclear wastes, experiments consisted of the irradiation in a fast neutron reactor of a few milligrams of isotopically enriched powders. Hence, the isotopic analysis of very small amounts of irradiation products is one of the main issues. The aim of this study was to carry out analytical developments in thermal ionization mass spectrometry in order to accurately analyze these samples. Several axes were studied, including the new total evaporation method, deposition techniques, electron multiplier potentialities, and a comparison between different isotope measurement techniques. Results showed that it was possible to drastically decrease the amounts needed for analysis, especially with Eu and Nd, while maintaining an uncertainty level in agreement with the project requirements. (author)

  6. Effect of Percent Relative Humidity, Moisture Content, and Compression Force on Light-Induced Fluorescence (LIF) Response as a Process Analytical Tool.

    Science.gov (United States)

    Shah, Ishan G; Stagner, William C

    2016-08-01

    The effect of percent relative humidity (16-84% RH), moisture content (4.2-6.5% w/w MC), and compression force (4.9-44.1 kN CF) on the light-induced fluorescence (LIF) response of 10% w/w active pharmaceutical ingredient (API) compacts is reported. The fluorescent response was evaluated using two separate central composite designs of experiments. The effect of % RH and CF on the LIF signal was highly significant, with an adjusted R² = 0.9436 and a significant p value. For the 10% w/w API compacts, increased % RH, MC, and CF led to a nonlinear decrease in LIF response. The derived quadratic model equations explained more than 94% of the variability in the data. Awareness of these effects on LIF response is critical when implementing LIF as a process analytical tool. PMID:27435199
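
    As a hedged illustration of the quadratic modelling named above, the sketch below fits a two-factor quadratic response-surface model to central-composite-style points by least squares; the data are synthetic, not the paper's.

    ```python
    import numpy as np

    # Synthetic two-factor central-composite-style design (coded units):
    # factorial points, axial points, and a center point.
    X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                  [-1.414, 0], [1.414, 0], [0, -1.414], [0, 1.414], [0, 0]])
    rng = np.random.default_rng(1)
    y = 10 - 2*X[:, 0] - 1.5*X[:, 1] - 0.8*X[:, 0]**2 + rng.normal(0, 0.1, len(X))

    # Full quadratic model: 1, x1, x2, x1*x2, x1^2, x2^2
    D = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                         X[:, 0]*X[:, 1], X[:, 0]**2, X[:, 1]**2])
    beta, *_ = np.linalg.lstsq(D, y, rcond=None)

    # R^2 of the fit
    resid = y - D @ beta
    r2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
    print(np.round(beta, 3), round(r2, 4))
    ```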

  7. Developing and implementing an oral care policy and assessment tool.

    LENUS (Irish Health Repository)

    Stout, Michelle

    2012-01-09

    Oral hygiene is an essential aspect of nursing care. Poor oral care results in patients experiencing pain and discomfort, puts individuals at risk of nutritional deficiency and infection, and has an adverse effect on quality of life. This article describes how an oral care policy and assessment tool were updated to ensure the implementation of evidence-based practice at one hospital in the Republic of Ireland.

  8. Nuclear forensics: From specialized analytical measurements to a fully developed discipline in science

    International Nuclear Information System (INIS)

    Nuclear forensic science aims at providing clues on nuclear or other radioactive material involved in illicit incidents. A considerable number of cases of illicit trafficking have been reported to the IAEA Illicit Trafficking Database, underlining the need for analytical and interpretation capabilities as well as for close international collaboration. Credible nuclear forensics can only be achieved if all evidence and case history are preserved and made available for data interpretation and source attribution. Hence, nuclear forensics investigations have to start at the 'crime scene'. As a consequence, a comprehensive response plan is required, clearly describing the responsibilities of the authorities involved and the role of the individual actors. Full nuclear forensics capabilities are only available in a few specialized laboratories. The Institute for Transuranium Elements (ITU) has established collaboration schemes with European Union member States and also provides nuclear forensics support to other countries that request it. This nuclear forensics support was tested by a number of the new European Union member States, when seized material was subject to joint analyses using the analytical infrastructure at ITU. Nuclear forensics remains a discipline challenging the capabilities of the analysts involved in the case investigations. Information on the origin of the nuclear material is inherent to the samples. Reading and understanding this information has, to a large extent, been established and appropriate laboratory protocols have been developed, validated and tested. Further research activities focus on the application of classical forensic methods to contaminated evidence. Emphasis was given to the two most prominent forensic techniques: taking of fingerprints and DNA analysis. In addition to the conceptual and operational developments, appropriate training has been provided to the authorities involved. The experience gained in joint nuclear forensic

  9. Methods and tools in development in the field of maintenance

    International Nuclear Information System (INIS)

    This article is dedicated to the maintenance strategy followed by EDF for its nuclear power plants. This strategy is based on the American RCM (reliability-centered maintenance) method, which has been extended to passive components such as structures, and relies on four axes. The first axis can be defined as the optimization of maintenance with regard to safety, outage and costs; the aim is to set the border between preventive maintenance and remedial maintenance. The second axis is condition-based maintenance, that is, maintenance action triggered by the actual state of the equipment; this type of maintenance implies the search for reliable and accurate diagnostics. The third axis is the design of the simulation and planning tools required for managing the movement of tools, machinery and spare parts on a maintenance work site. The fourth axis is the design of tools able to repair some important equipment; this may be a valid alternative to the replacement of equipment in terms of cost and outage time. (A.C.)

  10. Towards a Reference Architecture to Provision Tools as a Service for Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef; Babar, Muhammad Ali

    2014-01-01

    Organizations involved in Global Software Development (GSD) face challenges in terms of having access to an appropriate set of tools for performing distributed engineering and development activities, integration between heterogeneous desktop and web-based tools, management of artifacts developed and ... distributed environment. In this paper, we argue the need for a cloud-enabled platform for supporting GSD and propose a reference architecture of a cloud-based Platform for provisioning an ecosystem of Tools as a Service (PTaaS).

  11. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    Science.gov (United States)

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system to investigate gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. PMID:27098025
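
    As a hedged sketch of the kind of notebook cell the authors describe, here is a minimal spike-train firing-rate and inter-spike-interval computation in plain numpy; the spike times are synthetic and the names are ours, not those of the paper's supplements.

    ```python
    import numpy as np

    # Synthetic spike train: spike times in seconds over a 10 s recording.
    rng = np.random.default_rng(42)
    spikes = np.sort(rng.uniform(0.0, 10.0, size=180))

    # Bin into 500 ms windows to estimate firing rate over time.
    bin_w = 0.5
    edges = np.arange(0.0, 10.0 + bin_w, bin_w)
    counts, _ = np.histogram(spikes, bins=edges)
    rate_hz = counts / bin_w

    # Inter-spike intervals; their coefficient of variation indexes regularity.
    isi = np.diff(spikes)
    cv = isi.std() / isi.mean()
    print(f"mean rate {rate_hz.mean():.1f} Hz, ISI CV {cv:.2f}")
    ```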

  13. The development of a tongue assessment tool to assist with tongue-tie identification

    OpenAIRE

    Ingram, Jenny; Johnson, Debbie; Copeland, Marion; Churchill, Cathy; Taylor, Hazel; Emond, Alan

    2015-01-01

    Aim To produce a simple tool with good transferability to provide a consistent assessment of tongue appearance and function in infants with tongue-tie. Methods The Bristol Tongue Assessment Tool (BTAT) was developed based on clinical practice and with reference to the Hazelbaker Assessment Tool for Lingual Frenulum Function (ATLFF). This paper documents 224 tongue assessments using the BTAT. There were 126 tongue assessments recorded using the BTAT and ATLFF tools to facilitate comparisons be...

  14. Analytic method development to quantify by in-situ gamma spectrometry radionuclides in the ground

    International Nuclear Information System (INIS)

    This research thesis reports the development of an analytical method based on Monte Carlo simulation to quantify radionuclides present in soils by means of in-situ gamma spectrometry, to understand the physical phenomena involved before and after detection, and to improve and complement results after spectrum analysis. The first part describes the evolution of in-situ gamma spectrometry: sensor development, the in-situ measurement principle, and the evolution of the analysis principle. The second part introduces the Monte Carlo simulation and describes the models used (a sensor model using the stripping method, and the development of a new simulation model for the incident flux). The third part discusses the interpretation of an in-situ spectrum, with localization of the origin of incident photons and identification of measurement parameters. Modelling results are then presented, as well as the development of a spectrum deconvolution method and the calculation of dose factors. Finally, the use of the 'peak-to-valley' method, complemented by the Monte Carlo simulation results, is explained and used to estimate source burial depth and to characterize the exponential depth distribution of Cs-137 at Orsay
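
    The peak-to-valley idea lends itself to a toy Monte Carlo: photons emitted at depth that escape the soil unscattered feed the full-energy peak, while scattered ones feed the valley below it, so the peak-to-valley ratio falls with burial depth. The attenuation coefficient, geometry and the crude "scattered counts = valley counts" bookkeeping below are simplifying assumptions for illustration only, not the thesis' model.

      # Toy Monte Carlo of the peak-to-valley depth dependence (illustrative).
      import numpy as np

      rng = np.random.default_rng(0)
      MU_SOIL = 0.12   # assumed linear attenuation coefficient at 662 keV, 1/cm

      def peak_to_valley(depth_cm, n=200_000):
          """Unscattered/scattered count ratio for a source at depth_cm."""
          # Emission directions in the upward hemisphere, cos(theta) uniform.
          cos_theta = rng.uniform(0.0, 1.0, n)
          path = depth_cm / np.maximum(cos_theta, 1e-9)   # slant path, cm
          unscattered = rng.random(n) < np.exp(-MU_SOIL * path)
          n_peak = unscattered.sum()          # full-energy peak counts
          n_valley = n - n_peak               # scattered counts -> valley
          return n_peak / max(n_valley, 1)

      for d in (0.5, 2.0, 5.0, 10.0):
          print(f"depth {d:5.1f} cm -> peak/valley ~ {peak_to_valley(d):.3f}")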

  15. Instrumental neutron activation analysis as an analytical tool supporting the establishment of guidelines and databases for workers' health awareness programmes

    International Nuclear Information System (INIS)

    dangerous diseases that are easily identified. The main problem is that the majority of workers are exposed to low levels of toxic chemicals that can be lethal in the long term, owing to chronic diseases. Most often the onset of the disease goes unnoticed, and the presence of a lung cancer or heart disease is attributed to non-occupational causes. As a result, these cases of illness do not become part of the compiled data. Moreover, there is no specialized and complete literature concerning occupational aetiology, nor is there an evaluation of the onset of disease linked to long-term exposure to low levels of toxic agents. With the aim of supporting the Workers' Health Awareness Programme of the Secretaria Municipal de Saude (Municipal Department of Health) of Belo Horizonte, capital of Minas Gerais state, an assessment was carried out in galvanizing factories by means of airborne particulate matter collected on air filters, with hair and toenails as biomonitors. The project was approved by the Ethics Committee of the Federal University of Minas Gerais, COEP-UFMG; all research involving human beings has to be submitted to this committee in order to protect the population studied. The k0 instrumental neutron activation analysis (INAA) technique was chosen for the determination of elements in air filters and in hair and toenail samples, as it can determine several elements in the same sample almost simultaneously, with low detection limits, without the chemical procedures required by the majority of non-nuclear techniques, and with only a small amount of sample needed for the analysis. The k0 method was applied to all samples, demonstrating its quality as a versatile technique, and it was confirmed to be one of the most advantageous and suitable nuclear analytical techniques
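
    For orientation, the sketch below shows the relative-standardization arithmetic that underlies activation analysis: decay-corrected count rates of a sample and a co-irradiated standard of known element content are compared. This is not the full k0 formalism, which additionally folds in nuclear constants, k0 factors, the flux parameters f and alpha, and detector efficiency; all numbers here are hypothetical.

      # Relative-standardization INAA arithmetic (not the full k0 method).
      import math

      T_HALF = 2.62 * 3600      # assumed half-life of the indicator isotope, s
      LAM = math.log(2) / T_HALF

      def rate_at_end_of_irradiation(peak_counts, t_decay, t_count):
          """Peak count rate corrected back to the end of irradiation."""
          decay = math.exp(-LAM * t_decay)
          collection = (1 - math.exp(-LAM * t_count)) / (LAM * t_count)
          return peak_counts / (t_count * decay * collection)

      # Hypothetical peak areas, timings (s), sample mass (g) and the mass of
      # the element contained in the co-irradiated standard (g).
      r_sample = rate_at_end_of_irradiation(15200, t_decay=3600, t_count=1800)
      r_std = rate_at_end_of_irradiation(48100, t_decay=1800, t_count=1800)
      m_sample, w_std = 0.105, 50.0e-6

      w_elem = w_std * r_sample / r_std        # element mass in the sample, g
      print(f"element content ~ {w_elem / m_sample * 1e6:.2f} ppm")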

  16. Development and pilot testing of a vitiligo screening tool.

    Science.gov (United States)

    Sheth, Vaneeta M; Gunasekera, Nicole S; Silwal, Sujeeta; Qureshi, Abrar A

    2015-01-01

    Studies aimed at understanding the pathology, genetics, and therapeutic response of vitiligo rely on asking a single question about 'physician-diagnosed' vitiligo on surveys to identify subjects for research. However, this type of self-reporting is not sufficient. Our objective was to determine if the patient-administered Vitiligo Screening Tool (VISTO) is a sensitive and specific instrument for the detection of vitiligo in an adult population. The VISTO consists of eight closed-ended questions to assess whether the survey participant has ever been diagnosed with vitiligo by a healthcare worker and uses characteristic pictures and descriptions to inquire about the subtype and extent of any skin lesions. 159 patients at the Brigham and Women's Hospital dermatology clinic with or without a diagnosis of vitiligo were recruited. A board-certified dermatologist confirmed or excluded the diagnosis of vitiligo in each subject. 147 completed questionnaires were analyzed, 47 cases and 100 controls. The pictorial question showed 97.9% sensitivity and 98% specificity for diagnosis of vitiligo. Answering "yes" to being diagnosed with vitiligo by a dermatologist and choosing one photographic representation of vitiligo showed 95.2% sensitivity and 100% specificity for diagnosis of vitiligo. We conclude that VISTO is a highly sensitive and specific, low-burden, self-administered tool for identifying vitiligo among adult English speakers. We believe this tool will provide a simple, cost-effective way to confirm vitiligo prior to enrollment in clinical trials as well as for gathering large-scale epidemiologic data in remote populations. Future work to refine the VISTO is needed prior to use in genotype-phenotype correlation studies.
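
    The headline accuracy figures reduce to simple 2x2-table arithmetic. The sketch below recomputes sensitivity and specificity from cell counts consistent with the reported 47 cases and 100 controls; the exact true/false positive and negative counts are assumptions for illustration.

      # Sensitivity and specificity from a 2x2 screening table (illustrative).
      def sens_spec(tp, fn, tn, fp):
          sensitivity = tp / (tp + fn)   # fraction of cases correctly flagged
          specificity = tn / (tn + fp)   # fraction of controls correctly cleared
          return sensitivity, specificity

      # Hypothetical cells: 46 of 47 cases flagged, 98 of 100 controls cleared.
      se, sp = sens_spec(tp=46, fn=1, tn=98, fp=2)
      print(f"sensitivity {se:.1%}, specificity {sp:.1%}")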

  17. Digital design and communication tools for sustainable development

    Energy Technology Data Exchange (ETDEWEB)

    Totten, M.

    1995-12-31

    Within the computer and communications industry there is a strong sentiment that the speed and power of mainframe computers will be available at personal computer sizes and prices in the next few years. Coinciding with this is the expectation that large data/information/knowledge resource pools will be available online for download. This paper summarizes what is available now and what is coming in the future in computer technologies. The author then discusses the opportunities in `green` building design for energy efficiency and conservation and the types of design tools that will be coming in the future.

  18. Phase II Fort Ord Landfill Demonstration Task 8 - Refinement of In-line Instrumental Analytical Tools to Evaluate their Operational Utility and Regulatory Acceptance

    Energy Technology Data Exchange (ETDEWEB)

    Daley, P F

    2006-04-03

    The overall objective of this project is the continued development, installation, and testing of continuous water sampling and analysis technologies for application to on-site monitoring of groundwater treatment systems and remediation sites. In a previous project, an on-line analytical system (OLAS) for multistream water sampling was installed at the Fort Ord Operable Unit 2 Groundwater Treatment System, with the objective of developing a simplified analytical method for detection of Compounds of Concern at that plant, and continuous sampling of up to twelve locations in the treatment system, from raw influent waters to treated effluent. Earlier implementations of the water sampling and processing system (Analytical Sampling and Analysis Platform, ASAP; A+RT, Milpitas, CA) depended on off-line integrators that produced paper plots of chromatograms and sent summary tables to a host computer for archiving. We developed a basic LabVIEW (National Instruments, Inc., Austin, TX) based gas chromatography control and data acquisition system that was the foundation for further development and integration with the ASAP system. Advantages of this integration include electronic archiving of all raw chromatographic data and a flexible programming environment to support development of improved ASAP operation and automated reporting. The initial goals of integrating the preexisting LabVIEW chromatography control system with the ASAP, and demonstrating a simplified, site-specific analytical method, were successfully achieved. However, although the principal objective of this system was assembly of an analytical system that would give plant operators an up-to-the-minute view of the plant's performance, several obstacles remained. Data reduction with the base LabVIEW system was limited to peak detection and simple tabular output, patterned after commercial chromatography integrators, with compound retention times and peak areas. Preparation of calibration curves, method
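
    The data-reduction step described above (peak detection, retention times and peak areas feeding calibration curves) is easy to sketch. The Python fragment below is an illustrative stand-in for the LabVIEW routines, using a synthetic two-peak chromatogram and arbitrary threshold settings.

      # Illustrative chromatography data reduction: peaks, retention times, areas.
      import numpy as np
      from scipy.signal import find_peaks, peak_widths

      t = np.linspace(0, 10, 4000)                       # retention time, min
      signal = (1.2 * np.exp(-((t - 2.8) / 0.05) ** 2)   # synthetic trace
                + 0.6 * np.exp(-((t - 6.1) / 0.08) ** 2)
                + np.random.default_rng(1).normal(0, 0.01, t.size))

      peaks, _ = find_peaks(signal, height=0.1, prominence=0.05)
      _, _, left, right = peak_widths(signal, peaks, rel_height=0.99)

      for p, l, r in zip(peaks, left, right):
          lo, hi = int(l), int(r) + 1
          area = np.trapz(signal[lo:hi], t[lo:hi])       # area for calibration
          print(f"retention {t[p]:.2f} min, area {area:.4f}")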

  19. Developing and Stabilizing Analytical Method for the Determination of Uranium in Water Systems using Voltammeters

    International Nuclear Information System (INIS)

    Uranium is a precious metal found all over the world in trace amounts. Water flowing over the surface dissolves uranium from the surfaces and rocks it comes in contact with. To determine the uranium concentration in water, we focused on the development of an analytical method. The technique developed for uranium determination in water is based on chloranilic acid (CAA) and the hanging mercury drop electrode (HMDE). CAA is a water-soluble compound that forms a complex with uranium; the complex was collected on the electrode by physical adsorption. A standard uranium solution was used for calibration. Below pH 2 and above pH 3 the standard deviation exceeded 1 μg/L, so the sample pH was maintained between 2.3 and 3. The accuracy of the method was established by recovery studies in control samples. The developed technique was then applied to uranium determination in sea water. Different surfactants were used to minimize the effects of interfering radicals, and the effects of potential variation were also examined. The best results were obtained with sodium dodecyl sulfate (SDS). Some remaining problems can be eliminated by the use of an inert anti-foaming agent. (author)
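
    A minimal sketch of the calibration arithmetic such a method rests on: fit a straight line to the standard responses, then invert it for an unknown and check recovery on a spiked control. The peak currents and concentrations below are invented numbers, not data from the study.

      # Linear calibration for an adsorptive voltammetric method (invented data).
      import numpy as np

      conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])          # standards, ug/L
      current = np.array([0.02, 0.55, 1.08, 2.61, 5.18])   # peak current, uA

      slope, intercept = np.polyfit(conc, current, 1)

      def concentration(i_peak):
          """Invert the calibration line for an unknown sample."""
          return (i_peak - intercept) / slope

      c_sample = concentration(1.83)          # hypothetical sample reading
      print(f"uranium ~ {c_sample:.2f} ug/L")

      # Recovery check: the same sample spiked with 2.0 ug/L, re-measured.
      c_spiked = concentration(2.87)
      print(f"recovery {100 * (c_spiked - c_sample) / 2.0:.1f} %")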

  20. Stabilization Columns for Embankment Support - Investigation, Verification and Further Development of Analytical Analyses

    Science.gov (United States)

    Pankrath, H.; Kaya, H.; Thiele, R.

    2015-09-01

    As a technical and economical alternative to foundations on piles, but also to shallow foundations on improved soil, a large number of soil improvement methods have been developed and established in recent decades. Many of these methods use non-reinforced, cylindrical load-bearing elements. A very common application of stabilizing columns is the improvement of a few meters of soft soil below dams and embankments, but it is precisely for this application that many failure cases are documented worldwide. This contribution presents the main content and results of the investigation, verification and further development of methods for evaluating slope stability. After a description of the problem and the resulting tasks, the contribution presents the main results of a survey of international sources, together with the stepwise development of analytical solutions. Alongside the approaches for gravel columns that are well known in practice, less common approaches from Scandinavia are explained. The contribution closes with the presentation and discussion of an illustrative example, taking into account a number of different failure modes of the columns and the surrounding soil. The example was compared against and validated with a 3D model using the finite element method.
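
    As a hedged illustration of the kind of analytical check such methods perform, the sketch below computes the composite (area-weighted) shear strength of column-improved ground, a standard simplification in stone-column design; the area replacement ratio, strengths and stress concentration factor are invented values.

      # Area-weighted composite shear strength of column-improved soft soil
      # (standard simplification; all parameter values are assumptions).
      import math

      a_s = 0.20        # area replacement ratio (column area / total area)
      c_u = 15.0        # undrained shear strength of the soft soil, kPa
      phi_col = 38.0    # friction angle of the column material, degrees
      sigma_v = 60.0    # average vertical stress on the shear surface, kPa
      n_stress = 3.0    # stress concentration factor on the columns

      # Stress carried by the columns for the same average sigma_v.
      sigma_col = sigma_v * n_stress / (1 + a_s * (n_stress - 1))
      tau_col = sigma_col * math.tan(math.radians(phi_col))
      tau_comp = a_s * tau_col + (1 - a_s) * c_u   # composite strength

      print(f"composite shear strength ~ {tau_comp:.1f} kPa "
            f"(unimproved soil: {c_u:.1f} kPa)")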

  1. Methodological framework, analytical tool and database for the assessment of climate change impacts, adaptation and vulnerability in Denmark

    DEFF Research Database (Denmark)

    Kaspersen, Per Skougaard; Halsnæs, Kirsten; Gregg, Jay Sterling;

    Council. The flood hazard maps presented in this report constitute the first preliminary results of on-going methodological and analysis development in mapping potential impacts in relation to flooding from extreme precipitation in the city of Aarhus. For all purposes the Aarhus flood maps presented...... in this report should be considered work-in-progress. The analysis was conducted by DHI as part of the DSF project Centre for Regional Change of the Earth System (CRES)....

  2. Development of analytically capable time-of-flight mass spectrometer with continuous ion introduction.

    Science.gov (United States)

    Hárs, György; Dobos, Gábor

    2010-03-01

    The present article describes the results and findings explored in the course of the development of an analytically capable prototype of a continuous time-of-flight (CTOF) mass spectrometer. Currently marketed pulsed TOF (PTOF) instruments use ion introduction with a pulse width of about 10 ns, followed by a waiting period of roughly 100 μs. Accordingly, the sample is under excitation for only about 10^-4 of the total measuring time. This very low duty cycle severely limits the sensitivity of the PTOF method. A possible approach to deal with this problem is the linear sinusoidal dual-modulation technique (CTOF) described in this article. In this way the sensitivity of the method is increased, owing to the 50% duty cycle of the excitation. All other types of TOF spectrometer use a secondary electron multiplier (SEM) for detection, which unfortunately discriminates in amplification in favor of the lighter ions. This discrimination effect is especially undesirable in a mass spectrometric method that targets the high mass range. In the CTOF method, the SEM is replaced with a Faraday cup detector, thus eliminating the mass discrimination effect. Omitting the SEM is made possible by the high ion intensity and the very slow ion detection, with a detection bandwidth of some hundred hertz. The electrometer electronics of the Faraday cup detector operates with an amplification of 10^10 V/A. The primary ion beam is highly monoenergetic due to the construction of the ion gun, which made it possible to omit any electrostatic mirror configuration for bunching the ions. The measurement is controlled by a personal computer and an intelligent signal generator (Tabor WW 2571), which uses the direct digital synthesis technique to generate arbitrary waveforms. The data are collected by a LabJack interface board, and the fast Fourier transformation is performed by the software. A noble gas mixture has been used to test the analytical capabilities of the prototype setup. Measurement presented proves the results of the
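
    The dual-modulation principle can be illustrated in a few lines: a sinusoidally modulated ion current arrives at the detector delayed by the mass-dependent flight time, and an FFT recovers that delay as a phase shift at the modulation frequency. The modulation frequency, sampling rate and flight time below are invented for the sketch.

      # Phase-sensitive recovery of a time-of-flight delay (illustrative CTOF idea).
      import numpy as np

      f_mod = 50e3      # assumed modulation frequency, Hz
      fs = 5e6          # sampling rate, Hz (f_mod falls exactly on an FFT bin)
      t = np.arange(0, 2e-3, 1 / fs)

      t_flight = 4.2e-6     # hypothetical flight time of one ion species, s
      detector = 1 + 0.5 * np.sin(2 * np.pi * f_mod * (t - t_flight))

      spectrum = np.fft.rfft(detector * np.hanning(t.size))
      k = np.argmin(np.abs(np.fft.rfftfreq(t.size, 1 / fs) - f_mod))

      # For sin(w(t - tau)), the FFT phase at f_mod is -w*tau - pi/2;
      # multiplying by 1j removes the sine/cosine offset of pi/2.
      phase = np.angle(spectrum[k] * 1j)
      t_est = (-phase) % (2 * np.pi) / (2 * np.pi * f_mod)
      print(f"recovered flight time ~ {t_est * 1e6:.2f} us (true 4.20 us)")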

  3. Development and Testing of an Optimised Combined Analytical Instrument for Planetary Applications

    Science.gov (United States)

    Lerman, Hannah; Hutchinson, Ian

    2016-10-01

    Miniaturised analytical instruments that can simultaneously obtain complementary (molecular and elemental) information about the composition of a sample are likely to be a key feature of the next generation of planetary exploration missions. Certain spectroscopic techniques, such as Raman spectroscopy, can provide information on the molecular composition of an unknown sample, whereas others, such as Laser-Induced Breakdown Spectroscopy (LIBS) and X-Ray Fluorescence (XRF), enable the determination of the elemental composition of a material. Combining two or more of these techniques into one instrument package enables a broader range of the scientific goals of a particular mission to be met (i.e. full composition analysis and structural information about the sample, and therefore geological history). In order to determine the most appropriate design for such an instrument, we have developed radiometric models to assess the overall scientific capability of various analytical technique combinations. We have then used these models to perform a number of trade-offs to evaluate the optimum instrument design for a particular set of science requirements (such as acquiring composition information with suitable sensitivity and uncertainty). The performance of one of these designs was then thoroughly investigated by building a prototype instrument. The construction of our instrument focuses on the optimum design for combining the multiple instrument sub-systems so that the overall mass, power and cost budgets can be minimised, whilst achieving a wider and more comprehensive range of scientific goals. Here we report on measurements obtained from field test campaigns performed in order to verify model predictions and overall scientific performance. These tests include operation in extreme environments such as dry deserts and under water.
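
    At its simplest, a radiometric model of the kind mentioned is a photon budget: how many signal photons reach the detector per acquisition, and what noise accompanies them. The sketch below is a generic shot-noise-limited SNR estimate with invented efficiencies; it is not the authors' model.

      # Generic photon-budget SNR estimate for one spectroscopic channel.
      import math

      laser_photons = 1e15   # photons delivered to the sample per acquisition
      yield_eff = 1e-8       # assumed fraction returned as signal (e.g. Raman)
      collection = 0.01      # collected solid-angle fraction
      optics_t = 0.4         # optics + grating transmission
      qe = 0.8               # detector quantum efficiency

      signal = laser_photons * yield_eff * collection * optics_t * qe
      background = 0.05 * signal     # assumed stray/ambient light
      dark = 200.0                   # dark counts per acquisition

      snr = signal / math.sqrt(signal + background + dark)
      print(f"detected signal ~ {signal:.0f} counts, SNR ~ {snr:.0f}")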

  4. Development and verification of an analytical algorithm to predict absorbed dose distributions in ocular proton therapy using Monte Carlo simulations.

    Science.gov (United States)

    Koch, Nicholas C; Newhauser, Wayne D

    2010-02-01

    Proton beam radiotherapy is an effective and non-invasive treatment for uveal melanoma. Recent research efforts have focused on improving the dosimetric accuracy of treatment planning and overcoming the present limitation of relative analytical dose calculations. Monte Carlo algorithms have been shown to accurately predict dose per monitor unit (D/MU) values, but this has yet to be shown for analytical algorithms dedicated to ocular proton therapy, which are typically less computationally expensive than Monte Carlo algorithms. The objective of this study was to determine if an analytical method could predict absolute dose distributions and D/MU values for a variety of treatment fields like those used in ocular proton therapy. To accomplish this objective, we used a previously validated Monte Carlo model of an ocular nozzle to develop an analytical algorithm to predict three-dimensional distributions of D/MU values from pristine Bragg peaks and therapeutically useful spread-out Bragg peaks (SOBPs). Results demonstrated generally good agreement between the analytical and Monte Carlo absolute dose calculations. While agreement in the proximal region decreased for beams with less penetrating Bragg peaks compared with the open-beam condition, the difference was shown to be largely attributable to edge-scattered protons. A method for including this effect in any future analytical algorithm was proposed. Comparisons of D/MU values showed typical agreement to within 0.5%. We conclude that analytical algorithms can be employed to accurately predict absolute proton dose distributions delivered by an ocular nozzle.

  5. Development and verification of an analytical algorithm to predict absorbed dose distributions in ocular proton therapy using Monte Carlo simulations

    International Nuclear Information System (INIS)

    Proton beam radiotherapy is an effective and non-invasive treatment for uveal melanoma. Recent research efforts have focused on improving the dosimetric accuracy of treatment planning and overcoming the present limitation of relative analytical dose calculations. Monte Carlo algorithms have been shown to accurately predict dose per monitor unit (D/MU) values, but this has yet to be shown for analytical algorithms dedicated to ocular proton therapy, which are typically less computationally expensive than Monte Carlo algorithms. The objective of this study was to determine if an analytical method could predict absolute dose distributions and D/MU values for a variety of treatment fields like those used in ocular proton therapy. To accomplish this objective, we used a previously validated Monte Carlo model of an ocular nozzle to develop an analytical algorithm to predict three-dimensional distributions of D/MU values from pristine Bragg peaks and therapeutically useful spread-out Bragg peaks (SOBPs). Results demonstrated generally good agreement between the analytical and Monte Carlo absolute dose calculations. While agreement in the proximal region decreased for beams with less penetrating Bragg peaks compared with the open-beam condition, the difference was shown to be largely attributable to edge-scattered protons. A method for including this effect in any future analytical algorithm was proposed. Comparisons of D/MU values showed typical agreement to within 0.5%. We conclude that analytical algorithms can be employed to accurately predict absolute proton dose distributions delivered by an ocular nozzle.
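
    To make the SOBP construction concrete: a spread-out Bragg peak is a weighted sum of range-shifted pristine peaks, with weights chosen so the summed dose is flat across the target. The sketch below uses a toy peaked depth-dose shape (not a physical Bragg curve or the validated Monte Carlo model of the study) and solves for non-negative weights by least squares.

      # Toy SOBP: weight range-shifted pristine peaks to flatten the dose
      # over a target interval (illustrative shapes, not a physical model).
      import numpy as np

      z = np.linspace(0, 32, 641)          # depth, mm

      def pristine(z, r):
          """Toy pristine peak at range r: flat entrance + sharp distal peak."""
          entrance = 0.3 * (z < r)
          peak = 0.7 * np.exp(-((z - r) / 1.2) ** 2) * (z < r + 2.4)
          return entrance + peak

      ranges = np.arange(20.0, 30.1, 1.0)  # modulated peak positions, mm
      basis = np.vstack([pristine(z, r) for r in ranges]).T

      target = ((z >= 20) & (z <= 30)).astype(float)   # flat dose, 20-30 mm
      weights, *_ = np.linalg.lstsq(basis, target, rcond=None)
      weights = np.clip(weights, 0.0, None)            # keep weights physical

      sobp = basis @ weights
      plateau = sobp[(z >= 21) & (z <= 29)]
      print(f"plateau flatness (min/max): {plateau.min() / plateau.max():.3f}")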

  6. Developing theological tools for a strategic engagement with Human Enhancement.

    Science.gov (United States)

    Tomkins, Justin

    2014-01-01

    The literature on Human Enhancement may indeed have reached a critical mass, yet theological engagement with the subject is still thin. Human Enhancement has already been established as a key topic within research, and captivating visions of the future have been allied with a depth of philosophical analysis. Some Transhumanists have pointed to a theological dimension to their position, and some who have warned against enhancement might be seen as having done so from a perspective shaped by a Judeo-Christian worldview. Nonetheless, in neither case has theology been central to engagement with the enhancement quest. Christian theologians who have begun to open up such an engagement with Human Enhancement include Brent Waters, Robert Song and Celia Deane-Drummond. The work they have already carried out is insightful and important, yet given the scale of the possible engagement, the wealth of Christian theology that might be applied to Human Enhancement remains largely untapped. This paper explores how three key aspects of Christian theology, eschatology, love of God and love of neighbour, provide valuable tools for a theological engagement with Human Enhancement. It is proposed that such theological tools need to be applied to Human Enhancement if the debate is to be resourced with the Christian theological perspective of what it means to be human in our contemporary technological context, and if society is to have the choice of maintaining its Christian foundations. PMID:25344011

  7. Developing new serious games tools to improve radiation protection

    International Nuclear Information System (INIS)

    In this paper, novel software technologies for simulation and training of workers in radiologically dangerous conditions are presented. Such new software tools enable radiation protection managers and workers to better evaluate, visualize and intuitively understand the radiation situation. In the first part of the paper, the virtual reality planning tool ALPLANNER is introduced. ALPLANNER enables computation of workers' doses and 3D simulation of planned activities in the environment. In the second part of the paper, the software technology SPACEVISION for real-time interactive 3D visualization of radioactivity is presented. Radiation fields can be spatially and dynamically visualized in the environment using computer games technologies. Such real-time visualization can be used by RP staff to compute and visualize direct responses of the radiation field to the effects of shielding. Another presented application is the determination and visualization of activity sources in inhomogeneous radiation fields. A practical example of how the mentioned software technologies are used during the decommissioning of NPP A-1 Jaslovske Bohunice is provided. (authors)

  8. Geo-Sandbox: An Interactive Geoscience Training Tool with Analytics to Better Understand Student Problem Solving Approaches

    Science.gov (United States)

    Butt, N.; Pidlisecky, A.; Ganshorn, H.; Cockett, R.

    2015-12-01

    The software company 3 Point Science has developed three interactive learning programs designed to teach, test and practice visualization skills and geoscience concepts. A study was conducted with 21 geoscience students at the University of Calgary, who participated in two-hour sessions of software interaction and written pre- and post-tests. Computer and SMART touch table interfaces were used to analyze user interaction, problem-solving methods and visualization skills. By understanding and pinpointing user problem-solving methods, it is possible to reconstruct viewpoints and thought processes. This could allow us to give personalized feedback in real time, informing the user of problem-solving tips and possible misconceptions.

  9. Development of an Automated Security Risk Assessment Methodology Tool for Critical Infrastructures.

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Calvin Dell; Roehrig, Nathaniel S.; Torres, Teresa M.

    2008-12-01

    This document presents the automated security Risk Assessment Methodology (RAM) prototype tool developed by Sandia National Laboratories (SNL). This work leverages SNL's capabilities and skills in security risk analysis and the development of vulnerability assessment/risk assessment methodologies to develop an automated prototype security RAM tool for critical infrastructures (RAM-CIT). The prototype automated RAM tool provides a user-friendly, systematic, and comprehensive risk-based tool to assist CI sector and security professionals in assessing and managing security risk from malevolent threats. The current tool is structured on the basic RAM framework developed by SNL. It is envisioned that this prototype tool will be adapted to meet the requirements of different CI sectors and thereby provide additional capabilities.

  10. Secure Scrum During the Development of a Configuration Tool for Generic Workflow

    OpenAIRE

    Paulsson, Joel; Westberg, Charlotta

    2010-01-01

    Secure Scrum is a framework that integrates security into Scrum. In this thesis, Secure Scrum has been evaluated in the development environment at Medius. An aim of the thesis was to implement a configuration tool for the module Generic Workflow in Medius' product MediusFlow™. In order to evaluate Secure Scrum, the framework was used during the development of the configuration tool. Before this thesis began, a configuration tool existed, but it was Medius' wish that this configuration to...

  11. From Safety Culture to Safety Orientation - Developing a tool to measure safety in shipping

    OpenAIRE

    Håvold, Jon Ivar

    2007-01-01

    From Safety Culture to Safety Orientation: Developing a tool to measure safety in shipping. This study intends to develop a tool to measure safety orientation (SO) in shipping. SO should be considered a practical safety culture assessment instrument, indicating the degree of orientation towards safety in a group or an organisation. The scale can, for example, be used in benchmarking as a key performance indicator (KPI), or as an indicator in a balanced scorecard type of management tool. The definiti...

  12. Recent Developments in the Speciation and Determination of Mercury Using Various Analytical Techniques

    Directory of Open Access Journals (Sweden)

    Lakshmi Narayana Suvarapu

    2015-01-01

    This paper reviews the speciation and determination of mercury by various analytical techniques such as atomic absorption spectrometry, voltammetry, inductively coupled plasma techniques, spectrophotometry, spectrofluorometry, high performance liquid chromatography, and gas chromatography. Approximately 126 research papers on the speciation and determination of mercury by various analytical techniques published in international journals since 2013 are reviewed.

  13. The influence of the sample matrix on LC-MS/MS method development and analytical performance

    NARCIS (Netherlands)

    Koster, Remco Arjan

    2015-01-01

    In order to provide personalized patient treatment, a large number of analytical procedures is needed to measure a large variety of drugs in various human matrices. The analytical technique used for this research is Liquid Chromatography coupled with triple quadrupole mass spectrometry (LC-MS/MS). E

  14. Using Multilingual Analytics to Explore the Usage of a Learning Portal in Developing Countries

    Science.gov (United States)

    Protonotarios, Vassilis; Stoitsis, Giannis; Kastrantas, Kostas; Sanchez-Alonso, Salvador

    2013-01-01

    Learning analytics is a domain that has been constantly evolving throughout recent years due to the acknowledgement of its importance by those using intelligent data, learner-produced data, and analysis models to discover information and social connections for predicting and advising people's learning [1]. Learning analytics may be applied in…

  15. Development of analytical methods for polycyclic aromatic hydrocarbons (PAHs) in airborne particulates: A review

    Institute of Scientific and Technical Information of China (English)

    LIU Li-bin; LIU Yan; LIN Jin-ming; TANG Ning; HAYAKAWA Kazuichi; MAEDA Tsuneaki

    2007-01-01

    In the present work, the different sample collection, pretreatment and analytical methods for polycyclic aromatic hydrocarbons (PAHs) in airborne particulates are systematically reviewed, and the applications of these pretreatment and analytical methods for PAHs are compared in detail. Some comments on future expectations are also presented.

  16. Development of sensitive analytical technique by Laser-Induced Photoacoustic Spectroscopy

    International Nuclear Information System (INIS)

    A LIPAS (Laser-Induced Photoacoustic Spectroscopy) system has been developed for sensitive and remote analysis of neptunium, which is present at low concentrations in reprocessing streams. A correction technique for the background that disturbs sensitive analysis has been studied in the visible to infrared range, and an optical fiber system, which is important for light delivery, has also been investigated for remote analysis in the PUREX process. In the visible range, a double-cell system, which has two photoacoustic cells in series, has been studied. The detection limit absorptivity was 4.47 x 10^-5 cm^-1; this system is two orders of magnitude more sensitive than absorption spectroscopy. The system was applied to measure the photoacoustic spectra of Pr(III), Nd(III), Er(III) and Np(V) at low concentrations in water. In the photoacoustic spectrum of Np(V), the absorption peak at 614 nm, which was not observed at low pH, was identified. In the near infrared range, an analytical system with parallel cells using an alexandrite laser has been investigated; the detection limit concentration of Np(V) obtained was one order of magnitude lower than that in the visible range. The optical fiber system for application of LIPAS to reprocessing has also been examined. The sensitivity of fiber-PAS is two times higher than that of absorption spectroscopy. However, it is necessary to develop a beam operation system and a photoacoustic cell optimized for the optical fiber system. (author)
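
    For scale, the limiting absorptivity quoted above converts to a concentration detection limit through the Beer-Lambert law, c = a/epsilon, assuming the quoted value is a decadic absorptivity; the molar absorptivity used below is a hypothetical value for illustration, not one from the study.

      # Beer-Lambert conversion of a limiting absorptivity to a concentration.
      A_MIN = 4.47e-5    # limiting decadic absorptivity from the abstract, 1/cm
      EPSILON = 400.0    # hypothetical molar absorptivity, L/(mol cm)
      M_NP = 237.0       # molar mass of neptunium, g/mol

      c_min = A_MIN / EPSILON                   # mol/L
      print(f"detection limit ~ {c_min:.2e} mol/L "
            f"= {c_min * M_NP * 1e6:.1f} ug/L")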

  17. Development of analytical competencies and professional identities through school-based learning in Denmark

    Science.gov (United States)

    Andresen, Bent B.

    2015-12-01

    This article presents the main results of a case study on teachers' professional development in terms of competence and identity. The teachers involved in the study are allocated time by their schools to participate in professional "affinity group" meetings. During these meetings, the teachers gather and analyse school-based data about factors which persistently create and sustain challenges in effective student education (grade K-10). This process improves their understanding and undertaking of job-related tasks. The affinity group meetings also influence the teachers' professional identity. The research findings thus illustrate the fact that the analytical approach of affinity groups, based on the analysis of the difficulties in their daily job, provides good results in terms of competencies and identity perception. In general, as a result of meeting in affinity groups, adult learners develop professional competencies and identities which are considered crucial in rapidly changing schools characterised by an increased focus on, among other things, lifelong learning, social inclusion, school digitalisation, and information literacy. The research findings are thus relevant for ministries and school owners, teacher-trainers and supervisors, schools and other educational institutions, as well as teachers and their organisations worldwide.

  18. A Hybrid Fuzzy Analytic Network Process Approach to the New Product Development Selection Problem

    Directory of Open Access Journals (Sweden)

    Chiuh-Cheng Chyu

    2014-01-01

    New product development selection is a complex decision-making process. To uphold their competence in competitive business environments, enterprises are required to continuously introduce novel products into markets. This paper presents a fuzzy analytic network process (FANP) for solving the product development selection problem. Fuzzy set theory is adopted to represent the ambiguity and vagueness involved in each expert's judgment. In the proposed model, the fuzzy Kano method and fuzzy DEMATEL are employed to filter criteria and establish interactions among the criteria, whereas the SAM is applied to aggregate experts' opinions. Unlike the commonly used top-down relation-structuring approach, the proposed FANP first identifies the interdependence among the criteria, and the identified relationships are then mapped to the clusters. This approach is more realistic, since the inner and outer relationships between criteria are considered simultaneously when establishing the relationships among clusters. The proposed model is illustrated through a real-life example, with a comparative analysis using modified TOPSIS and grey relational analysis in the synthesizing phase. The final results were approved by the case company. The proposed methodology is not only useful in the case study, but can also be applied in other, similar decision situations.
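
    At the core of any ANP-based model is the limit supermatrix: global priorities emerge from raising a column-stochastic supermatrix to powers until its columns converge. The sketch below shows only that step, on a tiny invented matrix; it omits the fuzzy, Kano and DEMATEL layers of the full model.

      # Limit-supermatrix step of the analytic network process (toy example).
      import numpy as np

      # Invented weighted supermatrix: column j holds the influence of
      # element j on each element i; columns sum to 1 (column-stochastic).
      W = np.array([[0.0, 0.6, 0.3],
                    [0.5, 0.0, 0.7],
                    [0.5, 0.4, 0.0]])
      assert np.allclose(W.sum(axis=0), 1.0)

      # Raise to powers until the columns stabilize; the stable column is
      # the vector of global priorities.
      P = W.copy()
      for _ in range(200):
          nxt = P @ W
          if np.allclose(nxt, P, atol=1e-12):
              break
          P = nxt

      print("limit priorities:", np.round(P[:, 0], 4))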

  19. Positioning Mentoring as a Coach Development Tool: Recommendations for Future Practice and Research

    Science.gov (United States)

    McQuade, Sarah; Davis, Louise; Nash, Christine

    2015-01-01

    Current thinking in coach education advocates mentoring as a development tool to connect theory and practice. However, little empirical evidence exists to evaluate the effectiveness of mentoring as a coach development tool. Business, education, and nursing precede the coaching industry in their mentoring practice, and research findings offered in…

  20. ARBOOK: Development and Assessment of a Tool Based on Augmented Reality for Anatomy

    Science.gov (United States)

    Ferrer-Torregrosa, J.; Torralba, J.; Jimenez, M. A.; García, S.; Barcia, J. M.

    2015-01-01

    Technologies are constantly evolving, and the development of new tools for educational purposes keeps growing. This work presents the experience of a new tool based on augmented reality (AR) focusing on the anatomy of the lower limb. ARBOOK was constructed and developed based on CT and MRI images, dissections and drawings. For ARBOOK evaluation, a…