WorldWideScience

Sample records for valuable analytical tool

  1. Social Data Analytics Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi

    2014-01-01

This paper presents the design, development, and demonstrative case studies of the Social Data Analytics Tool (SODATO). Adopting the Action Design Framework [1], the objective of SODATO [2] is to collect, store, analyze, and report big social data emanating from the social media engagement of, and social media conversations about, organizations. We report and discuss results from two demonstrative case studies that were conducted using SODATO and conclude with implications and future work.

  2. Biosensors: Future Analytical Tools

    Directory of Open Access Journals (Sweden)

    Vikas

    2007-02-01

Full Text Available Biosensors offer considerable promise for obtaining analytical information in a faster, simpler, and cheaper manner than conventional assays. The biosensing approach is rapidly advancing, and applications ranging from metabolite, biological/chemical warfare agent, food pathogen, and adulterant detection to genetic screening and programmed drug delivery have been demonstrated. Innovative efforts coupling micromachining and nanofabrication may lead to even more powerful devices that would accelerate the realization of large-scale and routine screening. With the gradual increase in commercialization, a wide range of new biosensors is thus expected to reach the market in the coming years.

  3. Developing a learning analytics tool

    DEFF Research Database (Denmark)

    Wahl, Christian; Belle, Gianna; Clemmensen, Anita Lykke

This poster describes how learning analytics and collective intelligence can be combined in order to develop a tool for providing support and feedback to learners and teachers regarding students' self-initiated learning activities.

  4. The AACSB: A Valuable Tool for the Language Educator.

    Science.gov (United States)

    Bush-Bacelis, Jean L.

    The American Assembly of Collegiate Schools of Business (AACSB), an accrediting agency, may be an overlooked tool for establishing rationale and credibility for globalization of business courses. The 245 member institutions are bound by the agency's accrediting requirements, and many others are influenced by the standards set in those…

  5. Teaching resources in speleology and karst: a valuable educational tool

    Directory of Open Access Journals (Sweden)

    De Waele Jo

    2010-01-01

Full Text Available There is a growing need in the speleological community for tools that make the teaching of speleology and karst much easier. Despite the existence of a wide range of major academic textbooks, the caver community often has difficulty accessing such material. Therefore, to fill this gap, the Italian Speleological Society, under the umbrella of the Union International de Spéléologie, has prepared a set of lectures, in presentation format, on several topics including geology, physics, chemistry, hydrogeology, mineralogy, palaeontology, biology, microbiology, history, archaeology, artificial caves, documentation, etc. These lectures constitute the “Teaching Resources in Speleology and Karst”, available online. This educational tool, thanks to its easily manageable format, can be constantly updated and enriched with new contents and topics.

  6. The program success story: a valuable tool for program evaluation.

    Science.gov (United States)

    Lavinghouze, Rene; Price, Ann Webb; Smith, Kisha-Ann

    2007-10-01

Success stories are evaluation tools that have been used by professionals across disciplines for quite some time. They are also proving useful in promoting health programs and their accomplishments. The increasing popularity of success stories is due to the innovative and effective way they increase a program's visibility while engaging potential participants, partners, and funders in public health efforts. From the community level to the federal level, program administrators are using success stories as vehicles for celebrating achievements, sharing challenges, and communicating lessons learned. Success stories are an effective means to move beyond the numbers and connect readers with a cause they can relate to and want to join. This article defines success stories and provides an overview of several story formats, how success stories can be systematically collected, and how they are used to communicate program success.

  7. Hypnosis as a Valuable Tool for Surgical Procedures in the Oral and Maxillofacial Area.

    Science.gov (United States)

    Montenegro, Gil; Alves, Luiza; Zaninotto, Ana Luiza; Falcão, Denise Pinheiro; de Amorim, Rivadávio Fernandes Batista

    2017-04-01

    Hypnosis is a valuable tool in the management of patients who undergo surgical procedures in the maxillofacial complex, particularly in reducing and eliminating pain during surgery and aiding patients who have dental fear and are allergic to anesthesia. This case report demonstrates the efficacy of hypnosis in mitigating anxiety, bleeding, and pain during dental surgery without anesthesia during implant placement of tooth 14, the upper left first molar.

  8. Professional Regulation: A Potentially Valuable Tool in Responding to “Stem Cell Tourism”

    Directory of Open Access Journals (Sweden)

    Amy Zarzeczny

    2014-09-01

Full Text Available The growing international market for unproven stem cell-based interventions advertised on a direct-to-consumer basis over the internet (“stem cell tourism”) is a source of concern because of the risks it presents to patients as well as their supporters, domestic health care systems, and the stem cell research field. Emerging responses such as public and health provider-focused education and national regulatory efforts are encouraging, but the market continues to grow. Physicians play a number of roles in the stem cell tourism market and, in many jurisdictions, are members of a regulated profession. In this article, we consider the use of professional regulation to address physician involvement in stem cell tourism. Although it is not without its limitations, professional regulation is a potentially valuable tool that can be employed in response to problematic types of physician involvement in the stem cell tourism market.

  9. MALDI TOF imaging mass spectrometry in clinical pathology: a valuable tool for cancer diagnostics (review).

    Science.gov (United States)

    Kriegsmann, Jörg; Kriegsmann, Mark; Casadonte, Rita

    2015-03-01

Matrix-assisted laser desorption/ionization (MALDI) time-of-flight (TOF) imaging mass spectrometry (IMS) is an evolving technique in cancer diagnostics that combines the advantages of mass spectrometry (proteomics), detection of numerous molecules, and spatial resolution in histological tissue sections and cytological preparations. This method allows the detection of proteins, peptides, lipids, carbohydrates or glycoconjugates, and small molecules. Formalin-fixed, paraffin-embedded tissue can also be investigated by IMS; thus, this method seems to be an ideal tool for cancer diagnostics and biomarker discovery. It may add information to the identification of tumor margins and tumor heterogeneity. The technique allows tumor typing, especially identification of the tumor of origin in metastatic tissue, as well as grading, and may provide prognostic information. IMS is a valuable method for the identification of biomarkers and can complement histology, immunohistology, and molecular pathology in various fields of histopathological diagnostics, especially with regard to the identification and grading of tumors.

  10. Professional regulation: a potentially valuable tool in responding to "stem cell tourism".

    Science.gov (United States)

    Zarzeczny, Amy; Caulfield, Timothy; Ogbogu, Ubaka; Bell, Peter; Crooks, Valorie A; Kamenova, Kalina; Master, Zubin; Rachul, Christen; Snyder, Jeremy; Toews, Maeghan; Zoeller, Sonja

    2014-09-09

    The growing international market for unproven stem cell-based interventions advertised on a direct-to-consumer basis over the internet ("stem cell tourism") is a source of concern because of the risks it presents to patients as well as their supporters, domestic health care systems, and the stem cell research field. Emerging responses such as public and health provider-focused education and national regulatory efforts are encouraging, but the market continues to grow. Physicians play a number of roles in the stem cell tourism market and, in many jurisdictions, are members of a regulated profession. In this article, we consider the use of professional regulation to address physician involvement in stem cell tourism. Although it is not without its limitations, professional regulation is a potentially valuable tool that can be employed in response to problematic types of physician involvement in the stem cell tourism market. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  11. Motivational interviewing: a valuable tool for the psychiatric advanced practice nurse.

    Science.gov (United States)

    Karzenowski, Abby; Puskar, Kathy

    2011-01-01

Motivational Interviewing (MI) is well known and respected by many health care professionals. Developed by Miller and Rollnick (2002), it is a way to promote behavior change from within and resolve ambivalence. MI is individualized and most commonly used in the psychiatric setting; it is a valuable tool for the psychiatric advanced practice nurse. There are many resources that describe what MI is and the principles used to apply it. However, there is little information about how to incorporate MI into a clinical case. This article provides a summary of articles related to MI, discusses two case studies using MI, and explains why advanced practice nurses should use MI with their patients.

  12. 3D-Printed specimens as a valuable tool in anatomy education: A pilot study.

    Science.gov (United States)

    Garas, Monique; Vaccarezza, Mauro; Newland, George; McVay-Doornbusch, Kylie; Hasani, Jamila

    2018-06-06

Three-dimensional (3D) printing is a modern technique for creating 3D-printed models that allows the reproduction of human structures from MRI and CT scans via fusion of multiple layers of resin materials. To assess the feasibility of this innovative resource as an anatomy educational tool, we conducted a preliminary study on Curtin University undergraduate students, with the use of 3D models for anatomy learning as the main goal, and the effectiveness of different specimen types during the sessions and students' personally preferred anatomy learning tools as secondary aims. The study consisted of a pre-test, exposure to the test (anatomical test), and a post-test survey. During the pre-test, all participants (both the group without prior experience and the experienced group) were given a brief introduction to laboratory safety and the study procedure; participants were then exposed to 3D, wet, and plastinated specimens of the heart, shoulder, and thigh to identify the pinned structures (anatomical test). Participants were then given a post-test survey containing five questions. In total, 23 participants completed the anatomical test and post-test survey. A larger proportion of participants (85%) achieved correct answers for 3D models compared with wet and plastinated materials, 74% selected 3D models as the most usable tool for identification of pinned structures, and 45% chose 3D models as their preferred method of anatomy learning. This preliminary small-size study affirms the feasibility of 3D-printed models as a valuable asset in anatomy learning and shows their capability to be used alongside cadaveric materials and other widely used tools in anatomy education. Copyright © 2018 Elsevier GmbH. All rights reserved.

  13. Identity as an Analytical Tool to Explore Students’ Mathematical Writing

    DEFF Research Database (Denmark)

    Iversen, Steffen Møllegaard

Learning to communicate in, with, and about mathematics is a key part of learning mathematics (Niss & Højgaard, 2011). Therefore, understanding how students' mathematical writing is shaped is important to mathematics education research. In this paper the notion of ‘writer identity’ (Ivanič, 1998; Burgess & Ivanič, 2010) is introduced and operationalised in relation to students' mathematical writing, and through a case study it is illustrated how this analytic tool can facilitate valuable insights when exploring students' mathematical writing.

  14. Selecting analytical tools for characterization of polymersomes in aqueous solution

    DEFF Research Database (Denmark)

    Habel, Joachim Erich Otto; Ogbonna, Anayo; Larsen, Nanna

    2015-01-01

Selecting the appropriate analytical methods for characterizing the assembly and morphology of polymer-based vesicles, or polymersomes, is required for them to reach their full potential in biotechnology. This work presents and compares 17 different techniques for their ability to adequately report size....../purification. Of the analytical methods tested, cryo-transmission electron microscopy (Cryo-TEM) and atomic force microscopy (AFM) turned out to be advantageous for polymersomes with diameters smaller than 200 nm, whereas confocal microscopy is ideal for diameters >400 nm. Polymersomes in the intermediate diameter range can be characterized using freeze-fracture cryo-scanning electron microscopy (FF-Cryo-SEM) and nanoparticle tracking analysis (NTA). Small-angle X-ray scattering (SAXS) provides reliable data on bilayer thickness and internal structure, Cryo-TEM on multilamellarity. Taken together, these tools are valuable....

  15. Selecting analytical tools for characterization of polymersomes in aqueous solution

    DEFF Research Database (Denmark)

    Habel, Joachim Erich Otto; Ogbonna, Anayo; Larsen, Nanna

    2015-01-01

Selecting the appropriate analytical methods for characterizing the assembly and morphology of polymer-based vesicles, or polymersomes, is required for them to reach their full potential in biotechnology. This work presents and compares 17 different techniques for their ability to adequately report size....../purification. Of the analytical methods tested, cryo-transmission electron microscopy (Cryo-TEM) and atomic force microscopy (AFM) turned out to be advantageous for polymersomes with diameters smaller than 200 nm, whereas confocal microscopy is ideal for diameters >400 nm. Polymersomes in the intermediate diameter range can be characterized using freeze-fracture cryo-scanning electron microscopy (FF-Cryo-SEM) and nanoparticle tracking analysis (NTA). Small-angle X-ray scattering (SAXS) provides reliable data on bilayer thickness and internal structure, Cryo-TEM on multilamellarity. Taken together, these tools are valuable....
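The diameter-based selection rule quoted in this record can be sketched as a small helper. The function name and return format are invented for illustration; the thresholds (<200 nm, 200–400 nm, >400 nm) are the ones stated in the abstract:

```python
def suggest_characterization(diameter_nm):
    """Suggest polymersome characterization techniques by vesicle diameter,
    following the rule of thumb in the abstract: <200 nm favours Cryo-TEM
    and AFM, >400 nm favours confocal microscopy, and the intermediate
    range is covered by FF-Cryo-SEM and NTA."""
    if diameter_nm < 200:
        return ["Cryo-TEM", "AFM"]
    if diameter_nm > 400:
        return ["confocal microscopy"]
    return ["FF-Cryo-SEM", "NTA"]
```

Per the abstract, SAXS would complement any of these choices when bilayer thickness or internal structure is of interest.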

  16. Data-Mining – A Valuable Managerial Tool for Improving Power Plants Efficiency

    Directory of Open Access Journals (Sweden)

    Danubianu Mirela

    2014-05-01

Full Text Available Energy and environment are top priorities for the EU’s Europe 2020 Strategy. Both fields imply complex approaches and consistent investment. The paper presents an alternative to large investments for improving the efficiency of existing (outdated) power installations: namely, the use of data-mining techniques for analysing existing operational data. Data-mining is based upon exhaustive analysis of operational records, inferring high-value information by simply processing records with advanced mathematical/statistical tools. Results can include: assessment of the consistency of measurements, identification of new hardware needed for improving the quality of data, deducing the most efficient level of operation (internal benchmarking), correlation of consumptions with power/heat production and of technical parameters with environmental impact, scheduling the optimal maintenance time, fuel stock optimization, simulating scenarios for equipment operation, anticipating periods of maximal stress on equipment, identification of medium- and long-term trends, and planning and decision support for new investment. The paper presents a data-mining process carried out at the TERMICA - Suceava power plant. The analysis calls for a multidisciplinary approach and a complex team (experts in power and heat production, mechanics, environmental protection, economists, and, last but not least, IT experts) and can be carried out with lower expenses than an investment in new equipment. Involvement of the company's top management is essential, being the driving force and motivation source for the data-mining team.
The approach presented is self-learning: once established, the data-mining analytical, modelling, and simulation procedures and associated parameter databases can adjust themselves by absorbing and processing new relevant information, and can be used on a long-term basis for monitoring the performance of the installation and certifying the soundness of the managerial measures taken.
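One of the analyses listed above, correlating fuel consumption with power production over historical operating records, can be sketched with plain NumPy. The data below are synthetic and the variable names are illustrative; a real study would of course use the plant's actual logs:

```python
import numpy as np

# synthetic hourly operating log: fuel flow (t/h) and net power output (MW)
rng = np.random.default_rng(0)
fuel_t_h = rng.uniform(20.0, 60.0, size=500)
power_mw = 3.1 * fuel_t_h + rng.normal(0.0, 2.0, size=500)  # near-linear plant response

# Pearson correlation of consumption with production, plus a least-squares
# "efficiency" slope (MW gained per additional t/h of fuel)
r = float(np.corrcoef(fuel_t_h, power_mw)[0, 1])
slope, intercept = np.polyfit(fuel_t_h, power_mw, 1)
```

A weak correlation or a drifting slope across time windows would be exactly the kind of internal-benchmarking signal the abstract describes.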

  17. Analytical Web Tool for CERES Products

    Science.gov (United States)

    Mitrescu, C.; Chu, C.; Doelling, D.

    2012-12-01

The CERES project provides the community with climate-quality observed TOA fluxes, consistent cloud properties, and computed profile and surface fluxes. The 11-year-long data set proves invaluable for the remote sensing and climate modeling communities for studies of annual global mean energy, meridional heat transport, consistent clouds and fluxes, and climate trends. Moreover, a broader audience interested in Earth's radiative properties, such as green energy, health, and environmental companies, has shown interest in CERES-derived products. A few years ago, the CERES team started developing a new web-based Ordering Tool tailored to this wide diversity of users. Recognizing the potential that web-2.0 technologies can offer to both quality control (QC) and scientific data visualization and manipulation, the CERES team began introducing a series of specialized functions that address the above. Displaying an attractive, easy-to-use, modern web-based format, the Ordering Tool added the following analytical functions: i) 1-D histograms that display the distribution of a data field to identify outliers, which is useful for QC purposes; ii) an "Anomaly" map that shows the regional differences between the current month and the climatological monthly mean; iii) a 2-D histogram that can either identify potential problems with the data (i.e., a QC function) or provide a global view of trends and/or correlations between various CERES flux, cloud, aerosol, and atmospheric properties. The large volume and diversity of data, together with on-the-fly execution, were the main challenges to be tackled. Depending on the application, execution was done on either the browser side or the server side with the help of auxiliary files. Additional challenges came from the use of various open-source applications, the multitude of CERES products, and the seamless transition from previous development. For the future, we plan on expanding the analytical capabilities of the...
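The "Anomaly" map described above is just the difference between the chosen month's regional field and the climatological mean of the same calendar month. A minimal sketch, with array layout and names assumed for illustration (this is not the CERES API):

```python
import numpy as np

def monthly_anomaly(fields, month_index):
    """Regional anomaly map: the chosen month minus the climatological
    mean of the same calendar month across all available years.

    fields: array of shape (n_months, n_lat, n_lon), months in time order.
    """
    fields = np.asarray(fields, dtype=float)
    same_month = fields[month_index % 12 :: 12]   # every year, same calendar month
    climatology = same_month.mean(axis=0)         # climatological monthly mean
    return fields[month_index] - climatology

# toy 2-year record on a single grid cell: values 0..23 by month
toy = np.arange(24.0).reshape(24, 1, 1)
anom = monthly_anomaly(toy, 12)   # second January vs. January climatology
```

The same per-cell subtraction extends unchanged to full lat/lon grids of CERES flux or cloud fields.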

  18. Value Innovation in Learner-Centered Design. How to Develop Valuable Learning Tools

    Science.gov (United States)

    Breuer, Henning; Schwarz, Heinrich; Feller, Kristina; Matsumoto, Mitsuji

    2014-01-01

    This paper shows how to address technological, cultural and social transformations with empirically grounded innovation. Areas in transition such as higher education and learning techniques today bring about new needs and opportunities for innovative tools and services. But how do we find these tools? The paper argues for using a strategy of…

  19. Cryogenic Propellant Feed System Analytical Tool Development

    Science.gov (United States)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its number-crunching power and its ability to directly access real-fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front-end user interface was implemented to provide convenient portability of PFSAT among a wide variety of potential users, with a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control systems and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.
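PFSAT itself is Fortran 90 with an Excel/VBA front end, but the core quantity it predicts, heat leak through the insulation of a feed-line segment, can be illustrated with a first-order radial-conduction sketch. The formula is the standard cylindrical-shell conduction relation; the function name and the numeric inputs are made up for illustration, not taken from PFSAT:

```python
import math

def insulation_heat_leak_w(k_w_mk, length_m, r_inner_m, r_outer_m, t_warm_k, t_cold_k):
    """First-order radial conduction through a cylindrical insulation layer:
    Q = 2*pi*k*L*(T_warm - T_cold) / ln(r_outer/r_inner)   [watts]"""
    if r_outer_m <= r_inner_m:
        raise ValueError("insulation outer radius must exceed inner radius")
    return (2 * math.pi * k_w_mk * length_m * (t_warm_k - t_cold_k)
            / math.log(r_outer_m / r_inner_m))

# illustrative numbers only: a 2 m line with an MLI-like effective
# conductivity, 300 K ambient, 90 K cryogen (roughly LOX temperature)
q = insulation_heat_leak_w(0.0002, 2.0, 0.01, 0.02, 300.0, 90.0)
```

A tool like PFSAT would sum such segment contributions and add terms for supports, penetrations, and instrumentation before sizing the TVS orifice against the total.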

  20. Value Innovation in Learner-Centered Design. How to Develop Valuable Learning Tools.

    Directory of Open Access Journals (Sweden)

    Henning Breuer

    2014-02-01

Full Text Available This paper shows how to address technological, cultural, and social transformations with empirically grounded innovation. Areas in transition, such as higher education and learning techniques, today bring about new needs and opportunities for innovative tools and services. But how do we find these tools? The paper argues for using a strategy of (user) value innovation that creatively combines ethnographic methods with strategic industry analysis. By focusing on unmet and emerging needs, ethnographic research identifies learner values, needs, and challenges but does not determine solutions. Blue-ocean strategy tools can identify new opportunities that alter existing offerings but give weak guidance on what will be most relevant to users. The triangulation of both is illustrated through an innovation project in higher education.

  1. Are consumer surveys valuable as a service improvement tool in health services? A critical appraisal.

    Science.gov (United States)

    Patwardhan, Anjali; Patwardhan, Prakash

    2009-01-01

In the recent climate of consumerism and consumer-focused care, health and social care need to be more responsive than ever before. Consumer needs and preferences can be elicited with accepted validity and reliability only by strict methodological control, customisation of the questionnaire, and skilled interpretation. Constructing, conducting, and interpreting surveys, and implementing improved service provision, require a trained workforce and infrastructure. This article aims to appraise various aspects of consumer surveys and to assess their value as effective service improvement tools. The customer is the sole reason organisations exist. Consumer surveys are used worldwide as service and quality-of-care improvement tools by all types of service providers, including health service providers. The article critically appraises the value of consumer surveys as service improvement tools in health services and their future applications. No one type of survey is the best or ideal. The key is the selection of the correct survey methodology, unique and customised for the particular type/aspect of care being evaluated. The method used should reflect the importance of the information required. Methodological rigour is essential for the effectiveness of consumer surveys as service improvement tools. Unfortunately, so far there is no universal consensus on the superiority of one particular methodology over another, or on any benefit of one specific methodology in a given situation. More training and some dedicated resource allocation are required to develop consumer surveys. More research is needed to develop specific survey methodology and evaluation techniques for improved validity and reliability of surveys as service improvement tools. Measurement of consumer preferences/priorities and evaluation of services and key performance scores is not easy. Consumer surveys seem impressive tools, as they provide the customer a voice for change or modification. However, from a scientific point...

  2. Particle induced X-ray emission: a valuable tool for the analysis of metalpoint drawings

    International Nuclear Information System (INIS)

    Duval, A.; Guicharnaud, H.; Dran, J.C.

    2004-01-01

For several years, we have carried out research on metalpoint drawings, a graphic technique mainly employed by European artists during the 15th and 16th centuries. As a non-destructive and very sensitive analytical technique is required, particle-induced X-ray emission (PIXE) analysis with an external beam has been used for this purpose. More than 70 artworks drawn by Italian, Flemish, and German artists have been analysed, including leadpoint and silverpoint drawings. Following a short description of the metalpoint technique, the results are compared with the recipes written by Cennino Cennini at the beginning of the 15th century, and specific examples are presented.

  3. English Digital Dictionaries as Valuable Blended Learning Tools for Palestinian College Students

    Science.gov (United States)

    Dwaik, Raghad A. A.

    2015-01-01

    Digital technology has become an indispensable aspect of foreign language learning around the globe especially in the case of college students who are often required to finish extensive reading assignments within a limited time period. Such pressure calls for the use of efficient tools such as digital dictionaries to help them achieve their…

  4. Spectroscopy applied to feed additives of the European Union Reference Laboratory: a valuable tool for traceability.

    Science.gov (United States)

    Omar, Jone; Slowikowski, Boleslaw; Boix, Ana; von Holst, Christoph

    2017-08-01

    Feed additives need to be authorised to be placed on the market according to Regulation (EU) No. 1831/2003. Next to laying down the procedural requirements, the regulation creates the European Union Reference Laboratory for Feed Additives (EURL-FA) and requires that applicants send samples to the EURL-FA. Once authorised, the characteristics of the marketed feed additives should correspond to those deposited in the sample bank of the EURL-FA. For this purpose, the submitted samples were subjected to near-infrared (NIR) and Raman spectroscopy for spectral characterisation. These techniques have the valuable potential of characterising the feed additives in a non-destructive manner without any complicated sample preparation. This paper describes the capability of spectroscopy for a rapid characterisation of products to establish whether specific authorisation criteria are met. This study is based on the analysis of feed additive samples from different categories and functional groups, namely products containing (1) selenium, (2) zinc and manganese, (3) vitamins and (4) essential oils such as oregano and thyme oil. The use of chemometrics turned out to be crucial, especially in cases where the differentiation of spectra by visual inspection was very difficult.

  5. Hemodynamic exercise testing. A valuable tool in the selection of cardiac transplantation candidates.

    Science.gov (United States)

    Chomsky, D B; Lang, C C; Rayos, G H; Shyr, Y; Yeoh, T K; Pierson, R N; Davis, S F; Wilson, J R

    1996-12-15

Peak exercise oxygen consumption (Vo2), a noninvasive index of peak exercise cardiac output (CO), is widely used to select candidates for heart transplantation. However, peak exercise Vo2 can be influenced by noncardiac factors such as deconditioning, motivation, or body composition and may yield misleading prognostic information. Direct measurement of the CO response to exercise may avoid this problem and more accurately predict prognosis. Hemodynamic and ventilatory responses to maximal treadmill exercise were measured in 185 ambulatory patients with chronic heart failure who had been referred for cardiac transplantation (mean left ventricular ejection fraction, 22 +/- 7%; mean peak Vo2, 12.9 +/- 3.0 mL·min-1·kg-1). The CO response to exercise was normal in 83 patients and reduced in 102. By univariate analysis, patients with normal CO responses had a better 1-year survival rate (95%) than did those with reduced CO responses (72%) (P < .0001). The survival rate of patients with peak Vo2 > 14 mL·min-1·kg-1 (88%) was not different from that of patients with peak Vo2 of 10 to 14 mL·min-1·kg-1 (89%). By Cox regression analysis, the exercise CO response was the strongest independent predictor of survival (risk ratio, 4.3), with peak Vo2 dichotomized at 10 mL·min-1·kg-1 (risk ratio, 3.3) as the only other independent predictor. Patients with reduced CO responses and peak Vo2 of < or = 10 mL·min-1·kg-1 had an extremely poor 1-year survival rate (38%). Both the CO response to exercise and peak exercise Vo2 provide valuable independent prognostic information in ambulatory patients with heart failure. These variables should be used in combination to select potential heart transplantation candidates.
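The combination the abstract recommends, exercise CO response plus peak Vo2 dichotomized at 10 mL·min-1·kg-1, amounts to a simple stratification rule. A sketch follows; the function name and stratum labels are invented for illustration, not taken from the paper:

```python
def transplant_risk_stratum(normal_co_response, peak_vo2_ml_min_kg):
    """Combine the two independent predictors reported in the abstract:
    exercise CO response and peak Vo2 dichotomized at 10 mL/min/kg.
    Stratum labels are illustrative only."""
    if normal_co_response:
        return "lower risk"            # normal CO response: 95% 1-year survival
    if peak_vo2_ml_min_kg <= 10:
        return "highest risk"          # reduced CO and Vo2 <= 10: 38% survival
    return "intermediate risk"         # reduced CO response alone: 72% survival
```

The point of the rule is the abstract's central finding: peak Vo2 alone misclassifies patients whose CO response is normal.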

  6. Are patient surveys valuable as a service-improvement tool in health services? An overview

    Directory of Open Access Journals (Sweden)

    Patwardhan A

    2012-05-01

Full Text Available Anjali Patwardhan,1 Charles H Spencer2; 1Nationwide Children’s Hospital, Columbus, OH, USA; 2Ohio State University, Columbus, OH, USA. Abstract: Improving the quality of care in international health services was made a high priority in 1977, when the World Health Assembly passed a resolution to greatly improve “Health for all” by the year 2000. Since 1977, the use of patient surveys for quality improvement has become a common practice in the health-care industry. The use of surveys reflects the concept that patient satisfaction is closely linked with organizational performance, which is in turn closely linked with organizational culture. This article is a review of the role of patient surveys as a quality-improvement tool in health care. The article explores the characteristics, types, merits, and pitfalls of various patient surveys, as well as the impact of their wide-ranging application in dissimilar scenarios to identify gaps in service provision. It is demonstrated that conducting patient surveys and using the results to improve the quality of care are two different processes. The value of patient surveys depends on the interplay between these two processes and several other factors that can influence the final outcome. The article also discusses the business aspect of patient surveys in detail. Finally, the authors make recommendations on how the patient survey tool can best be used to improve the quality of care in the health-care sector. Keywords: patient surveys, quality improvement, service gaps

  7. Umbilical artery doppler velocimetry: a valuable tool for antenatal fetal surveillance

    International Nuclear Information System (INIS)

    Khawar, N.; Umber, A.

    2013-01-01

To determine the relation of the umbilical artery Doppler velocimetry parameter systolic:diastolic ratio (S/D ratio) with fetal well-being and outcome. Setting: Department of Obstetrics and Gynecology, Lady Willingdon Hospital, Lahore. Duration of study: six months, from 27-02-2008 to 26-08-2008. Subjects and methods: Sixty patients fulfilling the inclusion criteria were included in this study. They were subdivided into two groups. Group A included 30 normal pregnant women with no medical or obstetrical risk factors, and group B included 30 pregnant women having risk factors such as hypertension, diabetes, Rhesus incompatibility, discordant twins, intrauterine growth restriction, and non-immune hydrops fetalis. Results: In the comparison of the S/D ratio with risk factors, it was observed that an S/D ratio > 3 was present in 19 patients (31.6%) with hypertension/preeclampsia, 3 patients (5%) with diabetes mellitus, 11 patients (18.3%) with intrauterine growth restriction, 15 patients (25.0%) with oligohydramnios, and only 1 patient (1.6%) with twin pregnancy. It was observed that women with an S/D ratio > 3 delivered 10 neonates (16.6%) with an Apgar score <4 at 1 minute and 23 (38.3%) with a score <6 at 5 minutes; 23 neonates (38.3%) needed resuscitation, and 21 (35.0%) were admitted to the neonatal unit for asphyxia. Conclusion: Umbilical artery Doppler studies are an integral tool in evaluating the health of high-risk pregnancies. However, they are not appropriate as a screening tool for low-risk pregnancies. (author)

  8. Substrate topography: A valuable in vitro tool, but a clinical red herring for in vivo tenogenesis.

    Science.gov (United States)

    English, Andrew; Azeem, Ayesha; Spanoudes, Kyriakos; Jones, Eleanor; Tripathi, Bhawana; Basu, Nandita; McNamara, Karrina; Tofail, Syed A M; Rooney, Niall; Riley, Graham; O'Riordan, Alan; Cross, Graham; Hutmacher, Dietmar; Biggs, Manus; Pandit, Abhay; Zeugolis, Dimitrios I

    2015-11-01

    Controlling the cell-substrate interactions at the bio-interface is becoming an inherent element in the design of implantable devices. Modulation of cellular adhesion in vitro, through topographical cues, is a well-documented process that offers control over subsequent cellular functions. However, it is still unclear whether surface topography can be translated into a clinically functional response in vivo at the tissue/device interface. Herein, we demonstrated that anisotropic substrates with a groove depth of ∼317nm and ∼1988nm promoted human tenocyte alignment parallel to the underlying topography in vitro. However, the rigid poly(lactic-co-glycolic acid) substrates used in this study upregulated the expression of chondrogenic and osteogenic genes, indicating possible tenocyte trans-differentiation. Of significant importance is that none of the topographies assessed (∼37nm, ∼317nm and ∼1988nm groove depth) induced extracellular matrix orientation parallel to the substrate orientation in a rat patellar tendon model. These data indicate that two-dimensional imprinting technologies are useful tools for in vitro cell phenotype maintenance, rather than for organised neotissue formation in vivo, should multifactorial approaches that consider both surface topography and substrate rigidity be established. Herein, we ventured to assess the influence of parallel grooves, ranging from nano- to micro-level, on tenocyte response in vitro and on host response using a tendon and a subcutaneous model. In vitro analysis indicates that anisotropically ordered micro-scale grooves, as opposed to nano-scale grooves, maintain physiological cell morphology. The rather rigid PLGA substrates appeared to induce trans-differentiation towards a chondrogenic and/or osteogenic lineage, as evidenced by TILDA gene analysis. In vivo data in both tendon and subcutaneous models indicate that none of the substrates induced bidirectional host cell and tissue growth. Collectively, these

  9. Clinical audit, a valuable tool to improve quality of care: General methodology and applications in nephrology

    Science.gov (United States)

    Esposito, Pasquale; Dal Canton, Antonio

    2014-01-01

    Evaluation and improvement of the quality of care provided to patients are of crucial importance in daily clinical practice and in health policy planning and financing. Different tools have been developed, including incident analysis, health technology assessment and clinical audit. Clinical audit consists of measuring a clinical outcome or a process against well-defined standards set on the principles of evidence-based medicine in order to identify the changes needed to improve the quality of care. In particular, patients suffering from chronic renal diseases present many problems that have been set as topics for clinical audit projects, such as hypertension, anaemia and mineral metabolism management. Although the results of these studies have been encouraging, demonstrating the effectiveness of audit, overall the present evidence is not clearly in favour of clinical audit. These findings call attention to the need for further studies to validate this methodology in different operating scenarios. This review examines the principles of clinical audit, focusing on experiences performed in nephrology settings. PMID:25374819

  10. PRIDE and "Database on Demand" as valuable tools for computational proteomics.

    Science.gov (United States)

    Vizcaíno, Juan Antonio; Reisinger, Florian; Côté, Richard; Martens, Lennart

    2011-01-01

    The Proteomics Identifications Database (PRIDE, http://www.ebi.ac.uk/pride ) provides users with the ability to explore and compare mass spectrometry-based proteomics experiments that reveal details of the protein expression found in a broad range of taxonomic groups, tissues, and disease states. A PRIDE experiment typically includes identifications of proteins, peptides, and protein modifications. Additionally, many of the submitted experiments also include the mass spectra that provide the evidence for these identifications. Finally, one of the strongest advantages of PRIDE in comparison with other proteomics repositories is the amount of metadata it contains, a key point to put the above-mentioned data in biological and/or technical context. Several informatics tools have been developed in support of the PRIDE database. The most recent one is called "Database on Demand" (DoD), which allows custom sequence databases to be built in order to optimize the results from search engines. We describe the use of DoD in this chapter. Additionally, in order to show the potential of PRIDE as a source for data mining, we also explore complex queries using federated BioMart queries to integrate PRIDE data with other resources, such as Ensembl, Reactome, or UniProt.

  11. Per-operative vibration analysis: a valuable tool for defining correct stem insertion: preliminary report.

    Science.gov (United States)

    Mulier, Michiel; Pastrav, Cesar; Van der Perre, Georges

    2008-01-01

    Defining the stem insertion end point during total hip replacement still relies on the surgeon's feeling. When a custom-made stem prosthesis with an optimal fit into the femoral canal is used, the risk of per-operative fractures is even greater than with standard prostheses. Vibration analysis is used in other clinical settings and has been tested as a means to detect optimal stem insertion in the laboratory. The first per-operative use of vibration analysis during non-cemented custom-made stem insertion in 30 patients is reported here. Thirty patients eligible for total hip replacement with an uncemented stem prosthesis were included. The neck of the stem was connected with a shaker that emitted white noise as an excitation signal and an impedance head that measured the frequency response. The response signal was sent to a computer that analyzed the frequency response function after each insertion phase. A technician present in the operating theatre but outside the laminar airflow provided feedback to the surgeon. The correlation index between the frequency response functions measured during the last two insertion hammering sessions was >0.99 in 86.7% of the cases. In four cases the surgeon stopped the insertion procedure because of a perceived risk of fracture. Two special cases illustrating the potential benefit of per-operative vibration analysis are described. The results of intra-operative vibration analysis indicate that this technique may be a useful tool assisting the orthopaedic surgeon in defining the insertion endpoint of the stem. The development of a more user-friendly device is therefore warranted.
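The endpoint criterion described above, a correlation index above 0.99 between the frequency response functions (FRFs) of the last two hammering sessions, can be sketched as follows. This is a minimal illustration, not the authors' implementation; function names and the sampled-FRF representation are assumptions:

```python
def pearson_correlation(frf_a, frf_b):
    """Pearson correlation between two sampled frequency response functions."""
    n = len(frf_a)
    mean_a = sum(frf_a) / n
    mean_b = sum(frf_b) / n
    cov = sum((a - mean_a) * (b - mean_b) for a, b in zip(frf_a, frf_b))
    var_a = sum((a - mean_a) ** 2 for a in frf_a)
    var_b = sum((b - mean_b) ** 2 for b in frf_b)
    return cov / (var_a * var_b) ** 0.5

def insertion_complete(previous_frf, current_frf, threshold=0.99):
    """Endpoint criterion: the FRF no longer changes between hammering sessions."""
    return pearson_correlation(previous_frf, current_frf) > threshold
```

Once two consecutive sessions yield nearly identical FRFs, further hammering no longer improves the fit and the surgeon can stop.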

  12. Evaluation of Linked Data tools for Learning Analytics

    NARCIS (Netherlands)

    Drachsler, Hendrik; Herder, Eelco; d'Aquin, Mathieu; Dietze, Stefan

    2013-01-01

    Drachsler, H., Herder, E., d'Aquin, M., & Dietze, S. (2013, 8-12 April). Evaluation of Linked Data tools for Learning Analytics. Presentation given in the tutorial on 'Using Linked Data for Learning Analytics' at LAK2013, the Third Conference on Learning Analytics and Knowledge, Leuven, Belgium.

  13. PIXE as an analytical/educational tool

    International Nuclear Information System (INIS)

    Williams, E.T.

    1991-01-01

    The advantages and disadvantages of PIXE as an analytical method will be summarized. The authors will also discuss the advantages of PIXE as a means of providing interesting and feasible research projects for undergraduate science students.

  14. Method for effective usage of Google Analytics tools

    Directory of Open Access Journals (Sweden)

    Ирина Николаевна Егорова

    2016-01-01

    Full Text Available Modern Google Analytics tools have been investigated with respect to effective channels for attracting users and to the detection of bottlenecks. The investigation allowed us to suggest a modern method for the effective use of Google Analytics tools. The method is based on the analysis of the main traffic indicators, as well as on deep analysis of goals and their consecutive tweaking. The method makes it possible to increase website conversion and might be useful for SEO and web analytics specialists

  15. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    Science.gov (United States)

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as the assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, which were described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first one with metrological variables and the second one with "green" variables as input data. Both HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool to choose the proper analytical procedure concerning green analytical chemistry principles and analytical performance merits.
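The core of the HDT ranking is a partial order: one procedure dominates another only if it is at least as good on every variable and strictly better on at least one. A simplified sketch, not the software used in the study; the procedure names and scores are invented, and all variables are assumed to be oriented so that higher is better:

```python
def dominates(a, b):
    """a dominates b if a >= b on every criterion and a > b on at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def hasse_maximal(procedures):
    """Return the procedures not dominated by any other (top of the Hasse diagram)."""
    return {name for name, scores in procedures.items()
            if not any(dominates(other, scores)
                       for o, other in procedures.items() if o != name)}
```

Procedures that are incomparable (each better on a different variable) both stay in the maximal set, which is why the metrological and "green" orderings can disagree.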

  16. Analytical Tools for Space Suit Design

    Science.gov (United States)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach with emphasis on building prototypes then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  17. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    Science.gov (United States)

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. Many tools, however, require researchers to develop programs in programming languages like Python or R, a skill set not grasped by many researchers in the healthcare data analytics area. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines to user-friendly analytics APPs with rich interactions and features of real-time analysis. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebook to perform analysis tasks or reproduce research results much more easily.

  18. Artist - analytical RT inspection simulation tool

    International Nuclear Information System (INIS)

    Bellon, C.; Jaenisch, G.R.

    2007-01-01

    The computer simulation of radiography is applicable for different purposes in NDT such as for the qualification of NDT systems, the prediction of its reliability, the optimization of system parameters, feasibility analysis, model-based data interpretation, education and training of NDT/NDE personnel, and others. Within the framework of the integrated project FilmFree the radiographic testing (RT) simulation software developed by BAM is being further developed to meet practical requirements for inspection planning in digital industrial radiology. It combines analytical modelling of the RT inspection process with the CAD-orientated object description applicable to various industrial sectors such as power generation, railways and others. (authors)

  19. Electronic tongue: An analytical gustatory tool

    Directory of Open Access Journals (Sweden)

    Rewanthwar Swathi Latha

    2012-01-01

    Full Text Available Taste is an important organoleptic property governing acceptance of products for administration through mouth. But the majority of drugs available are bitter in taste. For patient acceptability and compliance, bitter-tasting drugs are masked by adding several flavoring agents. Thus, taste assessment is one important quality control parameter for evaluating taste-masked formulations. The primary method for the taste measurement of drug substances and formulations is by human panelists. The use of sensory panelists in industry is very difficult and problematic owing to the potential toxicity of drugs and the subjectivity of taste panelists; recruiting panelists and maintaining their motivation are also significantly more difficult when working with unpleasant products. Furthermore, Food and Drug Administration (FDA-unapproved molecules cannot be tested. Therefore, an analytical taste-sensing multichannel sensory system, called the electronic tongue (e-tongue or artificial tongue, which can assess taste, has been replacing sensory panelists. Thus, the e-tongue offers benefits such as reduced reliance on human panels. The present review focuses on the electrochemical concepts in instrumentation, performance qualification of the e-tongue, and applications in various fields.

  20. Analytical and numerical tools for vacuum systems

    CERN Document Server

    Kersevan, R

    2007-01-01

    Modern particle accelerators have reached a level of sophistication which requires a thorough analysis of all their sub-systems. Among the latter, the vacuum system is often a major contributor to the operating performance of a particle accelerator. The vacuum engineer has nowadays a large choice of computational schemes and tools for the correct analysis, design, and engineering of the vacuum system. This paper is a review of the different types of algorithms and methodologies which have been developed and employed in the field since the birth of vacuum technology. The different levels of detail between simple back-of-the-envelope calculations and more complex numerical analysis are discussed by means of comparisons. The domain of applicability of each method is discussed, together with its pros and cons.

  1. Guidance for the Design and Adoption of Analytic Tools.

    Energy Technology Data Exchange (ETDEWEB)

    Bandlow, Alisa [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.

  2. Agrobacterium rhizogenes-mediated transformation of Superroot-derived Lotus corniculatus plants: a valuable tool for functional genomics

    Directory of Open Access Journals (Sweden)

    Liu Wei

    2009-06-01

    reach 92% based on GUS detection. The combination of the highly efficient transformation and the regeneration system of Superroot provides a valuable tool for functional genomics studies in L. corniculatus.

  3. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    Science.gov (United States)

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams can be used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, mainly from green through yellow to red depicting low, medium to high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and polycyclic aromatic hydrocarbon determination by EPA methods. GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.
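The GAPI idea of grading every stage of a procedure from green (low impact) through yellow to red (high impact) can be sketched as a small scoring helper. This is an illustration only: the stage names and the numeric mapping are assumptions, not part of the published tool.

```python
# Illustrative only: the colour-to-number mapping is an assumption, not GAPI's.
IMPACT = {"green": 0, "yellow": 1, "red": 2}

def gapi_profile(stages):
    """Summarise a procedure given {stage: colour}; lower total = greener.

    Returns the total impact score and the stage with the worst grading.
    """
    total = sum(IMPACT[colour] for colour in stages.values())
    worst = max(stages, key=lambda s: IMPACT[stages[s]])
    return total, worst
```

Applied across a whole methodology, from sample collection to final determination, such a profile makes the dominant environmental cost of a procedure immediately visible.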

  4. Combined measurement of plasma cystatin C and low-density lipoprotein cholesterol: A valuable tool for evaluating progressive supranuclear palsy.

    Science.gov (United States)

    Weng, Ruihui; Wei, Xiaobo; Yu, Bin; Zhu, Shuzhen; Yang, Xiaohua; Xie, Fen; Zhang, Mahui; Jiang, Ying; Feng, Zhong-Ping; Sun, Hong-Shuo; Xia, Ying; Jin, Kunlin; Chan, Piu; Wang, Qing; Gao, Xiaoya

    2018-07-01

    Progressive supranuclear palsy (PSP) was previously thought of as a cause of atypical Parkinsonism. Although Cystatin C (Cys C) and low-density lipoprotein cholesterol (LDL-C) are known to play critical roles in Parkinsonism, it is unknown whether they can be used as markers to distinguish PSP patients from healthy subjects and to determine disease severity. We conducted a cross-sectional study to determine plasma Cys C/HDL/LDL-C levels of 40 patients with PSP and 40 healthy age-matched controls. An extended battery of motor and neuropsychological tests, including the PSP-Rating Scale (PSPRS), the Non-Motor Symptoms Scale (NMSS), Geriatric Depression Scale (GDS) and Mini-Mental State Examination (MMSE), was used to evaluate the disease severity. Receiver operating characteristic (ROC) curves were adopted to assess the prognostic accuracy of Cys C/LDL-C levels in distinguishing PSP from healthy subjects. Patients with PSP exhibited significantly higher plasma levels of Cys C and lower LDL-C. The levels of plasma Cys C were positively and inversely correlated with the PSPRS/NMSS and MMSE scores, respectively. The LDL-C/HDL-C ratio was positively associated with PSPRS/NMSS and GDS scores. The ROC curve for the combination of Cys C and LDL-C yielded a better accuracy for distinguishing PSP from healthy subjects than the separate curves for each parameter. Plasma Cys C and LDL-C may be valuable screening tools for differentiating PSP from healthy subjects, and they could be useful for evaluating PSP intensity and severity. A better understanding of Cys C and LDL-C may yield insights into the pathogenesis of PSP. Copyright © 2018 Elsevier Ltd. All rights reserved.
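The ROC comparison reported above can be reproduced in outline with a stdlib-only AUC based on the Mann-Whitney formulation: the AUC equals the probability that a randomly chosen patient scores higher than a randomly chosen control, with ties counted as one half. The scores below are invented, not the study's data:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney U statistic: U / (n_pos * n_neg)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5  # ties contribute half
    return wins / (len(scores_pos) * len(scores_neg))
```

A combined marker (for instance a weighted sum of Cys C and LDL-C, or a logistic-regression score) is "better" in the study's sense when its AUC exceeds that of either marker alone.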

  5. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided in two sections. First, a qualitative focus is placed on reviewing web analytics tools to explore their functionalities and ability to be integrated into the respective business model. Web analytics tools support the business analyst’s efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section is a quantitative focus shifting from theory to an empirical approach, which subsequently presents output data resulting from a study based on perceived user satisfaction of web analytics tools. The empirical study was carried out on employees from 200 Croatian firms, from either the IT or marketing branch. The paper contributes to highlighting the support for management that web analytics and web metrics tools available on the market have to offer, based on the growing needs of understanding and predicting global market trends.

  6. Analytics for smart energy management tools and applications for sustainable manufacturing

    CERN Document Server

    Oh, Seog-Chan

    2016-01-01

    This book introduces the issues and problems that arise when implementing smart energy management for sustainable manufacturing in the automotive manufacturing industry and the analytical tools and applications to deal with them. It uses a number of illustrative examples to explain energy management in automotive manufacturing, which involves most types of manufacturing technology and various levels of energy consumption. It demonstrates how analytical tools can help improve energy management processes, including forecasting, consumption, and performance analysis, emerging new technology identification as well as investment decisions for establishing smart energy consumption practices. It also details practical energy management systems, making it a valuable resource for professionals involved in real energy management processes, and allowing readers to implement the procedures and applications presented.

  7. Analytical framework and tool kit for SEA follow-up

    International Nuclear Information System (INIS)

    Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran; Jonsson, Daniel K.; Lundberg, Kristina; Tyskeng, Sara; Wallgren, Oskar

    2009-01-01

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate

  8. Big Data Analytics Tools as Applied to ATLAS Event Data

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration

    2016-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data and database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of big data, statistical and machine learning tools...

  9. Analytical Modelling Of Milling For Tool Design And Selection

    International Nuclear Information System (INIS)

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-01-01

    This paper presents an efficient analytical model which allows the simulation of a large panel of milling operations. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the considered type of mill. The cutting edge position is described for a constant lead helix and for a constant local helix angle. A thermomechanical approach of oblique cutting is applied to predict forces acting on the tool and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of some of the tool's geometrical parameters on predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools

  10. Applications of Spatial Data Using Business Analytics Tools

    Directory of Open Access Journals (Sweden)

    Anca Ioana ANDREESCU

    2011-12-01

    Full Text Available This paper addresses the possibilities of using spatial data in business analytics tools, with emphasis on SAS software. Various kinds of map data sets containing spatial data are presented and discussed. Examples of map charts illustrating macroeconomic parameters demonstrate the application of spatial data for the creation of map charts in SAS Enterprise Guide. Extended features of map charts are exemplified by producing charts via SAS programming procedures.

  11. Micromechanical String Resonators: Analytical Tool for Thermal Characterization of Polymers

    DEFF Research Database (Denmark)

    Bose, Sanjukta; Schmid, Silvan; Larsen, Tom

    2014-01-01

    Resonant microstrings show promise as a new analytical tool for thermal characterization of polymers with only few nanograms of sample. The detection of the glass transition temperature (Tg) of an amorphous poly(d,l-lactide) (PDLLA) and a semicrystalline poly(l-lactide) (PLLA) is investigated. The polymers are spray coated on one side of the resonating microstrings. The resonance frequency and quality factor (Q) are measured simultaneously as a function of temperature. Change in the resonance frequency reflects a change in static tensile stress, which yields information about the Young’s modulus of the polymer, and a change in Q reflects the change in damping of the polymer-coated string. The frequency response of the microstring is validated with an analytical model. From the frequency independent tensile stress change, static Tg values of 40.6 and 57.6 °C were measured for PDLLA and PLLA, respectively.
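The link between resonance frequency and static tensile stress exploited here is the textbook taut-string relation f1 = (1/(2L))·sqrt(sigma/rho). A sketch of the inversion; the dimensions and density in the usage note are placeholders, not the devices used in the study:

```python
def stress_from_frequency(f1_hz, length_m, density_kg_m3):
    """Invert f1 = (1/(2L)) * sqrt(sigma/rho) for the tensile stress sigma (Pa)."""
    return density_kg_m3 * (2.0 * length_m * f1_hz) ** 2
```

For example, a 500 µm long string of density 3000 kg/m³ resonating at 200 kHz carries a stress of 120 MPa; tracking this stress against temperature is what reveals the kink at Tg.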

  12. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    Science.gov (United States)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerged in Earth science. Existing data-oriented workflow tools, however, are not suitable to support collaborative design of such a workflow: to name a few gaps, they do not support real-time co-design; track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; or capture and retrieve collaboration knowledge on workflow design (discussions that lead to a design). To address the aforementioned challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool, supporting an elastic number of groups of Earth scientists to collaboratively design and compose data analytics workflows through the Internet. Instead of recreating the wheel, we have extended an existing workflow tool VisTrails into an online collaborative environment as a proof of concept.

  13. Analytical Aerodynamic Simulation Tools for Vertical Axis Wind Turbines

    International Nuclear Information System (INIS)

    Deglaire, Paul

    2010-01-01

    Wind power is a renewable energy source that is today the fastest growing solution to reduce CO2 emissions in the electric energy mix. Upwind horizontal axis wind turbine with three blades has been the preferred technical choice for more than two decades. This horizontal axis concept is today widely leading the market. The current PhD thesis will cover an alternative type of wind turbine with straight blades and rotating along the vertical axis. A brief overview of the main differences between the horizontal and vertical axis concept has been made. However the main focus of this thesis is the aerodynamics of the wind turbine blades. Making aerodynamically efficient turbines starts with efficient blades. Making efficient blades requires a good understanding of the physical phenomena and effective simulations tools to model them. The specific aerodynamics for straight bladed vertical axis turbine flow are reviewed together with the standard aerodynamic simulations tools that have been used in the past by blade and rotor designer. A reasonably fast (regarding computer power) and accurate (regarding comparison with experimental results) simulation method was still lacking in the field prior to the current work. This thesis aims at designing such a method. Analytical methods can be used to model complex flow if the geometry is simple. Therefore, a conformal mapping method is derived to transform any set of section into a set of standard circles. Then analytical procedures are generalized to simulate moving multibody sections in the complex vertical flows and forces experienced by the blades. Finally the fast semi analytical aerodynamic algorithm boosted by fast multipole methods to handle high number of vortices is coupled with a simple structural model of the rotor to investigate potential aeroelastic instabilities.
Together with these advanced simulation tools, a standard double multiple streamtube model has been developed and used to design several straight bladed
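
    The conformal-mapping idea at the heart of this approach can be pictured with the classic Joukowski transform, which maps a circle into an airfoil-like section. This is only a minimal sketch of the principle, not the generalized multibody mapping derived in the thesis; the circle offset and chord constant below are illustrative.

```python
import cmath

def joukowski(zeta, c=1.0):
    """Joukowski map: sends a circle in the zeta-plane to an airfoil-like section."""
    return zeta + c * c / zeta

# A circle slightly offset from the origin, passing through zeta = 1, maps to a
# shape with a rounded nose and a sharp trailing edge (the classic construction).
center = complex(-0.08, 0.08)
radius = abs(center - 1.0)

airfoil = [
    joukowski(center + radius * cmath.exp(1j * 2 * cmath.pi * k / 200))
    for k in range(200)
]

# The image of zeta = 1 (the trailing edge) is z = 2 for c = 1.
print(joukowski(complex(1, 0)))  # prints (2+0j)
```

    Because potential flow around a circle has a known analytical solution, solving in the circle plane and mapping back gives the flow around the blade section without a volume mesh.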

  14. Spectroelectrochemistry: A valuable tool for the study of organometallic-alkyne, -vinylidene, -cumulene, -alkynyl and related complexes

    International Nuclear Information System (INIS)

    Low, Paul J.; Bock, Sören

    2013-01-01

    This review presents a highly selective summary of spectroelectrochemical methods used in the study of metal alkyne, acetylide, vinylidene and allenylidene complexes. The review is illustrated predominantly by selected examples from the authors’ group that formed the basis of a lecture at the recent ISE Annual Meeting. Emphasis is placed on the use of spectroelectrochemical methods to study redox-induced ligand isomerisation reactions and to determine molecular electronic structure, complementing the conventional tools available to the synthetic chemist for the characterisation of molecular compounds. The role of computational studies in supporting the interpretation of spectroscopic data is also briefly discussed.

  15. Primary culture of glial cells from mouse sympathetic cervical ganglion: a valuable tool for studying glial cell biology.

    Science.gov (United States)

    de Almeida-Leite, Camila Megale; Arantes, Rosa Maria Esteves

    2010-12-15

    Central nervous system glial cells, such as astrocytes and microglia, have been investigated in vitro and many intracellular pathways have been clarified upon various stimuli. Peripheral glial cells, however, are not as deeply investigated in vitro despite their important role in inflammatory and neurodegenerative diseases. Based on our previous experience of culturing neuronal cells, our objective was to standardize and morphologically characterize a primary culture of mouse superior cervical ganglion glial cells in order to obtain a useful tool for studying peripheral glial cell biology. Superior cervical ganglia from neonatal C57BL6 mice were enzymatically and mechanically dissociated and cells were plated on diluted Matrigel-coated wells at a final concentration of 10,000 cells/well. Five to eight days post-plating, glial cell cultures were fixed for morphological and immunocytochemical characterization. Glial cells showed a flat and irregular shape, two or three long cytoplasmic processes, and round, oval or elongated nuclei with regular outlines. Cell proliferation and mitosis were detected both qualitatively and quantitatively. Glial cells were able to maintain their phenotype in our culture model, including immunoreactivity against the glial cell marker GFAP. This is the first description of the immunocytochemical characterization of mouse sympathetic cervical ganglion glial cells in primary culture. This work discusses the uses and limitations of our model as a tool to study many aspects of peripheral glial cell biology. Copyright © 2010 Elsevier B.V. All rights reserved.

  16. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    Directory of Open Access Journals (Sweden)

    Helen V. Hsieh

    2017-05-01

    Full Text Available Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy-to-use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor’s office in Manhattan to a rural medical clinic in low-resource settings. The simplicity of the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we review conventional optimization and then focus on the design of delivery geometry, outlining analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness.
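
    As a flavor of the analytical tools mentioned, dynamic light scattering yields a diffusion coefficient from which a conjugate particle's hydrodynamic radius follows via the Stokes-Einstein relation, a quick check for aggregation. The sketch below uses illustrative values (water at 25 °C, an invented diffusion coefficient), not data from the paper.

```python
import math

def hydrodynamic_radius(diffusion_m2_s, temp_k=298.15, viscosity_pa_s=8.9e-4):
    """Stokes-Einstein radius from a DLS-measured diffusion coefficient.

    R_h = k_B * T / (6 * pi * eta * D); defaults approximate water at 25 C.
    """
    k_b = 1.380649e-23  # Boltzmann constant, J/K
    return k_b * temp_k / (6 * math.pi * viscosity_pa_s * diffusion_m2_s)

# Hypothetical diffusion coefficient for a protein-coated gold nanoparticle.
d = 1.1e-11  # m^2/s
print(f"R_h = {hydrodynamic_radius(d) * 1e9:.1f} nm")  # prints R_h = 22.3 nm
```

    An unexpectedly large radius after conjugation would flag antibody-gold aggregation before any strip is run.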

  17. 33 CFR 385.33 - Revisions to models and analytical tools.

    Science.gov (United States)

    2010-07-01

    ... on a case-by-case basis what documentation is appropriate for revisions to models and analytic tools... analytical tools. 385.33 Section 385.33 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying...

  18. MIDAS (Migraine Disability Assessment): a valuable tool for work-site identification of migraine in workers in Brazil

    Directory of Open Access Journals (Sweden)

    Yara Dadalti Fragoso

    Full Text Available CONTEXT: MIDAS was developed as a fast and efficient method for identifying migraine sufferers in need of medical evaluation and treatment. It was necessary to translate MIDAS, originally written in English, so as to apply it in Brazil and make it usable by individuals from a variety of socioeconomic and cultural backgrounds. OBJECTIVE: To translate and apply MIDAS in Brazil. DESIGN: Assessment of a sample of workers regularly employed by an oil refinery. SETTING: Refinaria Presidente Bernardes, Cubatão, São Paulo, Brazil. PARTICIPANTS: 404 workers of the company who correctly answered a questionnaire for the identification and evaluation of headache. When individuals considered it pertinent to their own needs, they had the option of answering MIDAS as well. METHODS: MIDAS, originally written in English, was translated into Brazilian Portuguese by a neurologist and by a translator specializing in medical texts. The final version of the translation was obtained when, for ten patients to whom it was applied, the text seemed clear and the results were consistent over three sessions. MAIN MEASUREMENTS: Prevalence and types of primary headaches, and evaluation of MIDAS as a tool for identifying more severe cases. RESULTS: Of the 419 questionnaires given to employees, 404 were returned correctly completed. From these, 160 persons were identified as presenting headaches, 44 of whom considered it worthwhile to answer MIDAS. Nine of the individuals who answered MIDAS were identified as severe cases of migraine owing to the disability caused by the condition. An interview on a later date confirmed these results. Three were cases of chronic daily headache (transformed migraine) and six were cases of migraine. CONCLUSIONS: MIDAS translated into Brazilian Portuguese was a useful tool for identifying severe cases of migraine and of transformed migraine in a working environment. The workers did not consider MIDAS difficult to answer. 
Their

  19. λ5-Phosphorus-Containing α-Diazo Compounds: A Valuable Tool for Accessing Phosphorus-Functionalized Molecules.

    Science.gov (United States)

    Marinozzi, Maura; Pertusati, Fabrizio; Serpi, Michaela

    2016-11-23

    The compounds characterized by the presence of a λ5-phosphorus functionality at the α-position with respect to the diazo moiety, here referred to as λ5-phosphorus-containing α-diazo compounds (PCDCs), represent a vast class of extremely versatile reagents in organic chemistry and are particularly useful in the preparation of phosphonate- and phosphine oxide-functionalized molecules. Indeed, thanks to the high reactivity of the diazo moiety, PCDCs can be induced to undergo a wide variety of chemical transformations, among them carbon-hydrogen as well as heteroatom-hydrogen insertion reactions, cyclopropanation, ylide formation, Wolff rearrangement, and cycloaddition reactions. PCDCs can be easily prepared from readily accessible precursors by a variety of methods, such as diazotization, Bamford-Stevens-type elimination, and diazo transfer reactions. These features, along with their relative stability and ease of handling, make them appealing tools in organic synthesis. This Review aims to demonstrate the ongoing utility of PCDCs in the modern preparation of different classes of phosphorus-containing compounds, phosphonates in particular. Furthermore, to address the lack of previous collective papers, this Review also summarizes the methods for PCDC preparation.

  20. Transformed hairy roots of Discaria trinervis: a valuable tool for studying actinorhizal symbiosis in the context of intercellular infection.

    Science.gov (United States)

    Imanishi, Leandro; Vayssières, Alice; Franche, Claudine; Bogusz, Didier; Wall, Luis; Svistoonoff, Sergio

    2011-11-01

    Among infection mechanisms leading to root nodule symbiosis, the intercellular infection pathway is probably the most ancestral but also one of the least characterized. Intercellular infection has been described in Discaria trinervis, an actinorhizal plant belonging to the Rosales order. To decipher the molecular mechanisms underlying intercellular infection with Frankia bacteria, we set up an efficient genetic transformation protocol for D. trinervis based on Agrobacterium rhizogenes. We showed that composite plants with transgenic roots expressing green fluorescent protein can be specifically and efficiently nodulated by Frankia strain BCU110501. Nitrogen fixation rates and feedback inhibition of nodule formation by nitrogen were similar in control and composite plants. In order to challenge the transformation system, the MtEnod11 promoter, a gene from Medicago truncatula widely used as a marker for early infection-related symbiotic events in model legumes, was introduced in D. trinervis. MtEnod11::GUS expression was related to infection zones in root cortex and in the parenchyma of the developing nodule. The ability to study intercellular infection with molecular tools opens new avenues for understanding the evolution of the infection process in nitrogen-fixing root nodule symbioses.

  1. Quantitative PCR is a Valuable Tool to Monitor the Performance of DNA-Encoded Chemical Library Selections.

    Science.gov (United States)

    Li, Yizhou; Zimmermann, Gunther; Scheuermann, Jörg; Neri, Dario

    2017-05-04

    Phage-display libraries and DNA-encoded chemical libraries (DECLs) represent useful tools for the isolation of specific binding molecules from large combinatorial sets of compounds. With both methods, specific binders are recovered at the end of affinity capture procedures by using target proteins of interest immobilized on a solid support. However, although the efficiency of phage-display selections is routinely quantified by counting the phage titer before and after the affinity capture step, no similar quantification procedures have been reported for the characterization of DECL selections. In this article, we describe the potential and limitations of quantitative PCR (qPCR) methods for the evaluation of selection efficiency by using a combinatorial chemical library with more than 35 million compounds. In the experimental conditions chosen for the selections, a quantification of DNA input/recovery over five orders of magnitude could be performed, revealing a successful enrichment of abundant binders, which could be confirmed by DNA sequencing. qPCR provided rapid information about the performance of selections, thus facilitating the optimization of experimental conditions. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
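
    The principle behind using qPCR to quantify DNA input and recovery can be sketched with the standard assumption of exponential amplification; the Ct values and efficiency below are hypothetical, not from the study.

```python
def fold_recovery(ct_input, ct_output, efficiency=2.0):
    """Estimate the fraction of DNA recovered in an affinity-capture step.

    Assumes ideal exponential amplification (template doubles each cycle),
    so the amount of template scales as efficiency ** -Ct.
    """
    return efficiency ** (ct_input - ct_output)

# Hypothetical Ct values measured before and after a selection round.
ct_library_input = 12.0   # pre-selection library
ct_after_capture = 22.0   # material eluted from the immobilized target

recovery = fold_recovery(ct_library_input, ct_after_capture)
print(f"recovery = {recovery:.0e}")  # prints recovery = 1e-03
```

    Tracking this single number across selection conditions is what lets the capture step be optimized before committing to sequencing.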

  2. The Rg1 allele as a valuable tool for genetic transformation of the tomato 'Micro-Tom' model system

    Directory of Open Access Journals (Sweden)

    Quecini Vera

    2010-10-01

    Full Text Available Abstract Background The cultivar Micro-Tom (MT) is regarded as a model system for tomato genetics due to its short life cycle and miniature size. However, efforts to improve tomato genetic transformation have led to protocols dependent on the costly hormone zeatin, combined with an excessive number of steps. Results Here we report the development of a MT near-isogenic genotype harboring the allele Rg1 (MT-Rg1), which greatly improves tomato in vitro regeneration. Regeneration was further improved in MT by including a two-day incubation of cotyledonary explants on medium containing 0.4 μM 1-naphthaleneacetic acid (NAA) before cytokinin treatment. Both strategies allowed the use of 5 μM 6-benzylaminopurine (BAP), a cytokinin 100 times less expensive than zeatin. The use of MT-Rg1 and NAA pre-incubation, followed by BAP regeneration, resulted in high transformation frequencies (near 40%) in a shorter protocol with fewer steps, spanning approximately 40 days from Agrobacterium infection to transgenic plant acclimatization. Conclusions The genetic resource and the protocol presented here represent invaluable tools for routine gene expression manipulation and high-throughput functional genomics by insertional mutagenesis in tomato.

  3. Estimation of the solubility parameters of model plant surfaces and agrochemicals: a valuable tool for understanding plant surface interactions.

    Science.gov (United States)

    Khayet, Mohamed; Fernández, Victoria

    2012-11-14

    Most aerial plant parts are covered with a hydrophobic lipid-rich cuticle, which is the interface between the plant organs and the surrounding environment. Plant surfaces may have a high degree of hydrophobicity because of the combined effects of surface chemistry and roughness. The physical and chemical complexity of the plant cuticle limits the development of models that explain its internal structure and interactions with surface-applied agrochemicals. In this article we introduce a thermodynamic method for estimating the solubilities of model plant surface constituents and relating them to the effects of agrochemicals. Following the van Krevelen and Hoftyzer method, we calculated the solubility parameters of three model plant species and eight compounds that differ in hydrophobicity and polarity. In addition, intact tissues were examined by scanning electron microscopy and the surface free energy, polarity, solubility parameter and work of adhesion of each were calculated from contact angle measurements of three liquids with different polarities. By comparing the affinities between plant surface constituents and agrochemicals derived from (a) theoretical calculations and (b) contact angle measurements we were able to distinguish the physical effect of surface roughness from the effect of the chemical nature of the epicuticular waxes. A solubility parameter model for plant surfaces is proposed on the basis of an increasing gradient from the cuticular surface towards the underlying cell wall. The procedure enabled us to predict the interactions among agrochemicals, plant surfaces, and cuticular and cell wall components, and promises to be a useful tool for improving our understanding of biological surface interactions.
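
    The affinity estimates described can be pictured with the Hansen solubility parameter formalism, where the distance between two sets of parameters indicates (in)compatibility. All parameter values below are illustrative placeholders, not numbers taken from the article.

```python
import math

def hansen_distance(a, b):
    """Hansen solubility parameter distance Ra (MPa^0.5).

    a, b: (delta_d, delta_p, delta_h) tuples; the factor 4 on the
    dispersion term is the standard Hansen convention. Smaller Ra
    suggests higher mutual affinity.
    """
    return math.sqrt(
        4 * (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2 + (a[2] - b[2]) ** 2
    )

# Illustrative parameters in MPa^0.5 -- NOT the paper's values.
wax_like_surface = (16.5, 2.0, 4.0)      # hydrophobic cuticular-wax model
polar_agrochemical = (17.0, 10.0, 12.0)
nonpolar_agrochemical = (16.0, 1.5, 3.0)

print(hansen_distance(wax_like_surface, polar_agrochemical))     # larger -> lower affinity
print(hansen_distance(wax_like_surface, nonpolar_agrochemical))  # prints 1.5 -> higher affinity
```

    Comparing such distances for a series of agrochemicals against a modeled wax surface mirrors the kind of affinity ranking the authors derive from group-contribution calculations.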

  4. Portable hyperspectral device as a valuable tool for the detection of protective agents applied on historical buildings

    Science.gov (United States)

    Vettori, S.; Pecchioni, E.; Camaiti, M.; Garfagnoli, F.; Benvenuti, M.; Costagliola, P.; Moretti, S.

    2012-04-01

    In the recent past, a wide range of protective products (in most cases, synthetic polymers) have been applied to the surfaces of ancient buildings/artefacts to preserve them from alteration [1]. The lack of a detailed mapping of the permanence and efficacy of these treatments, in particular when applied on large surfaces such as building facades, may be particularly problematic when new restoration treatments are needed and the best choice of restoration protocols has to be made. The presence of protective compounds on stone surfaces may be detected in the laboratory by relatively simple diagnostic tests, which, however, normally require invasive (or micro-invasive) sampling methodologies and are time-consuming, thus limiting their use to a restricted number of samples and sampling sites. In contrast, hyperspectral sensors are rapid, non-invasive and non-destructive tools capable of analyzing different materials on the basis of their different patterns of absorption at specific wavelengths, and are thus particularly suitable for the field of cultural heritage [2,3]. In addition, they can be successfully used to discriminate between inorganic (i.e. rocks and minerals) and organic compounds, as well as to acquire, in short times, many spectra and compositional maps at relatively low cost. In this study we analyzed a number of stone samples (Carrara Marble and the biogenic calcarenites "Lecce Stone" and "Maastricht Stone") after treatment of their surfaces with synthetic polymers (synthetic wax, and acrylic, perfluorinated and silicon-based polymers) of common use in conservation-restoration practice. The hyperspectral device used for this purpose was an ASD FieldSpec FR Pro spectroradiometer, a portable, high-resolution instrument designed to acquire Visible and Near-Infrared (VNIR: 350-1000 nm) and Short-Wave Infrared (SWIR: 1000-2500 nm) punctual reflectance spectra with a rapid data collection time (about 0.1 s for each spectrum). 
The reflectance spectra so far obtained in

  5. Analytical Tools for the Routine Use of the UMAE

    International Nuclear Information System (INIS)

    D'Auria, F.; Eramo, A.; Gianotti, W.

    1998-01-01

    UMAE (Uncertainty Methodology based on Accuracy Extrapolation) is a methodology developed to calculate the uncertainties related to thermal-hydraulic code results in nuclear power plant transient analysis. Use of the methodology has shown the need for specific analytical tools to simplify some steps of its application and to make clearer the individual procedures adopted in its development. Three of these have recently been completed and are illustrated in this paper. The first one makes it possible to attribute "weight factors" to the experimental Integral Test Facilities; results are also shown. The second one deals with the calculation of the accuracy of code results: the computer program concerned compares experimental and calculated trends of any quantity and gives as output the accuracy of the calculation. The third one is a computer program that derives continuous uncertainty bands from single-valued points. (author)
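
    The task of the second tool, turning a comparison of experimental and calculated trends into a single accuracy figure, can be sketched with a simple normalized discrepancy. This time-domain figure and the data are only a hypothetical stand-in; the actual methodology uses its own (FFT-based) accuracy measures.

```python
def average_amplitude(exp, calc):
    """Crude accuracy figure: total discrepancy normalized by the experimental signal.

    0 means perfect agreement; larger values mean larger calculation error.
    Both trends must be sampled on the same time base.
    """
    if len(exp) != len(calc):
        raise ValueError("trends must share the same time base")
    return sum(abs(c - e) for e, c in zip(exp, calc)) / sum(abs(e) for e in exp)

# Hypothetical pressure trends in arbitrary units (invented numbers).
experiment = [10.0, 9.5, 8.0, 6.5, 6.0]
calculation = [10.0, 9.0, 8.2, 6.8, 5.7]

print(average_amplitude(experiment, calculation))  # about 0.0325
```

    A single scalar per quantity is what allows accuracies from many facilities to be compared and extrapolated.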

  6. Network analytical tool for monitoring global food safety highlights China.

    Directory of Open Access Journals (Sweden)

    Tamás Nepusz

    Full Text Available BACKGROUND: The Beijing Declaration on food safety and security was signed by over fifty countries with the aim of developing comprehensive programs for monitoring food safety and security on behalf of their citizens. Currently, comprehensive systems for food safety and security are absent in many countries, and the systems that are in place have been developed on different principles, allowing poor opportunities for integration. METHODOLOGY/PRINCIPAL FINDINGS: We have developed a user-friendly analytical tool based on network approaches for instant customized analysis of food alert patterns in the European dataset from the Rapid Alert System for Food and Feed. Data taken from alert logs between January 2003 and August 2008 were processed using network analysis to (i) capture complexity, (ii) analyze trends, and (iii) predict possible effects of interventions by identifying patterns of reporting activities between countries. The detector and transgressor relationships are readily identifiable between countries, which are ranked using (i) Google's PageRank algorithm and (ii) the HITS algorithm of Kleinberg. The program identifies Iran, China and Turkey as the transgressors with the largest number of alerts. However, when characterized by impact, counting the transgressor index and the number of countries involved, China predominates as a transgressor country. CONCLUSIONS/SIGNIFICANCE: This study reports the first development of a network analysis approach to inform countries on their transgressor and detector profiles as a user-friendly aid for the adoption of the Beijing Declaration. The ability to instantly access the country-specific components of the several thousand annual reports will enable each country to identify the major transgressors and detectors within its trading network. Moreover, the tool can be used to monitor trading countries for improved detector/transgressor ratios.
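
    The country-ranking step can be sketched with a plain power-iteration PageRank over a toy alert graph; the country pairs below are invented and are not RASFF records.

```python
def pagerank(edges, damping=0.85, iters=100):
    """Power-iteration PageRank on a directed graph given as (src, dst) pairs.

    In an alert network an edge runs from the reporting (detector) country to
    the country of origin (transgressor), so high-rank nodes are frequently
    reported origins.
    """
    nodes = sorted({n for edge in edges for n in edge})
    out = {n: [d for s, d in edges if s == n] for n in nodes}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for n in nodes:
            if out[n]:
                share = damping * rank[n] / len(out[n])
                for d in out[n]:
                    new[d] += share
            else:  # dangling node: spread its rank uniformly
                for d in nodes:
                    new[d] += damping * rank[n] / len(nodes)
        rank = new
    return rank

# Invented detector -> transgressor alert pairs (ISO-style country codes).
alerts = [("DE", "CN"), ("UK", "CN"), ("IT", "TR"), ("DE", "IR"), ("NL", "CN")]
ranks = pagerank(alerts)
print(max(ranks, key=ranks.get))  # prints CN: it receives the most alert links
```

    HITS would additionally score the detector side (hubs) symmetrically with the transgressor side (authorities).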

  7. Using complaints to enhance quality improvement: developing an analytical tool.

    Science.gov (United States)

    Hsieh, Sophie Yahui

    2012-01-01

    This study aims to construct an instrument for identifying certain attributes or capabilities that might enable healthcare staff to use complaints to improve service quality. PubMed and ProQuest were searched, which in turn expanded access to other literature. Three paramount dimensions emerged for healthcare quality management systems: managerial, operational, and technical (MOT). The paper reveals that the managerial dimension relates to the quality improvement program infrastructure; it contains strategy, structure, leadership, people and culture. The operational dimension relates to implementation processes: organizational changes and barriers when using complaints to enhance quality. The technical dimension emphasizes the skills, techniques or information systems required to successfully achieve continuous quality improvement. The MOT model was developed by drawing from the relevant literature. However, individuals have different training, interests and experiences and, therefore, there will be variance between researchers when generating the MOT model. The MOT components can serve as guidelines for examining whether patient complaints are used to improve service quality. However, the model needs testing and validation through further research before becoming a theory. Empirical studies on patient complaints have not identified any analytical tool that could be used to explore how complaints can drive quality improvement. This study developed an instrument for identifying certain attributes or capabilities that might enable healthcare professionals to use complaints and improve service quality.

  8. Ultrasonic trap as analytical tool; Die Ultraschallfalle als analytisches Werkzeug

    Energy Technology Data Exchange (ETDEWEB)

    Leiterer, Jork

    2009-11-30

    nanoparticles. This comprises fields of research like biomineralisation, protein agglomeration, distance-dependent effects of nanocrystalline quantum dots and the in situ observation of early crystallization stages. In summary, the results of this work open a broad area of application for the ultrasonic trap as an analytical tool. The ultrasonic trap offers a unique possibility for handling samples at the microlitre scale: acoustic levitation positions the sample contact-free in a gaseous environment, removing the influence of solid surfaces. In this work, the possibilities of the ultrasonic trap for use in analytics are investigated experimentally. By coupling the trap with typical contactless analytical methods such as spectroscopy and X-ray scattering, the advantages of this levitation technique are demonstrated for materials ranging from inorganic, organic and pharmaceutical substances to proteins, nanoparticles and microparticles. It is shown that acoustic levitation reliably enables contact-free sample handling for spectroscopic methods (LIF, Raman) and, for the first time, for X-ray scattering (EDXD, SAXS, WAXS) and X-ray fluorescence (XRF, XANES) methods. For all of these methods the wall-less sample mounting proved advantageous: the results are comparable with those from conventional sample holders and in some cases surpass them in data quality. A particular success is the integration of the acoustic levitator into the experimental set-ups at the synchrotron beamlines. The use of the ultrasonic trap at BESSY was established within this work and currently forms the basis of intensive interdisciplinary research. 
In addition, the potential of the trap for preconcentration was recognized and applied to the study of evaporation-controlled processes. The

  9. Neutron activation analysis as analytical tool of environmental issue

    International Nuclear Information System (INIS)

    Otoshi, Tsunehiko

    2004-01-01

    Neutron activation analysis (NAA) is applicable to samples from a wide range of research fields, such as materials science, biology and geochemistry. Given the advantages of NAA, small or precious samples are among the most suitable, because NAA is capable of trace analysis and non-destructive determination. In this paper, among these fields, NAA of atmospheric particulate matter (PM) samples is discussed, with emphasis on the use of the data obtained as an analytical tool for environmental issues. The concentration of PM in air is usually very low, and it is not easy to collect large amounts of sample even with a high-volume air sampling device. Highly sensitive NAA is therefore well suited to determining elements in PM samples. The main components of PM are crust-derived silicates in rural and remote areas, whereas carbonaceous materials and heavy metals are concentrated in urban PM because of automobile exhaust and other anthropogenic emission sources. The elemental pattern of PM reflects the condition of the air around the monitoring site, and trends in air pollution can be traced by periodic monitoring of PM by NAA. Elemental concentrations in air change with season: for example, crustal elements increase in the dry season, and sea-salt components increase in concentration when wind from the sea is dominant. Elements emitted from anthropogenic sources are mainly contained in the fine fraction of PM and increase in concentration during winter, when emissions from heating systems are high and the air is stable. For further analysis and understanding of environmental issues, indicator elements for various emission sources, elemental concentration ratios of environmental samples and source apportionment techniques are useful. (author)
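
    The use of elemental concentration ratios for source attribution can be illustrated with a crustal enrichment factor, commonly computed against a reference crustal element such as Al; the concentrations below are invented for illustration.

```python
def enrichment_factor(x_sample, ref_sample, x_crust, ref_crust):
    """Crustal enrichment factor relative to a reference element (often Al or Sc).

    EF = (X/ref)_sample / (X/ref)_crust. EF near 1 suggests a crustal
    (soil dust) origin; EF >> 10 points to non-crustal, typically
    anthropogenic, sources.
    """
    return (x_sample / ref_sample) / (x_crust / ref_crust)

# Invented values: aerosol concentrations in ng/m^3, crustal abundances in mg/kg.
ef_fe = enrichment_factor(x_sample=800.0, ref_sample=1000.0,
                          x_crust=41000.0, ref_crust=80000.0)
ef_pb = enrichment_factor(x_sample=50.0, ref_sample=1000.0,
                          x_crust=17.0, ref_crust=80000.0)

print(round(ef_fe, 1))  # prints 1.6 -> consistent with crustal dust
print(round(ef_pb, 1))  # prints 235.3 -> strongly enriched, likely anthropogenic
```

    The units cancel in the double ratio, which is why aerosol and crust data on different scales can be compared directly.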

  10. Ecotoxicity on a stick: A novel analytical tool for predicting the ecotoxicity of petroleum contaminated samples

    International Nuclear Information System (INIS)

    Parkerton, T.F.; Stone, M.A.

    1995-01-01

    Hydrocarbons generally elicit toxicity via a nonpolar narcotic mechanism. Recent research suggests that chemicals acting by this mode invoke ecotoxicity when the molar concentration in organism lipids exceeds a critical threshold. Since the ecotoxicity of nonpolar narcotic mixtures appears to be additive, the ecotoxicity of hydrocarbon mixtures thus depends upon: (1) the partitioning of the individual hydrocarbons comprising the mixture from the environment to lipids and (2) the total molar sum of the constituent hydrocarbons in lipids. These insights have led previous investigators to advance the concept of biomimetic extraction as a novel tool for assessing potential narcosis-type or baseline ecotoxicity in aqueous samples. Drawing from this earlier work, the authors have developed a method to quantify Bioavailable Petroleum Hydrocarbons (BPHs) in hydrocarbon-contaminated aqueous and soil/sediment samples. A sample is equilibrated with a solid phase microextraction (SPME) fiber that serves as a surrogate for organism lipids. The total molar amount of hydrocarbons that partitions to the SPME fiber is then quantified using a simple GC/FID procedure. Research conducted to support the development and initial validation of this method will be presented. Results suggest that BPH analyses provide a promising, cost-effective approach for predicting the ecotoxicity of environmental samples contaminated with hydrocarbon mixtures. Consequently, BPH analyses may provide a valuable analytical screening tool for ecotoxicity assessment in product and effluent testing, environmental monitoring and site remediation applications
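
    The biomimetic-extraction idea, a fiber accumulating each constituent in proportion to its fiber-water partition coefficient, with toxicity scaling on the molar sum, can be sketched as follows. The partition coefficients, concentrations and fiber volume are all illustrative assumptions, not values from the paper.

```python
def fiber_moles(aqueous_umol_per_l, log_kfw, fiber_volume_l=1e-7):
    """Moles of one hydrocarbon absorbed by an SPME fiber at equilibrium.

    C_fiber = K_fw * C_water, with the fiber acting as a lipid surrogate;
    the molar SUM over all constituents then approximates the
    'bioavailable' narcotic dose.
    """
    return (10 ** log_kfw) * aqueous_umol_per_l * 1e-6 * fiber_volume_l

# Hypothetical dissolved hydrocarbons: (name, concentration in umol/L, log K_fw).
mixture = [
    ("toluene", 0.80, 2.6),
    ("naphthalene", 0.20, 3.1),
    ("phenanthrene", 0.02, 4.2),
]

total = sum(fiber_moles(conc, logk) for _, conc, logk in mixture)
print(f"total fiber burden = {total:.2e} mol")
```

    Because narcosis is assumed additive on a molar basis, only this sum, not the identity of each peak, is needed for a screening-level toxicity estimate.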

  11. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    Science.gov (United States)

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  12. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    Science.gov (United States)

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within that region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also relates it to product quality by design and process analytical technology (PAT).

  13. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    Science.gov (United States)

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad-scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons in terms of using these tools with LCCs. Decision analytic tools tend to shift focus away from research-oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision-making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable in improving decision making. But perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply ranking research priorities.
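
    The kind of calculation underlying such assessments can be sketched with the expected value of perfect information (EVPI) for a small decision table; the probabilities, actions and payoffs below are invented for illustration and are not from the case studies.

```python
def evpi(prior, payoff):
    """Expected value of perfect information for a finite decision table.

    prior: {state: probability}; payoff: {action: {state: value}}.
    EVPI = E[best payoff if the state were known] - best expected payoff
    achievable now. It bounds how much resolving the uncertainty is worth.
    """
    best_expected = max(
        sum(prior[s] * payoffs[s] for s in prior) for payoffs in payoff.values()
    )
    expected_best = sum(
        prior[s] * max(payoffs[s] for payoffs in payoff.values()) for s in prior
    )
    return expected_best - best_expected

# Invented example: where to restore habitat under uncertain wetland response.
prior = {"responds": 0.6, "no_response": 0.4}
payoff = {
    "restore_uplands":  {"responds": 40.0, "no_response": 35.0},
    "restore_wetlands": {"responds": 70.0, "no_response": 10.0},
}

print(evpi(prior, payoff))  # prints 10.0: resolving the uncertainty is worth at most 10 units
```

    Research whose cost exceeds the EVPI for the decisions it informs cannot pay for itself, which is exactly the point the case studies make about ranking information needs.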

  14. Factors Influencing Attitudes Towards the Use of CRM’s Analytical Tools in Organizations

    Directory of Open Access Journals (Sweden)

    Šebjan Urban

    2016-02-01

    Full Text Available Background and Purpose: Information solutions for analytical customer relationship management CRM (aCRM IS) that include the use of analytical tools are becoming increasingly important, due to organizations’ need for knowledge of their customers and the ability to manage big data. The objective of the research is, therefore, to determine how the organizations’ orientations (process, innovation, and technology) as critical organizational factors affect the attitude towards the use of the analytical tools of aCRM IS.

  15. Advances in analytical tools for high throughput strain engineering

    DEFF Research Database (Denmark)

    Marcellin, Esteban; Nielsen, Lars Keld

    2018-01-01

    The emergence of inexpensive, base-perfect genome editing is revolutionising biology. Modern industrial biotechnology exploits the advances in genome editing in combination with automation, analytics and data integration to build high-throughput automated strain engineering pipelines, also known as biofoundries. Biofoundries replace the slow and inconsistent artisanal processes used to build microbial cell factories with an automated design–build–test cycle, considerably reducing the time needed to deliver commercially viable strains. Testing and hence learning remains relatively shallow, but recent advances in analytical chemistry promise to increase the depth of characterization possible. Analytics combined with models of cellular physiology in automated systems biology pipelines should enable deeper learning and hence a steeper pitch of the learning cycle. This review explores the progress...

  16. Analytical Tools for Functional Assessment of Architectural Layouts

    Science.gov (United States)

    Bąkowski, Jarosław

    2017-10-01

    Functional layout of the building, understood as a layout or set of the facility rooms (or groups of rooms) with a system of internal communication, creates an environment and a place of mutual relations between the occupants of the object. Achieving an optimal (from the occupants’ point of view) spatial arrangement is possible through activities that often go beyond the stage of architectural design. Adopted in the architectural design, most often during a trial-and-error process or on the basis of previous experience (evidence-based design), the functional layout is subject to continuous evaluation and dynamic change from the beginning of its use. Such verification in the occupancy phase allows planning future possible transformations, as well as developing model solutions for use in other settings. In broader terms, the research hypothesis is to examine whether and how the collected datasets concerning the facility and its utilization can be used to develop methods for assessing the functional layout of buildings. In other words, whether it is possible to develop an objective method of assessing functional layouts based on a set of buildings’ parameters: technical, technological and functional ones, and whether the method allows developing a set of tools enhancing the design methodology of complex functional objects. By linking the design with the construction phase it is possible to build parametric models of functional layouts, especially in the context of sustainable design or lean design in every aspect: ecological (by reducing the property’s impact on the environment), economic (by optimizing its cost) and social (through the implementation of a high-performance work environment). Parameterization of the size and functional connections of the facility becomes part of the analyses, as well as an element of model solutions. The “lean” approach means the process of analysis of the existing scheme and consequently - finding weak points as well as means for eliminating these

  17. Electrochemical sensors: a powerful tool in analytical chemistry

    Directory of Open Access Journals (Sweden)

    Stradiotto Nelson R.

    2003-01-01

    Full Text Available Potentiometric, amperometric and conductometric electrochemical sensors have found a number of interesting applications in the areas of environmental, industrial, and clinical analyses. This review presents a general overview of the three main types of electrochemical sensors, describing fundamental aspects, developments and their contribution to the area of analytical chemistry, relating relevant aspects of the development of electrochemical sensors in Brazil.

  18. Laser-induced plasma spectrometry: truly a surface analytical tool

    International Nuclear Information System (INIS)

    Vadillo, Jose M.; Laserna, J.

    2004-01-01

    For a long period, analytical applications of laser induced plasma spectrometry (LIPS) have been mainly restricted to overall and quantitative determination of elemental composition in bulk, solid samples. However, the introduction of new compact and reliable solid state lasers and technological developments in multidimensional intensified detectors have made it possible to seek new analytical niches for LIPS where its analytical advantages (direct sampling from any material irrespective of its conductive status, without sample preparation and with sensitivity adequate for many elements in different matrices) could be fully exploited. In this sense, the field of surface analysis could benefit from the cited advantages, taking into account in addition the capability of LIPS for spot analysis, line scan, depth-profiling, area analysis and compositional mapping with a single instrument in air at atmospheric pressure. This review paper outlines the fundamental principles of laser-induced plasma emission relevant to sample surface studies, discusses the experimental parameters governing the spatial (lateral and in-depth) resolution in LIPS analysis and presents the applications concerning surface examination.

  19. Process simulation of heavy water plants - a powerful analytical tool

    International Nuclear Information System (INIS)

    Miller, A.I.

    1978-10-01

    The commercially conscious designs of Canadian GS (Girdler-Sulphide) plants have proved sensitive to process conditions. That, combined with the large scale of our units, has meant that computer simulation of their behaviour has been a natural and profitable development. Atomic Energy of Canada Limited has developed a family of steady-state simulations to describe all of the Canadian plants. Modelling of plant conditions has demonstrated that the simulation description is very precise, and it has become an integral part of the industry's assessments of both plant operation and decisions on capital expenditures. The simulation technique has also found extensive use in the detailed design of both the rehabilitated Glace Bay and the new La Prade plants. It has opened new insights into plant design and uncovered a radical and significant flowsheet change for future designs, as well as many less dramatic but valuable lesser changes. (author)

  20. Mapping healthcare systems: a policy relevant analytic tool.

    Science.gov (United States)

    Sekhri Feachem, Neelam; Afshar, Ariana; Pruett, Cristina; Avanceña, Anton L V

    2017-07-01

    In the past decade, an international consensus on the value of well-functioning systems has driven considerable health systems research. This research falls into two broad categories. The first provides conceptual frameworks that take complex healthcare systems and create simplified constructs of interactions and functions. The second focuses on granular inputs and outputs. This paper presents a novel translational mapping tool - the University of California, San Francisco mapping tool (the Tool) - which bridges the gap between these two areas of research, creating a platform for multi-country comparative analysis. Using the Murray-Frenk framework, we create a macro-level representation of a country's structure, focusing on how it finances and delivers healthcare. The map visually depicts the fundamental policy questions in healthcare system design: funding sources and amount spent through each source, purchasers, populations covered, provider categories; and the relationship between these entities. We use the Tool to provide a macro-level comparative analysis of the structure of India's and Thailand's healthcare systems. As part of the systems strengthening arsenal, the Tool can stimulate debate about the merits and consequences of different healthcare systems structural designs, using a common framework that fosters multi-country comparative analyses. © The Author 2017. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene.

  1. Web analytics as tool for improvement of website taxonomies

    DEFF Research Database (Denmark)

    Jonasen, Tanja Svarre; Ådland, Marit Kristine; Lykke, Marianne

    The poster examines how web analytics can be used to provide information about users and inform design and redesign of taxonomies. It uses a case study of the website Cancer.dk by the Danish Cancer Society. The society is a private organization with an overall goal to prevent the development...... provides information about e.g. subjects of interest, searching behaviour, browsing patterns in website structure as well as tag clouds, page views. The poster discusses benefits and challenges of the two web metrics, with a model of how to use search and tag data for the design of taxonomies, e.g. choice...

  2. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    Vincent, J.; Midwinter, J.

    2015-01-01

    For the past 25 years the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private sector businesses to maximize the value of the mass of information to discover and disseminate actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts; safeguarding communities, organizations, infrastructures, and investments. The collaborative Intelligence Analysis environment delivered by i2 is specifically designed to be: · scalable: supporting business needs as well as operational and end user environments · modular: an architecture which can deliver maximum operational flexibility with the ability to add complementary analytics · interoperable: integrating with existing environments and easing information sharing across partner agencies · extendable: providing an open source developer essential toolkit, examples, and documentation for custom requirements. i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry-leading multidimensional analytics that can be run on-demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster informed decision making. (author)

  3. Distributed Engine Control Empirical/Analytical Verification Tools

    Science.gov (United States)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to easily assemble a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration of all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating DCS (distributed engine control systems) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and hardware simulator provides the capability to simulate virtual subcomponents, as well as swap actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the CMAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique

  4. MALDI-TOF mass spectrometry following short incubation on a solid medium is a valuable tool for rapid pathogen identification from positive blood cultures.

    Science.gov (United States)

    Kohlmann, Rebekka; Hoffmann, Alexander; Geis, Gabriele; Gatermann, Sören

    2015-01-01

    approach allowed an optimized treatment recommendation. MALDI-TOF MS following 4h pre-culture is a valuable tool for rapid pathogen identification from positive blood cultures, allowing easy integration in diagnostic routine and the opportunity of considerably earlier treatment adaptation. Copyright © 2015 Elsevier GmbH. All rights reserved.

  5. An Analytical Study of Tools and Techniques for Movie Marketing

    Directory of Open Access Journals (Sweden)

    Garima Maik

    2014-08-01

    Full Text Available Abstract. Bollywood, or the Hindi movie industry, is one of the fastest growing sectors in the media and entertainment space, creating numerous business and employment opportunities. Movies in India are a major source of entertainment for all sections of society. They face competition not only from other movie industries and movies but from other sources of entertainment such as adventure sports, amusement parks, theatre and drama, pubs and discothèques. A lot of manpower, man hours, creative brains, and money are put in to build a quality feature film. Bollywood is an industry which continuously works towards providing the 7-billion-strong population with something new. So it is important for the movie and production team to stand out and grab the attention of the widest possible audience. Movie makers employ various tools and techniques today to market their movies, leaving no stone unturned. They roll out teasers, first looks, theatrical trailer releases, music launches, city tours, producer and director interviews, movie premieres, movie releases, post-release follow-ups, etc. to pull viewers to the cineplex. The audience today, which comprises mainly youth, wants photos, videos, meet-ups, gossip, debate, collaboration and content creation. These requirements of today’s generation are most often fulfilled through digital platforms. However, traditional media like newspapers, radio, and television are by no means obsolete; they reach a mass audience and play a major role in effective marketing. This study aims at analysing these tools for their effectiveness. The objectives are fulfilled through a consumer survey. This study will bring out the effectiveness and relational importance of the various tools employed by movie marketers to generate maximum returns on investment, using data reduction techniques like factor analysis and statistical techniques like the chi-square test, with data visualization using pie charts

  6. Median of patient results as a tool for assessment of analytical stability

    DEFF Research Database (Denmark)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft

    2015-01-01

    BACKGROUND: In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. METHOD: Patient data from daily internal control schemes was used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. RESULTS: Seventy-five percent of the twenty analytes achieved on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. DISCUSSION: Patient results applied in analytical quality performance...

  7. Improving web site performance using commercially available analytical tools.

    Science.gov (United States)

    Ogle, James A

    2010-10-01

    It is easy to accurately measure web site usage and to quantify key parameters such as page views, site visits, and more complex variables using commercially available tools that analyze web site log files and search engine use. This information can be used strategically to guide the design or redesign of a web site (templates, look-and-feel, and navigation infrastructure) to improve overall usability. The data can also be used tactically to assess the popularity and use of new pages and modules that are added and to rectify problems that surface. This paper describes software tools used to: (1) inventory search terms that lead to available content; (2) propose synonyms for commonly used search terms; (3) evaluate the effectiveness of calls to action; (4) conduct path analyses to targeted content. The American Academy of Orthopaedic Surgeons (AAOS) uses SurfRay's Behavior Tracking software (Santa Clara CA, USA, and Copenhagen, Denmark) to capture and archive the search terms that have been entered into the site's Google Mini search engine. The AAOS also uses Unica's NetInsight program to analyze its web site log files. These tools provide the AAOS with information that quantifies how well its web sites are operating and insights for making improvements to them. Although it is easy to quantify many aspects of an association's web presence, it also takes human involvement to analyze the results and then recommend changes. Without a dedicated resource to do this, the work often is accomplished only sporadically and on an ad hoc basis.

  8. Practical applications of surface analytic tools in tribology

    Science.gov (United States)

    Ferrante, J.

    1980-01-01

    Many of the currently widely used tools available for surface analysis are described. Those which have the highest applicability for giving elemental and/or compound analysis for problems of interest in tribology, and which are truly surface sensitive (that is, probing less than 10 atomic layers), are presented. The latter group is evaluated in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under 'real' conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  9. Microplasmas for chemical analysis: analytical tools or research toys?

    International Nuclear Information System (INIS)

    Karanassios, Vassili

    2004-01-01

    An overview of the activities of the research groups that have been involved in fabrication, development and characterization of microplasmas for chemical analysis over the last few years is presented. Microplasmas covered include: miniature inductively coupled plasmas (ICPs); capacitively coupled plasmas (CCPs); microwave-induced plasmas (MIPs); a dielectric barrier discharge (DBD); microhollow cathode discharge (MHCD) or microstructure electrode (MSE) discharges; other microglow discharges (such as those formed between 'liquid' electrodes); microplasmas formed in micrometer-diameter capillary tubes for gas chromatography (GC) or high-performance liquid chromatography (HPLC) applications; and a stabilized capacitive plasma (SCP) for GC applications. Sample introduction into microplasmas, in particular into a microplasma device (MPD), battery operation of a MPD and of a mini-in-torch vaporization (ITV) microsample introduction system for MPDs, and questions of microplasma portability for use on site (e.g., in the field) are also briefly addressed using examples of current research. To emphasize the significance of sample introduction into microplasmas, some previously unpublished results from the author's laboratory have also been included. An overall assessment of the state-of-the-art of analytical microplasma research is also provided

  10. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    Science.gov (United States)

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
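    The "statistical disproportionality data mining signal scores" mentioned above can be illustrated with a toy calculation. The abstract does not state which statistic the prototype computes; the proportional reporting ratio (PRR) below is one common choice, and the report counts are invented for illustration.

```python
# Hypothetical sketch of one common disproportionality statistic, the
# proportional reporting ratio (PRR). The abstract does not say which
# signal score the prototype uses; these counts are invented.

def prr(a, b, c, d):
    """2x2 contingency counts for a drug-adverse event pair:
    a = reports with the drug and the event
    b = reports with the drug, without the event
    c = reports with other drugs and the event
    d = reports with other drugs, without the event
    """
    return (a / (a + b)) / (c / (c + d))

# 20 of 400 reports for the drug mention the event, versus 50 of 9600
# reports for all other drugs:
score = prr(a=20, b=380, c=50, d=9550)
print(round(score, 1))  # → 9.6, well above the conventional screening threshold of 2
```

A PRR above 2 is a conventional rough screening threshold; in practice such scores are combined with report counts and clinical review before a pair is treated as a signal.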

  11. Cluster Analysis as an Analytical Tool of Population Policy

    Directory of Open Access Journals (Sweden)

    Oksana Mikhaylovna Shubat

    2017-12-01

    Full Text Available The predicted negative trends in Russian demography (falling birth rates, population decline) actualize the need to strengthen measures of family and population policy. Our research purpose is to identify groups of Russian regions with similar characteristics in the family sphere using cluster analysis. The findings should make an important contribution to the field of family policy. We used hierarchical cluster analysis based on the Ward method and the Euclidean distance for segmentation of Russian regions. Clustering is based on four variables, which allowed assessing the family institution in the region. The authors used the data of the Federal State Statistics Service from 2010 to 2015. Clustering and profiling of each segment made it possible to form a model of Russian regions depending on the features of the family institution in these regions. The authors revealed four clusters grouping regions with similar problems in the family sphere. This segmentation makes it possible to develop the most relevant family policy measures in each group of regions. Thus, the analysis has shown a high degree of differentiation of the family institution in the regions. This suggests that a unified approach to solving population problems is far from effective. To achieve greater results in the implementation of family policy, a differentiated approach is needed. Methods of multidimensional data classification can be successfully applied as a relevant analytical toolkit. Further research could develop the adaptation of multidimensional classification methods to the analysis of the population problems in Russian regions. In particular, the algorithms of nonparametric cluster analysis may be of relevance in future studies.
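    As a minimal sketch of the clustering approach described above (hierarchical clustering with the Ward method and Euclidean distance, cut into four clusters): the 20 "regions" and four standardized family-sphere indicators below are random stand-ins, not the Federal State Statistics Service data used in the study.

```python
# Minimal sketch, not the authors' pipeline: hierarchical clustering of
# regions with Ward linkage and Euclidean distance, as in the abstract.
# The 20 "regions" and four indicator variables are random stand-ins.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(42)
X = rng.normal(size=(20, 4))  # rows = regions, cols = standardized indicators

Z = linkage(X, method="ward", metric="euclidean")
labels = fcluster(Z, t=4, criterion="maxclust")  # cut the tree into 4 clusters

print(sorted(set(labels)))  # → [1, 2, 3, 4]
```

Profiling each cluster (e.g. comparing indicator means per label) would then support the kind of per-group policy discussion the abstract describes.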

  12. Median of patient results as a tool for assessment of analytical stability.

    Science.gov (United States)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes was used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes achieved on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of the analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
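    The monthly patient-median check described above reduces to a simple comparison. In this toy sketch the target value, allowable-bias limit, and patient results are illustrative only, not taken from the paper.

```python
# Toy sketch of the monthly patient-median check described above: flag
# months whose median of patient results deviates from a target value by
# more than the allowable analytical bias. Numbers are illustrative only.
import statistics

def flag_months(monthly_results, target, allowable_bias_pct):
    flagged = []
    for month, results in monthly_results.items():
        bias_pct = 100 * (statistics.median(results) - target) / target
        if abs(bias_pct) > allowable_bias_pct:
            flagged.append((month, round(bias_pct, 1)))
    return flagged

data = {"Jan": [4.0, 4.1, 4.2, 4.3], "Feb": [4.6, 4.7, 4.8, 4.9]}
print(flag_months(data, target=4.2, allowable_bias_pct=5.0))  # → [('Feb', 13.1)]
```

In practice the target would be the established analyte mean and the limit a biological-variation-based bias specification, as in the paper.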

  13. Laser: a Tool for Optimization and Enhancement of Analytical Methods

    Energy Technology Data Exchange (ETDEWEB)

    Preisler, Jan [Iowa State Univ., Ames, IA (United States)

    1997-01-01

    In this work, we use lasers to enhance possibilities of laser desorption methods and to optimize coating procedure for capillary electrophoresis (CE). We use several different instrumental arrangements to characterize matrix-assisted laser desorption (MALD) at atmospheric pressure and in vacuum. In imaging mode, a 488-nm argon-ion laser beam is deflected by two acousto-optic deflectors to scan plumes desorbed at atmospheric pressure via absorption. All absorbing species, including neutral molecules, are monitored. Interesting features, e.g. differences between the initial plume and subsequent plumes desorbed from the same spot, or the formation of two plumes from one laser shot are observed. Total plume absorbance can be correlated with the acoustic signal generated by the desorption event. A model equation for the plume velocity as a function of time is proposed. Alternatively, the use of a static laser beam for observation enables reliable determination of plume velocities even when they are very high. Static scattering detection reveals negative influence of particle spallation on MS signal. Ion formation during MALD was monitored using 193-nm light to photodissociate a portion of insulin ion plume. These results define the optimal conditions for desorbing analytes from matrices, as opposed to achieving a compromise between efficient desorption and efficient ionization as is practiced in mass spectrometry. In CE experiment, we examined changes in a poly(ethylene oxide) (PEO) coating by continuously monitoring the electroosmotic flow (EOF) in a fused-silica capillary during electrophoresis. An imaging CCD camera was used to follow the motion of a fluorescent neutral marker zone along the length of the capillary excited by a 488-nm Ar-ion laser. The PEO coating was shown to reduce the velocity of EOF by more than an order of magnitude compared to a bare capillary at pH 7.0. The coating protocol was important, especially at an intermediate pH of 7.7. The increase of p

  14. Positron spectroscopy as an analytical tool in material sciences

    International Nuclear Information System (INIS)

    Pujari, P.K.

    2010-01-01

    Full text: Positron annihilation spectroscopy has emerged as a powerful tool in material sciences due to its ability to provide information about the electron momentum distribution and electron density in a given medium. These features help in identifying altered states of electronic rearrangement as one encounters in phase transitions. In addition, positrons prefer regions of lower electron density such as open volume defects, i.e. vacancies or vacancy clusters in metals, alloys and semiconductors, or free volumes in molecular solids. Its sensitivity to defects is extremely high, e.g. it can detect as small a defect as a monovacancy at concentrations as low as parts per million (ppm). Innovative nuclear instrumentation has helped in getting chemical specificity at the annihilation site. For example, precipitates, embedded nanoparticles, or element-decorated vacancies can now be easily identified. This presentation is structured to introduce the technique and provide a global perspective on areas of application. Specific examples of defect characterization, nanostructure-property correlations in polymers, and the advantages of elemental specificity by indexing the core electron momentum will be given. In addition, slow positron beam based studies on nanostructured materials as well as particle accelerator based positron spectroscopy for volumetric assay of defects in large engineering samples will be presented

  15. Analytical Modeling Tool for Design of Hydrocarbon Sensitive Optical Fibers

    Directory of Open Access Journals (Sweden)

    Khalil Al Handawi

    2017-09-01

    Full Text Available Pipelines are the main transportation means for oil and gas products across large distances. Due to the severe conditions they operate in, they are regularly inspected using conventional Pipeline Inspection Gages (PIGs) for corrosion damage. The motivation for researching a real-time distributed monitoring solution arose to mitigate costs and provide a proactive indication of potential failures. Fiber optic sensors with polymer claddings provide a means of detecting contact with hydrocarbons. By coating the fibers with a layer of metal similar in composition to that of the parent pipeline, corrosion of this coating may be detected when the polymer cladding underneath is exposed to the surrounding hydrocarbons contained within the pipeline. A Refractive Index (RI) change occurs in the polymer cladding causing a loss in intensity of a traveling light pulse due to a reduction in the fiber’s modal capacity. Intensity losses may be detected using Optical Time Domain Reflectometry (OTDR) while pinpointing the spatial location of the contact via time delay calculations of the back-scattered pulses. This work presents a theoretical model for the above sensing solution to provide a design tool for the fiber optic cable in the context of hydrocarbon sensing following corrosion of an external metal coating. Results are verified against the experimental data published in the literature.

  16. Analytical Modeling Tool for Design of Hydrocarbon Sensitive Optical Fibers.

    Science.gov (United States)

    Al Handawi, Khalil; Vahdati, Nader; Shiryayev, Oleg; Lawand, Lydia

    2017-09-28

    Pipelines are the main transportation means for oil and gas products across large distances. Due to the severe conditions they operate in, they are regularly inspected using conventional Pipeline Inspection Gages (PIGs) for corrosion damage. The motivation for researching a real-time distributed monitoring solution arose to mitigate costs and provide a proactive indication of potential failures. Fiber optic sensors with polymer claddings provide a means of detecting contact with hydrocarbons. By coating the fibers with a layer of metal similar in composition to that of the parent pipeline, corrosion of this coating may be detected when the polymer cladding underneath is exposed to the surrounding hydrocarbons contained within the pipeline. A Refractive Index (RI) change occurs in the polymer cladding causing a loss in intensity of a traveling light pulse due to a reduction in the fiber's modal capacity. Intensity losses may be detected using Optical Time Domain Reflectometry (OTDR) while pinpointing the spatial location of the contact via time delay calculations of the back-scattered pulses. This work presents a theoretical model for the above sensing solution to provide a design tool for the fiber optic cable in the context of hydrocarbon sensing following corrosion of an external metal coating. Results are verified against the experimental data published in the literature.
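    The time-delay localization described in this abstract reduces to a simple round-trip formula: the one-way distance z to the contact point follows from the arrival delay Δt of the back-scattered pulse as z = c·Δt/(2·n_g), where n_g is the fiber's group index. The sketch below is illustrative only; the group index and delay values are assumptions, not figures from the paper.

```python
# Illustrative sketch of OTDR event localization from round-trip delay.
# Assumed value (not from the paper): group index n_g = 1.468, typical silica fiber.
C = 299_792_458.0  # speed of light in vacuum, m/s

def otdr_event_distance(delay_s: float, group_index: float = 1.468) -> float:
    """Distance along the fiber to a loss/reflection event.

    The pulse travels to the event and back, so the one-way distance is
    z = c * delay / (2 * n_g).
    """
    return C * delay_s / (2.0 * group_index)

# A back-scattered echo arriving 10 microseconds after launch maps to ~1 km:
z = otdr_event_distance(10e-6)
```

The factor of two accounts for the round trip; dividing by the group index converts vacuum speed to the pulse's actual propagation speed in the glass.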

  17. Identifying problems and generating recommendations for enhancing complex systems: applying the abstraction hierarchy framework as an analytical tool.

    Science.gov (United States)

    Xu, Wei

    2007-12-01

    This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.

  18. Analytical sensitivity analysis of geometric errors in a three axis machine tool

    International Nuclear Information System (INIS)

    Park, Sung Ryung; Yang, Seung Han

    2012-01-01

    In this paper, an analytical method is used to perform a sensitivity analysis of geometric errors in a three axis machine tool. First, an error synthesis model is constructed for evaluating the position volumetric error due to the geometric errors, and then an output variable is defined, such as the magnitude of the position volumetric error. Next, the global sensitivity analysis is executed using an analytical method. Finally, the sensitivity indices are calculated using the quantitative values of the geometric errors
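    The workflow in this abstract (define an output variable such as the magnitude of the position volumetric error, then compute sensitivity indices for each geometric error) can be sketched numerically. The three-term error model and the finite-difference approach below are simplifications for illustration; the paper derives the sensitivities analytically from a full error synthesis model.

```python
# Simplified sensitivity sketch for a hypothetical 3-term error model
# (not the paper's full synthesis model). The output variable is the
# magnitude of the position volumetric error.
import math

def volumetric_error(dx: float, dy: float, dz: float) -> float:
    """Magnitude of the position volumetric error for given axis errors."""
    return math.sqrt(dx * dx + dy * dy + dz * dz)

def sensitivities(errors, h=1e-9):
    """Central-difference sensitivity of the output to each error term."""
    out = []
    for i in range(len(errors)):
        up = list(errors); up[i] += h
        dn = list(errors); dn[i] -= h
        out.append((volumetric_error(*up) - volumetric_error(*dn)) / (2 * h))
    return out

s = sensitivities([3e-6, 4e-6, 0.0])  # axis errors in metres
```

For this toy model the indices are simply the direction cosines of the error vector, so the dominant axis error is immediately visible.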

  19. A little bird told me : Twitter may be growing at 700 per cent a week, but is it a valuable tool for the patch, or a distraction?

    Energy Technology Data Exchange (ETDEWEB)

    Stastny, P

    2009-10-15

    The social networking tool Twitter may soon be adopted by petroleum industry workers as a means of ensuring increased communications. Comprising social networks, link sharing, and live searching, the tool can be used to conduct subject searches as well as to link to quarterly reports and press releases. Twitter is also being used to manage crisis communications as well as to monitor activities on the Internet. Twitter may also provide a means for oil and gas operators to follow influential industry bloggers as well as to develop effective communications strategies. It was concluded that Twitter may offer an opportunity for companies to participate in non-traditional communications approaches such as online forums, and other media-sharing tools. 1 fig.

  20. Challenging and valuable

    NARCIS (Netherlands)

    Van Hal, J.D.M.

    2008-01-01

    Challenging and valuable Inaugural speech given on May 7th 2008 on the occasion of the acceptance of the position of Professor Sustainable Housing Transformation at the faculty of Architecture of the Delft University of Technology by Prof. J.D.M. van Hal MSc PhD.

  1. Process analytical technology (PAT) tools for the cultivation step in biopharmaceutical production

    NARCIS (Netherlands)

    Streefland, M.; Martens, D.E.; Beuvery, E.C.; Wijffels, R.H.

    2013-01-01

    The process analytical technology (PAT) initiative is now 10 years old. This has resulted in the development of many tools and software packages dedicated to PAT application on pharmaceutical processes. However, most applications are restricted to small molecule drugs, mainly for the relatively

  2. Current research relevant to the improvement of γ-ray spectroscopy as an analytical tool

    International Nuclear Information System (INIS)

    Meyer, R.A.; Tirsell, K.G.; Armantrout, G.A.

    1976-01-01

    Four areas of research that will have significant impact on the further development of γ-ray spectroscopy as an accurate analytical tool are considered. The areas considered are: (1) automation; (2) accurate multigamma ray sources; (3) accuracy of the current and future γ-ray energy scale, and (4) new solid state X and γ-ray detectors

  3. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    Science.gov (United States)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool in the analytical laboratory since they allow sensitive measurements of pollutants to be carried out by means of easily available instrumentation. There is a large number of such procedures involving miniaturized liquid-liquid or liquid-solid extractions with the common denominator of using very low amounts (only a few microliters) or even none of organic solvents. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful both for inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need of using ICP-MS since this instrument can be replaced by a simple AAS spectrometer which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology, and gives examples for the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support

  4. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    International Nuclear Information System (INIS)

    Brown, Forrest B.

    2016-01-01

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe 2 new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.
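    The kind of analytic verification these simple cross-section files enable can be illustrated with a one-group, infinite-medium benchmark: with a single energy group the multiplication factor has the closed form k_inf = νΣ_f/Σ_a, which a Monte Carlo estimate can be checked against. The cross sections below are invented for illustration and do not correspond to any real nuclide.

```python
# One-group infinite-medium benchmark sketch: with one energy group,
# k_inf = nu * Sigma_f / Sigma_a exactly, where Sigma_a = Sigma_f + Sigma_c.
# A Monte Carlo run with the corresponding simple ACE file can be verified
# against this closed form. (Illustrative numbers, not real nuclear data.)
def k_infinity(nu: float, sigma_f: float, sigma_c: float) -> float:
    """k_inf = nu * Sigma_f / (Sigma_f + Sigma_c) for a one-group medium."""
    return nu * sigma_f / (sigma_f + sigma_c)

k = k_infinity(nu=2.5, sigma_f=0.05, sigma_c=0.075)  # exactly critical
```

Because the reference value is exact, any statistically significant deviation in the Monte Carlo estimate points to a defect in the transport algorithm rather than in the data.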

  5. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    Science.gov (United States)

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain the critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve the productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without any destruction of the sample. However, to successfully adapt PAT tools into pharmaceutical and biopharmaceutical environments, thorough understanding of the process is needed along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.

  6. Construction of Two mCherry Plasmids (pXG-mCherry for Transgenic Leishmania: Valuable Tools for Future Molecular Analysis

    Directory of Open Access Journals (Sweden)

    Andrés Vacas

    2017-01-01

    Full Text Available Leishmania is the causative agent of leishmaniasis, a neglected tropical disease that affects more than 12 million people around the world. Current treatments are toxic and poorly effective due to the acquisition of resistance within Leishmania populations. Thus, the pursuit for new antileishmanial drugs is a priority. The available methods for drug screening based on colorimetric assays using vital dyes are time-consuming. Currently, the use of fluorescent reporter proteins is replacing the use of viability indicator dyes. We have constructed two plasmids expressing the red fluorescent protein mCherry with multiple cloning sites (MCS), adequate for N- and C-terminal fusion protein constructs. Our results also show that the improved pXG-mCherry plasmid can be employed for drug screening in vitro. The use of the red fluorescent protein, mCherry, is an easier tool for numerous assays, not only to test pharmacological compounds, but also to determine the subcellular localization of proteins.

  7. Specificity enhancement by electrospray ionization multistage mass spectrometry--a valuable tool for differentiation and identification of 'V'-type chemical warfare agents.

    Science.gov (United States)

    Weissberg, Avi; Tzanani, Nitzan; Dagan, Shai

    2013-12-01

    The use of chemical warfare agents has become an issue of emerging concern. One of the challenges in analytical monitoring of the extremely toxic 'V'-type chemical weapons [O-alkyl S-(2-dialkylamino)ethyl alkylphosphonothiolates] is to distinguish and identify compounds of similar structure. MS analysis of these compounds reveals mostly fragment/product ions representing the amine-containing residue. Hence, isomers or derivatives with the same amine residue exhibit similar mass spectral patterns in both classical EI/MS and electrospray ionization-MS, leading to unavoidable ambiguity in the identification of the phosphonate moiety. A set of five 'V'-type agents, including O-ethyl S-(2-diisopropylamino)ethyl methylphosphonothiolate (VX), O-isobutyl S-(2-diethylamino)ethyl methylphosphonothiolate (RVX) and O-ethyl S-(2-diethylamino)ethyl methylphosphonothiolate (VM) were studied by liquid chromatography/electrospray ionization/MS, utilizing a QTRAP mass detector. MS/MS enhanced product ion scans and multistage MS(3) experiments were carried out. Based on the results, possible fragmentation pathways were proposed, and a method for the differentiation and identification of structural isomers and derivatives of 'V'-type chemical warfare agents was obtained. MS/MS enhanced product ion scans at various collision energies provided information-rich spectra, although many of the product ions obtained were at low abundance. Employing MS(3) experiments enhanced the selectivity for those low abundance product ions and provided spectra indicative of the different phosphonate groups. Study of the fragmentation pathways, revealing some less expected structures, was carried out and allowed the formulation of mechanistic rules and the determination of sets of ions typical of specific groups, for example, methylphosphonothiolates versus ethylphosphonothiolates. The new group-specific ions elucidated in this work are also useful for screening unknown 'V'-type agents and related

  8. LitPathExplorer: a confidence-based visual text analytics tool for exploring literature-enriched pathway models.

    Science.gov (United States)

    Soto, Axel J; Zerva, Chrysoula; Batista-Navarro, Riza; Ananiadou, Sophia

    2018-04-15

    Pathway models are valuable resources that help us understand the various mechanisms underpinning complex biological processes. Their curation is typically carried out through manual inspection of published scientific literature to find information relevant to a model, which is a laborious and knowledge-intensive task. Furthermore, models curated manually cannot be easily updated and maintained with new evidence extracted from the literature without automated support. We have developed LitPathExplorer, a visual text analytics tool that integrates advanced text mining, semi-supervised learning and interactive visualization, to facilitate the exploration and analysis of pathway models using statements (i.e. events) extracted automatically from the literature and organized according to levels of confidence. LitPathExplorer supports pathway modellers and curators alike by: (i) extracting events from the literature that corroborate existing models with evidence; (ii) discovering new events which can update models; and (iii) providing a confidence value for each event that is automatically computed based on linguistic features and article metadata. Our evaluation of event extraction showed a precision of 89% and a recall of 71%. Evaluation of our confidence measure, when used for ranking sampled events, showed an average precision ranging between 61 and 73%, which can be improved to 95% when the user is involved in the semi-supervised learning process. Qualitative evaluation using pair analytics based on the feedback of three domain experts confirmed the utility of our tool within the context of pathway model exploration. LitPathExplorer is available at http://nactem.ac.uk/LitPathExplorer_BI/. sophia.ananiadou@manchester.ac.uk. Supplementary data are available at Bioinformatics online.

  9. Recovering valuable liquid hydrocarbons

    Energy Technology Data Exchange (ETDEWEB)

    Pier, M

    1931-06-11

    A process for recovering valuable liquid hydrocarbons from coking coal, mineral coal, or oil shale through treatment with hydrogen under pressure at elevated temperature is described. Catalysts and grinding oil may be used in the process if necessary. The process provides for deashing the coal prior to hydrogenation and for preventing the coking and swelling of the deashed material. During the treatment with hydrogen, the coal is either mixed with coal low in bituminous material, such as lean coal or active coal, as a diluent or the bituminous constituents which cause the coking and swelling are removed by extraction with solvents. (BLM)

  10. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    Science.gov (United States)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2017-08-01

    Machining of steel material having hardness above 45 HRC (Hardness-Rockwell C) is referred to as hard turning. There are numerous models which should be scrutinized and implemented to gain optimum performance of hard turning. Various models in hard turning by cubic boron nitride tool have been reviewed, in an attempt to utilize appropriate empirical and analytical models. Validation of steady state flank and crater wear model, Usui's wear model, forces due to oblique cutting theory, extended Lee and Shaffer's force model, chip formation and progressive flank wear have been depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool force based on the different cutting conditions and tool geometries so that the appropriate model can be used according to user requirements in hard turning.
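    Usui's wear model mentioned above is a characteristic equation of the form dW/dt = A·σ_t·V_s·exp(−B/θ), where σ_t is the normal stress on the tool face, V_s the sliding velocity, and θ the absolute interface temperature. A minimal sketch follows; the constants A and B are placeholders, not calibrated CBN values from any of the reviewed papers.

```python
# Sketch of Usui's wear-rate characteristic equation,
#   dW/dt = A * sigma_t * V_s * exp(-B / theta),
# with sigma_t the normal stress on the tool face (Pa), V_s the sliding
# velocity (m/s), and theta the absolute interface temperature (K).
# A and B below are illustrative placeholders, not calibrated constants.
import math

def usui_wear_rate(sigma_t: float, v_s: float, theta_k: float,
                   A: float = 1e-8, B: float = 5000.0) -> float:
    """Wear rate (arbitrary units) per Usui's characteristic equation."""
    return A * sigma_t * v_s * math.exp(-B / theta_k)

# At fixed stress and speed, wear rate rises sharply with temperature:
hot = usui_wear_rate(1.5e9, 2.0, 1200.0)
cold = usui_wear_rate(1.5e9, 2.0, 900.0)
```

The Arrhenius-type exponential is what makes interface temperature, rather than force alone, the dominant driver of crater and flank wear in hard turning.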

  11. Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery

    Science.gov (United States)

    Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.

    2017-12-01

    Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we've developed a suite of analytical tools to support an integrated data driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.

  12. Development of computer-based analytical tool for assessing physical protection system

    International Nuclear Information System (INIS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way of evaluating likely threat scenarios. Several tools, such as EASI and SAPE, are readily available, but for our research purposes a tool that can be customized and further enhanced is more suitable. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system’s detection, delay, and response. The tool is capable of identifying the most critical path and quantifying the probability of effectiveness of the system as a performance measure

  13. Development of computer-based analytical tool for assessing physical protection system

    Energy Technology Data Exchange (ETDEWEB)

    Mardhi, Alim, E-mail: alim-m@batan.go.id [National Nuclear Energy Agency Indonesia, (BATAN), PUSPIPTEK area, Building 80, Serpong, Tangerang Selatan, Banten (Indonesia); Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand); Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com [Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand)

    2016-01-22

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way of evaluating likely threat scenarios. Several tools, such as EASI and SAPE, are readily available, but for our research purposes a tool that can be customized and further enhanced is more suitable. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system’s detection, delay, and response. The tool is capable of identifying the most critical path and quantifying the probability of effectiveness of the system as a performance measure.

  14. Development of computer-based analytical tool for assessing physical protection system

    Science.gov (United States)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way of evaluating likely threat scenarios. Several tools, such as EASI and SAPE, are readily available, but for our research purposes a tool that can be customized and further enhanced is more suitable. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool is capable of identifying the most critical path and quantifying the probability of effectiveness of the system as a performance measure.
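    The network approach these three records describe can be sketched with a small graph model: each edge of the facility layout carries a detection probability, and the most critical path is the one that minimizes cumulative detection. Running a shortest-path search on edge weights −ln(1 − P_D) achieves this, since those weights add where survival probabilities multiply. The layout and probabilities below are invented for illustration and are not from the paper.

```python
# Minimal sketch of adversary-path analysis on a network model.
# Each edge carries a detection probability P_D; the most critical path
# minimizes cumulative detection. Dijkstra on weights -ln(1 - P_D) finds it,
# because these weights add where non-detection probabilities multiply.
import heapq, math

def most_critical_path(graph, start, target):
    """graph: {node: [(neighbor, p_detect), ...]} with 0 <= p_detect < 1.
    Returns (overall detection probability on weakest path, node list)."""
    pq = [(0.0, start, [start])]
    best = {}
    while pq:
        w, node, path = heapq.heappop(pq)
        if node == target:
            return 1.0 - math.exp(-w), path
        if best.get(node, float("inf")) <= w:
            continue
        best[node] = w
        for nxt, p in graph.get(node, []):
            heapq.heappush(pq, (w - math.log(1.0 - p), nxt, path + [nxt]))
    return None

# Hypothetical facility layout: two routes from offsite to the target asset.
layout = {
    "offsite":  [("fence", 0.3), ("gate", 0.8)],
    "fence":    [("building", 0.6)],
    "gate":     [("building", 0.5)],
    "building": [("target", 0.9)],
}
p_detect, path = most_critical_path(layout, "offsite", "target")
```

The returned path is the system's weakest link; upgrading detection along it raises the overall probability of effectiveness the abstract refers to.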

  15. Experiments with Analytic Centers: A confluence of data, tools and help in using them.

    Science.gov (United States)

    Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.

    2017-12-01

    Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged: different science communities each apply their own selection of tools to the data to produce scientific results, and infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of each particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic center process; this paper will summarize the results and indicate a direction for future infusion attempts.

  16. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    Science.gov (United States)

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.

  17. Facebook: A Potentially Valuable Educational Tool?

    Science.gov (United States)

    Voivonta, Theodora; Avraamidou, Lucy

    2018-01-01

    This paper is concerned with the educational value of Facebook and specifically how it can be used in formal educational settings. As such, it provides a review of existing literature of how Facebook is used in higher education paying emphasis on the scope of its use and the outcomes achieved. As evident in existing literature, Facebook has been…

  18. Facebook : A potentially valuable educational tool?

    NARCIS (Netherlands)

    Voivonta, Theodora; Avraamidou, Lucy

    2018-01-01

    This paper is concerned with the educational value of Facebook and specifically how it can be used in formal educational settings. As such, it provides a review of existing literature of how Facebook is used in higher education paying emphasis on the scope of its use and the outcomes achieved. As

  19. ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package

    Science.gov (United States)

    Jaggi, S.

    1993-01-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs, analysis, and establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package called 'Analytical Tools for Thermal InfraRed Engineering' - ATTIRE, simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD) etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters.
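    One of the system-level figures listed in this abstract, NETD, follows from the others as NETD = NER / (dL/dT): the noise-equivalent radiance divided by the rate of change of blackbody radiance with scene temperature. The single-wavelength sketch below is a simplification for illustration; ATTIRE itself integrates over the sensor's spectral band, and the NER value used is an assumed placeholder.

```python
# Sketch of NETD derivation from NER (simplified to a single wavelength;
# a real thermal-IR sensor model integrates over the spectral band).
# NETD = NER / (dL/dT), with dL/dT taken from the Planck function.
import math

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wl_m: float, t_k: float) -> float:
    """Blackbody spectral radiance, W / (m^2 * sr * m of wavelength)."""
    x = H * C / (wl_m * KB * t_k)
    return (2.0 * H * C * C / wl_m**5) / math.expm1(x)

def netd(ner: float, wl_m: float = 10e-6, t_k: float = 300.0, dt: float = 0.01) -> float:
    """NETD = NER / (dL/dT); dL/dT by central difference about scene temp t_k."""
    dldt = (planck_radiance(wl_m, t_k + dt) - planck_radiance(wl_m, t_k - dt)) / (2 * dt)
    return ner / dldt
```

Because NETD scales linearly with NER, halving the system noise halves the smallest resolvable temperature difference, which is why the subsystem-level parameters feed so directly into this system-level figure.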

  20. Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bonvini, Marco [Whisker Labs, Oakland, CA (United States); Piette, Mary Ann [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Page, Janie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lin, Guanjing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hu, R. Lilly [Univ. of California, Berkeley, CA (United States)

    2017-08-11

    Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models to characterize system and building behavior. In contrast, physics-based modeling uses first-principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.

  1. Development of the oil spill response cost-effectiveness analytical tool

    International Nuclear Information System (INIS)

    Etkin, D.S.; Welch, J.

    2005-01-01

    Decision-making during oil spill response operations or contingency planning requires balancing the need to remove as much oil as possible from the environment with the desire to minimize the impact of response operations on the environment they are intended to protect. This paper discussed the creation of a computer tool developed to help in planning and decision-making during response operations. The Oil Spill Response Cost-Effectiveness Analytical Tool (OSRCEAT) was developed to compare the costs of response with the benefits of response in both hypothetical and actual oil spills. The computer-based analytical tool can assist responders and contingency planners in decision-making processes as well as serve as a basis for discussion in the evaluation of response options. Using inputs on spill parameters, location and response options, OSRCEAT can calculate response cost, costs of environmental and socioeconomic impacts of the oil spill and response impacts. Oil damages without any response are contrasted with oil damages with response, with expected improvements. Response damages are subtracted from the difference in damages with and without response in order to derive a more accurate response benefit. An OSRCEAT user can test various response options to compare potential benefits in order to maximize response benefit. OSRCEAT is best used to compare and contrast the relative benefits and costs of various response options. 50 refs., 19 tabs., 2 figs
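    The benefit calculation this abstract describes is a simple net-benefit subtraction: the damages avoided by responding, minus the damages caused by the response itself. The sketch below encodes that arithmetic; the dollar figures are invented placeholders, not OSRCEAT outputs.

```python
# Arithmetic at the core of OSRCEAT's comparison, as described above:
# response benefit = (damages without response - damages with response)
#                    - damages caused by the response itself.
def net_response_benefit(damage_no_response: float,
                         damage_with_response: float,
                         response_damage: float) -> float:
    """Damages avoided by responding, net of the response's own impact."""
    return (damage_no_response - damage_with_response) - response_damage

# Hypothetical spill: responding avoids $6M of damage but causes $1.5M itself.
benefit = net_response_benefit(10_000_000, 4_000_000, 1_500_000)
```

A negative result flags a response option that does more harm than good, which is exactly the trade-off the tool is meant to surface for planners.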

  2. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    Science.gov (United States)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  3. Comparison of Left Ventricular Hypertrophy by Electrocardiography and Echocardiography in Children Using Analytics Tool.

    Science.gov (United States)

    Tague, Lauren; Wiggs, Justin; Li, Qianxi; McCarter, Robert; Sherwin, Elizabeth; Weinberg, Jacqueline; Sable, Craig

    2018-05-17

    Left ventricular hypertrophy (LVH) is a common finding on pediatric electrocardiography (ECG) leading to many referrals for echocardiography (echo). This study utilizes a novel analytics tool that combines ECG and echo databases to evaluate ECG as a screening tool for LVH. SQL Server 2012 data warehouse incorporated ECG and echo databases for all patients from a single institution from 2006 to 2016. Customized queries identified patients 0-18 years old with LVH on ECG and an echo performed within 24 h. Using data visualization (Tableau) and analytic (Stata 14) software, ECG and echo findings were compared. Of 437,699 encounters, 4637 met inclusion criteria. ECG had high sensitivity (≥ 90%) but poor specificity (43%), and low positive predictive value (< 20%) for echo abnormalities. ECG performed only 11-22% better than chance (AROC = 0.50). 83% of subjects with LVH on ECG had normal left ventricle (LV) structure and size on echo. African-Americans with LVH were least likely to have an abnormal echo. There was a low correlation between V 6 R on ECG and echo-derived Z score of left ventricle diastolic diameter (r = 0.14) and LV mass index (r = 0.24). The data analytics client was able to mine a database of ECG and echo reports, comparing LVH by ECG and LV measurements and qualitative findings by echo, identifying an abnormal LV by echo in only 17% of cases with LVH on ECG. This novel tool is useful for rapid data mining for both clinical and research endeavors.
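
    The screening metrics the study reports follow from a standard 2×2 confusion table. The counts below are hypothetical, chosen only to reproduce the reported pattern (high sensitivity, ~43% specificity, PPV under 20%):

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard 2x2 screening-test metrics, as used to evaluate
    ECG-detected LVH against echo findings."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts: ECG flags most true abnormals (high sensitivity)
# but also flags many normals (poor specificity, low PPV).
m = screening_metrics(tp=90, fp=570, fn=10, tn=430)
print({k: round(v, 2) for k, v in m.items()})
```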

  4. ANALYTICAL MODEL FOR LATHE TOOL DISPLACEMENTS CALCULUS IN THE MANUFACTURING PROCESS

    Directory of Open Access Journals (Sweden)

    Catălin ROŞU

    2014-01-01

    Full Text Available In this paper, we present an analytical model for calculating lathe tool displacements in the manufacturing process. We present the methodology for the displacement calculus step by step; in the end we implement these relations in a program for automatic calculation and draw conclusions. Only the effects of the bending moments are taken into account (because these introduce the largest displacements). The simplifying assumptions and the calculus relations for the displacements (linear and angular ones) are presented in an original way.

  5. FIGURED WORLDS AS AN ANALYTIC AND METHODOLOGICAL TOOL IN PROFESSIONAL TEACHER DEVELOPMENT

    DEFF Research Database (Denmark)

    Møller, Hanne; Brok, Lene Storgaard

    . Hasse (2015) and Holland (1998) have inspired our study; i.e., learning is conceptualized as a social phenomenon, implying that contexts of learning are decisive for learner identity. The concept of Figured Worlds is used to understand the development and the social constitution of emergent interactions......,“(Holland et al., 1998, p. 52) and gives a framework for understanding meaning-making in particular pedagogical settings. We exemplify our use of the term Figured Worlds, both as an analytic and methodological tool for empirical studies in kindergarten and school. Based on data sources, such as field notes...

  6. Predictive analytics tools to adjust and monitor performance metrics for the ATLAS Production System

    CERN Document Server

    Barreiro Megino, Fernando Harald; The ATLAS collaboration

    2017-01-01

    Having information such as an estimate of the processing time or the possibility of a system outage (abnormal behaviour) helps operators monitor system performance and predict its next state. The current cyber-infrastructure presents computing conditions in which contention for resources among high-priority data analyses happens routinely, which might lead to significant workload and data handling interruptions. The inability to monitor and predict the behaviour of the analysis process (its duration) and of the system's state itself motivated the design of built-in situational awareness analytic tools.

  7. Kalisphera: an analytical tool to reproduce the partial volume effect of spheres imaged in 3D

    International Nuclear Information System (INIS)

    Tengattini, Alessandro; Andò, Edward

    2015-01-01

    In experimental mechanics, where 3D imaging is having a profound effect, spheres are commonly adopted for their simplicity and for the ease of their modeling. In this contribution we develop an analytical tool, ‘kalisphera’, to produce 3D raster images of spheres including their partial volume effect. This allows us to evaluate the metrological performance of existing image-based measurement techniques (knowing a priori the ground truth). An advanced application of ‘kalisphera’ is developed here to identify and accurately characterize spheres in real 3D x-ray tomography images with the objective of improving trinarization and contact detection. The effect of the common experimental imperfections is assessed and the overall performance of the tool tested on real images. (paper)

  8. Application of quantum dots as analytical tools in automated chemical analysis: A review

    International Nuclear Information System (INIS)

    Frigerio, Christian; Ribeiro, David S.M.; Rodrigues, S. Sofia M.; Abreu, Vera L.R.G.; Barbosa, João A.C.; Prior, João A.V.; Marques, Karine L.; Santos, João L.M.

    2012-01-01

    Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals or quantum dots (QDs) are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding new important fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis and food quality control. Despite the enormous variety of applications developed so far, the automation of QD-based analytical methodologies with tools such as continuous flow analysis and related techniques is hitherto very limited. Automation would make it possible to exploit particular features of the nanocrystals: their versatile surface chemistry and ligand-binding ability, their aptitude to generate reactive species, and the possibility of encapsulating them in different materials while retaining native luminescence, which provides the means to implement renewable chemosensors or to use harsher, even stability-impairing, reaction conditions. In this review, we provide insights into the analytical potential of quantum dots focusing on prospects of their utilisation in automated flow-based and flow-related approaches and the future outlook of QDs applications in chemical analysis.

  9. What makes a sustainability tool valuable, practical and useful in real-world healthcare practice? A mixed-methods study on the development of the Long Term Success Tool in Northwest London.

    Science.gov (United States)

    Lennox, Laura; Doyle, Cathal; Reed, Julie E; Bell, Derek

    2017-09-24

    Although improvement initiatives show benefits to patient care, they often fail to sustain. Models and frameworks exist to address this challenge, but issues with design, clarity and usability have been barriers to use in healthcare settings. This work aimed to collaborate with stakeholders to develop a sustainability tool relevant to people in healthcare settings and practical for use in improvement initiatives. Tool development was conducted in six stages. A scoping literature review, group discussions and a stakeholder engagement event explored literature findings and their resonance with stakeholders in healthcare settings. Interviews, small-scale trialling and piloting explored the design and tested the practicality of the tool in improvement initiatives. National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care for Northwest London (CLAHRC NWL). CLAHRC NWL improvement initiative teams and staff. The iterative design process and engagement of stakeholders informed the articulation of the sustainability factors identified from the literature and guided tool design for practical application. Key iterations of factors and tool design are discussed. From the development process, the Long Term Success Tool (LTST) has been designed. The Tool supports those implementing improvements to reflect on 12 sustainability factors and identify risks, increasing the chances of achieving sustainability over time. The Tool is designed to provide a platform for improvement teams to share their own views on sustainability as well as learn about the different views held within their team to prompt discussion and actions. The development of the LTST has reinforced the importance of working with stakeholders to design strategies which respond to their needs and preferences and can practically be implemented in real-world settings. Further research is required to study the use and effectiveness of the tool in practice and assess engagement with

  10. XCluSim: a visual analytics tool for interactively comparing multiple clustering results of bioinformatics data

    Science.gov (United States)

    2015-01-01

    Background Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. Results In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Conclusions Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results. PMID:26328893

  11. Online Analytical Processing (OLAP): A Fast and Effective Data Mining Tool for Gene Expression Databases

    Directory of Open Access Journals (Sweden)

    Alkharouf Nadim W.

    2005-01-01

    Full Text Available Gene expression databases contain a wealth of information, but current data mining tools are limited in their speed and effectiveness in extracting meaningful biological knowledge from them. Online analytical processing (OLAP can be used as a supplement to cluster analysis for fast and effective data mining of gene expression databases. We used Analysis Services 2000, a product that ships with SQLServer2000, to construct an OLAP cube that was used to mine a time series experiment designed to identify genes associated with resistance of soybean to the soybean cyst nematode, a devastating pest of soybean. The data for these experiments is stored in the soybean genomics and microarray database (SGMD. A number of candidate resistance genes and pathways were found. Compared to traditional cluster analysis of gene expression data, OLAP was more effective and faster in finding biologically meaningful information. OLAP is available from a number of vendors and can work with any relational database management system through OLE DB.

  12. Capturing student mathematical engagement through differently enacted classroom practices: applying a modification of Watson's analytical tool

    Science.gov (United States)

    Patahuddin, Sitti Maesuri; Puteri, Indira; Lowrie, Tom; Logan, Tracy; Rika, Baiq

    2018-04-01

    This study examined student mathematical engagement through the intended and enacted lessons taught by two teachers in two different middle schools in Indonesia. The intended lesson was developed using the ELPSA learning design to promote mathematical engagement. Based on the premise that students will react to the mathematical tasks in the forms of words and actions, the analysis focused on identifying the types of mathematical engagement promoted through the intended lesson and performed by students during the lesson. Using a modified version of Watson's (2007) analytical tool, students' engagement was captured from what the participants did or said mathematically. We found that teachers' enacted practices had an influence on student mathematical engagement. The teacher who demonstrated content in explicit ways tended to limit the richness of the engagement; whereas the teacher who presented activities in an open-ended manner fostered engagement.

  13. Online analytical processing (OLAP): a fast and effective data mining tool for gene expression databases.

    Science.gov (United States)

    Alkharouf, Nadim W; Jamison, D Curtis; Matthews, Benjamin F

    2005-06-30

    Gene expression databases contain a wealth of information, but current data mining tools are limited in their speed and effectiveness in extracting meaningful biological knowledge from them. Online analytical processing (OLAP) can be used as a supplement to cluster analysis for fast and effective data mining of gene expression databases. We used Analysis Services 2000, a product that ships with SQLServer2000, to construct an OLAP cube that was used to mine a time series experiment designed to identify genes associated with resistance of soybean to the soybean cyst nematode, a devastating pest of soybean. The data for these experiments is stored in the soybean genomics and microarray database (SGMD). A number of candidate resistance genes and pathways were found. Compared to traditional cluster analysis of gene expression data, OLAP was more effective and faster in finding biologically meaningful information. OLAP is available from a number of vendors and can work with any relational database management system through OLE DB.
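
    An OLAP cube pre-aggregates measures over dimensions (here, gene × time point), which is what makes slicing and drilling fast. A toy roll-up in plain Python over hypothetical expression records illustrates the operation the cube performs; the data and helper are illustrative, not from SGMD:

```python
from collections import defaultdict

# Hypothetical expression records: (gene, timepoint_h, replicate, expression)
records = [
    ("G1", 6, 1, 2.0), ("G1", 6, 2, 2.4),
    ("G1", 24, 1, 5.1), ("G1", 24, 2, 4.9),
    ("G2", 6, 1, 1.0), ("G2", 24, 1, 1.1),
]

def rollup_mean(records, dims):
    """Aggregate mean expression over the chosen dimensions, as an OLAP
    cube would pre-compute. dims gives indices into each record tuple."""
    sums, counts = defaultdict(float), defaultdict(int)
    for rec in records:
        key = tuple(rec[d] for d in dims)
        sums[key] += rec[3]
        counts[key] += 1
    return {k: sums[k] / counts[k] for k in sums}

# Mean expression by (gene, timepoint): a 2-D slice of the cube.
print(rollup_mean(records, dims=(0, 1)))
```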

  14. Experimental anti-GBM nephritis as an analytical tool for studying spontaneous lupus nephritis.

    Science.gov (United States)

    Du, Yong; Fu, Yuyang; Mohan, Chandra

    2008-01-01

    Systemic lupus erythematosus (SLE) is an autoimmune disease that results in immune-mediated damage to multiple organs. Among these, kidney involvement is the most common and fatal. Spontaneous lupus nephritis (SLN) in mouse models has provided valuable insights into the underlying mechanisms of human lupus nephritis. However, SLN in mouse models takes 6-12 months to manifest; hence there is clearly the need for a mouse model that can be used to unveil the pathogenic processes that lead to immune nephritis over a shorter time frame. In this article more than 25 different molecules are reviewed that have been studied both in the anti-glomerular basement membrane (anti-GBM) model and in SLN and it was found that these molecules influence both diseases in a parallel fashion, suggesting that the two disease settings share common molecular mechanisms. Based on these observations, the authors believe the experimental anti-GBM disease model might be one of the best tools currently available for uncovering the downstream molecular mechanisms leading to SLN.

  15. Stable isotope ratio analysis: A potential analytical tool for the authentication of South African lamb meat.

    Science.gov (United States)

    Erasmus, Sara Wilhelmina; Muller, Magdalena; van der Rijst, Marieta; Hoffman, Louwrens Christiaan

    2016-02-01

    Stable isotope ratios ((13)C/(12)C and (15)N/(14)N) of South African Dorper lambs from farms with different vegetation types were measured by isotope ratio mass spectrometry (IRMS), to evaluate it as a tool for the authentication of origin and feeding regime. Homogenised and defatted meat of the Longissimus lumborum (LL) muscle of lambs from seven different farms was assessed. The δ(13)C values were affected by the origin of the meat, mainly reflecting the diet. The Rûens and Free State farms had the lowest (p ⩽ 0.05) δ(15)N values, followed by the Northern Cape farms, with Hantam Karoo/Calvinia having the highest δ(15)N values. Discriminant analysis showed δ(13)C and δ(15)N differences as promising results for the use of IRMS as a reliable analytical tool for lamb meat authentication. The results suggest that diet, linked to origin, is an important factor to consider regarding region of origin classification for South African lamb. Copyright © 2015 Elsevier Ltd. All rights reserved.
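
    The classification step behind such discriminant analysis can be illustrated with a nearest-centroid rule, a deliberately minimal stand-in for the paper's method; the centroid values and region labels below are hypothetical, not the study's measurements:

```python
import math

# Hypothetical (d13C, d15N) centroids per region. The study itself uses
# discriminant analysis; nearest-centroid is a simpler illustrative rule.
centroids = {
    "Rûens": (-26.5, 6.0),
    "Free State": (-14.0, 6.2),
    "Hantam Karoo": (-24.0, 11.5),
}

def classify_origin(d13c, d15n):
    """Assign a sample to the region with the nearest isotopic centroid."""
    return min(centroids, key=lambda r: math.dist((d13c, d15n), centroids[r]))

print(classify_origin(-24.3, 11.0))
```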

  16. VisualUrText: A Text Analytics Tool for Unstructured Textual Data

    Science.gov (United States)

    Zainol, Zuraini; Jaymes, Mohd T. H.; Nohuddin, Puteri N. E.

    2018-05-01

    The growing amount of unstructured text on the Internet is tremendous. Text repositories come from Web 2.0, business intelligence and social networking applications. It is also believed that 80-90% of future data growth will be in the form of unstructured text databases that may potentially contain interesting patterns and trends. Text Mining is a well-known technique for discovering interesting patterns and trends (non-trivial knowledge) in massive unstructured text data. Text Mining covers multidisciplinary fields involving information retrieval (IR), text analysis, natural language processing (NLP), data mining, machine learning, statistics and computational linguistics. This paper discusses the development of a text analytics tool that is proficient in extracting, processing, and analyzing unstructured text data and visualizing the cleaned text in multiple forms such as a Document Term Matrix (DTM), Frequency Graph, Network Analysis Graph, Word Cloud and Dendrogram. This tool, VisualUrText, is developed to assist students and researchers in extracting interesting patterns and trends in document analyses.
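
    The Document Term Matrix named above is the core structure such tools derive frequency graphs and word clouds from. A minimal sketch of its construction, using plain Python and toy documents (not the tool's actual pipeline):

```python
import re
from collections import Counter

def document_term_matrix(docs):
    """Build a simple Document-Term Matrix: one row per document,
    one column per vocabulary term, entries are raw term counts."""
    tokenized = [re.findall(r"[a-z]+", d.lower()) for d in docs]
    vocab = sorted(set(t for toks in tokenized for t in toks))
    rows = [[Counter(toks)[term] for term in vocab] for toks in tokenized]
    return vocab, rows

vocab, dtm = document_term_matrix(["Text mining finds patterns.",
                                   "Patterns and trends in text."])
print(vocab)
print(dtm)
```

    Column sums of this matrix give the corpus-wide term frequencies a word cloud scales by.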

  17. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    Science.gov (United States)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

    Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The 'recordr' library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software

  18. Studying Behaviors Among Neurosurgery Residents Using Web 2.0 Analytic Tools.

    Science.gov (United States)

    Davidson, Benjamin; Alotaibi, Naif M; Guha, Daipayan; Amaral, Sandi; Kulkarni, Abhaya V; Lozano, Andres M

    Web 2.0 technologies (e.g., blogs, social networks, and wikis) are increasingly being used by medical schools and postgraduate training programs as tools for information dissemination. These technologies offer the unique opportunity to track metrics of user engagement and interaction. Here, we employ Web 2.0 tools to assess academic behaviors among neurosurgery residents. We performed a retrospective review of all educational lectures, part of the core Neurosurgery Residency curriculum at the University of Toronto, posted on our teaching website (www.TheBrainSchool.net). Our website was developed using publicly available Web 2.0 platforms. Lecture usage was assessed by the number of clicks, and associations were explored with lecturer academic position, timing of examinations, and lecture/subspecialty topic. The overall number of clicks on 77 lectures was 1079. Most of these clicks were occurring during the in-training examination month (43%). Click numbers were significantly higher on lectures presented by faculty (mean = 18.6, standard deviation ± 4.1) compared to those delivered by residents (mean = 8.4, standard deviation ± 2.1) (p = 0.031). Lectures covering topics in functional neurosurgery received the most clicks (47%), followed by pediatric neurosurgery (22%). This study demonstrates the value of Web 2.0 analytic tools in examining resident study behavior. Residents tend to "cram" by downloading lectures in the same month of training examinations and display a preference for faculty-delivered lectures. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  19. Measuring the bright side of being blue: a new tool for assessing analytical rumination in depression.

    Directory of Open Access Journals (Sweden)

    Skye P Barbic

    Full Text Available BACKGROUND: Diagnosis and management of depression occurs frequently in the primary care setting. Current diagnostic and treatment-management practices across clinical populations focus on eliminating signs and symptoms of depression. However, there is debate that some interventions may pathologize normal, adaptive responses to stressors. Analytical rumination (AR is an example of an adaptive response of depression that is characterized by enhanced cognitive function to help an individual focus on, analyze, and solve problems. To date, research on AR has been hampered by the lack of theoretically-derived and psychometrically sound instruments. This study developed and tested a clinically meaningful measure of AR. METHODS: Using expert panels and an extensive literature review, we developed a conceptual framework for AR and 22 candidate items. Items were field tested with 579 young adults; 140 of whom completed the items at a second time point. We used Rasch measurement methods to construct and test the item set; and traditional psychometric analyses to compare items to existing rating scales. RESULTS: Data were high quality (0.81; evidence for divergent validity. Evidence of misfit for 2 items suggested that a 20-item scale with 4-point response categories best captured the concept of AR, fitting the Rasch model (χ2 = 95.26; df = 76, p = 0.07), with high reliability (rp = 0.86), an ordered response scale structure, and no item bias (gender, age, time). CONCLUSION: Our study provides evidence for a 20-item Analytical Rumination Questionnaire (ARQ that can be used to quantify AR in adults who experience symptoms of depression. The ARQ is psychometrically robust and a clinically useful tool for the assessment and improvement of depression in the primary care setting. Future work is needed to establish the validity of this measure in people with major depression.

  20. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  1. Analytical tools and functions of GIS in the process control and decision support of mining company

    Directory of Open Access Journals (Sweden)

    Semrád Peter

    2001-12-01

    Full Text Available The development of computer techniques, the increase in demand for professional and fast data processing, and the need for fluent and efficient gaining, exchanging and providing of information strongly influenced the formation of a new generation of information technologies: Geographic Information Systems (GIS), which arose in the second half of the twentieth century. Advancement in this area is still progressing, and GIS gradually find application in individual fields where they play a great role in process control and decision support. Nowadays, there are applications in mining and geology, where they are used especially for processing and evaluating mining-geological documentation, optimization of mining and technical processes, and planning, distributing and managing of mining, as well as for economic analyses that are important for investment decisions in the mining business. GIS are systems for the effective keeping, updating, processing, analysing, modelling, simulating and presenting of geographically oriented information. We can identify them as computer systems helping to solve real problems that would normally require a human expert. Well-equipped GIS have graphic abilities and also manage descriptive (attribute) data. They are able to secure the mutual connection between graphical and descriptive data and, in addition, command a countless number of functions that enable the execution of spatial analyses. This fact is very important in mining and geological applications. Mostly exploited are geostatistical analyses (e.g., modelling the distribution of valuable and harmful components of a mineral resource in a deposit), surface modelling and surface model analysis (e.g., modelling the subsidence of a mining territory), and different methods of creating spatial and attribute queries on the database for seeking necessary data (e.g., to find all mining blocks of a deposit that meet required conditions and to

  2. Time-resolved fluorescence microscopy (FLIM) as an analytical tool in skin nanomedicine.

    Science.gov (United States)

    Alexiev, Ulrike; Volz, Pierre; Boreham, Alexander; Brodwolf, Robert

    2017-07-01

    The emerging field of nanomedicine provides new approaches for the diagnosis and treatment of diseases, for symptom relief, and for monitoring of disease progression. Topical application of drug-loaded nanoparticles for the treatment of skin disorders is a promising strategy to overcome the stratum corneum, the upper layer of the skin, which represents an effective physical and biochemical barrier. The understanding of drug penetration into skin, and of the enhanced penetration facilitated by nanocarriers, requires analytical tools that ideally allow visualization of the skin, its morphology, the drug carriers, drugs, their transport across the skin and possible interactions, as well as effects of the nanocarriers within the different skin layers. Here, we review some recent developments in the field of fluorescence microscopy, namely Fluorescence Lifetime Imaging Microscopy (FLIM), for improved characterization of nanocarriers, their interactions and penetration into skin. In particular, FLIM allows for the discrimination of target molecules, e.g. fluorescently tagged nanocarriers, against the autofluorescent tissue background and, due to the environmental sensitivity of the fluorescence lifetime, also offers insights into the local environment of the nanoparticle and its interactions with other biomolecules. Thus, FLIM shows the potential to overcome several limits of intensity-based microscopy. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Fourier transform ion cyclotron resonance mass spectrometry: the analytical tool for heavy oil and bitumen characterization

    Energy Technology Data Exchange (ETDEWEB)

    Oldenburg, Thomas B.P; Brown, Melisa; Hsieh, Ben; Larter, Steve [Petroleum Reservoir Group (prg), Department of Geoscience, University of Calgary, Alberta (Canada)

    2011-07-01

    Fourier Transform Ion Cyclotron Resonance Mass Spectrometry (FTICRMS), developed in the 1970s by Marshall and Comisarow at the University of British Columbia, has become a commercially available tool capable of analyzing several hundred thousand components in a petroleum mixture at once. This analytical technology will probably usher in a dramatic revolution in geochemical capability, equal to or greater than the molecular revolution that occurred when GCMS technologies became cheaply available. The molecular resolution and information content given by FTICRMS petroleum analysis can be compared to the information in the human genome. With current GCMS-based petroleum geochemical protocols, perhaps a few hundred components can be quantitatively determined; with FTICRMS, 1000 times that number of components could possibly be resolved. However, fluid and other properties depend on the interaction of this multitude of hydrocarbon and non-hydrocarbon components, not on the components themselves, and access to the full potential of this new petroleomics will depend on the definition of this interactome.

  4. The 'secureplan' bomb utility: A PC-based analytic tool for bomb defense

    International Nuclear Information System (INIS)

    Massa, R.J.

    1987-01-01

    This paper illustrates a recently developed, PC-based software system for simulating the effects of an infinite variety of hypothetical bomb blasts on structures and personnel in the immediate vicinity of such blasts. The system incorporates two basic rectangular geometries in which blast assessments can be made - an external configuration (highly vented) and an internal configuration (vented and unvented). A variety of explosives can be used - each is translated to an equivalent TNT weight. Provisions in the program account for bomb cases (person, satchel, case and vehicle), mixes of explosives and shrapnel aggregates, and detonation altitudes. The software permits architects, engineers, security personnel and facility managers, without specific knowledge of explosives, to incorporate realistic construction hardening, screening programs, barriers and stand-off provisions in the design and/or operation of diverse facilities. System outputs - generally represented as peak incident or reflected overpressures or impulses - are both graphic and analytic and integrate damage-threshold data for common construction materials, including window glazing. The effects of bomb blasts on humans are estimated in terms of temporary and permanent hearing damage, lung damage (lethality) and whole-body translation injury. The software system has been used in the field in providing bomb defense services to a number of commercial clients since July 1986. In addition to the design of the siting, screening and hardening components of bomb defense programs, the software has proven very useful in post-incident analysis and repair scenarios, and as a teaching tool for bomb defense training.

  5. Impact of population and economic growth on carbon emissions in Taiwan using an analytic tool STIRPAT

    Directory of Open Access Journals (Sweden)

    Jong-Chao Yeh

    2017-01-01

    Full Text Available Carbon emission has increasingly become an issue of global concern because of climate change. Unfortunately, Taiwan was listed among the top 20 countries for carbon emissions in 2014. In order to provide appropriate measures to control carbon emissions, there is an urgent need to address how such factors as population and economic growth impact the emission of carbon dioxide in a developing country. In addition to total population, both the percentage of population living in urban areas (i.e., the urbanization percentage) and the percentage of non-dependent population may also serve as limiting factors. On the other hand, the total energy-driven gross domestic product (GDP) and the percentage of GDP generated by the manufacturing industries are assessed to see their respective degrees of impact on carbon emission. Therefore, based on national data for the period 1990-2014 in Taiwan, the analytic tool Stochastic Impacts by Regression on Population, Affluence and Technology (STIRPAT) was employed to see how well those aforementioned factors can describe their individual potential impact on global warming, which is measured by the total amount of carbon emission into the atmosphere. Seven scenarios of the STIRPAT model were proposed and tested statistically for the significance of each proposed model. As a result, two models were suggested to predict the impact of carbon emission due to population and economic growth by the year 2025 in Taiwan.
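
    STIRPAT is the stochastic form of IPAT, I = a·P^b·A^c·T^d·e, which becomes linear after taking logarithms, so the elasticities b, c, d can be estimated by ordinary least squares. A minimal sketch with synthetic data (the numbers are illustrative, not the Taiwanese series used in the paper):

```python
import numpy as np

def fit_stirpat(impact, population, affluence, technology):
    """Fit ln(I) = a + b*ln(P) + c*ln(A) + d*ln(T) by ordinary least
    squares; b, c, d are elasticities: the % change in impact per 1%
    change in each driver."""
    X = np.column_stack([np.ones(len(impact)),
                         np.log(population), np.log(affluence), np.log(technology)])
    coef, *_ = np.linalg.lstsq(X, np.log(impact), rcond=None)
    return coef

# Synthetic 25-year record with known elasticities b=1.2, c=0.8, d=0.3:
rng = np.random.default_rng(0)
P = rng.uniform(20e6, 24e6, 25)           # population
A = rng.uniform(10e3, 25e3, 25)           # GDP per capita (affluence)
T = rng.uniform(0.2, 0.4, 25)             # manufacturing share of GDP (technology)
I = np.exp(0.5) * P**1.2 * A**0.8 * T**0.3

a, b, c, d = fit_stirpat(I, P, A, T)
print(round(b, 3), round(c, 3), round(d, 3))
```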

  6. Surveillance of emerging drugs of abuse in Hong Kong: validation of an analytical tool.

    Science.gov (United States)

    Tang, Magdalene H Y; Ching, C K; Tse, M L; Ng, Carol; Lee, Caroline; Chong, Y K; Wong, Watson; Mak, Tony W L

    2015-04-01

    To validate a locally developed chromatography-based method to monitor emerging drugs of abuse whilst performing regular drug testing in abusers. Cross-sectional study. Eleven regional hospitals, seven social service units, and a tertiary level clinical toxicology laboratory in Hong Kong. A total of 972 drug abusers and high-risk individuals were recruited from acute, rehabilitation, and high-risk settings between 1 November 2011 and 31 July 2013. A subset of the participants was of South Asian ethnicity. In total, 2000 urine or hair specimens were collected. Proof of concept that surveillance of emerging drugs of abuse can be performed whilst conducting routine drug of abuse testing in patients. The method was successfully applied to 2000 samples with three emerging drugs of abuse detected in five samples: PMMA (paramethoxymethamphetamine), TFMPP [1-(3-trifluoromethylphenyl)piperazine], and methcathinone. The method also detected conventional drugs of abuse, with codeine, methadone, heroin, methamphetamine, and ketamine being the most frequently detected drugs. Other findings included the observation that South Asians had significantly higher rates of using opiates such as heroin, methadone, and codeine; and that ketamine and cocaine had significantly higher detection rates in acute subjects compared with the rehabilitation population. This locally developed analytical method is a valid tool for simultaneous surveillance of emerging drugs of abuse and routine drug monitoring of patients at minimal additional cost and effort. Continued, proactive surveillance and early identification of emerging drugs will facilitate prompt clinical, social, and legislative management.

  7. Raman-in-SEM, a multimodal and multiscale analytical tool: performance for materials and expertise.

    Science.gov (United States)

    Wille, Guillaume; Bourrat, Xavier; Maubec, Nicolas; Lahfid, Abdeltif

    2014-12-01

    The availability of Raman spectroscopy in a powerful analytical scanning electron microscope (SEM) allows morphological, elemental, chemical, physical and electronic analysis without moving the sample between instruments. This paper documents the metrological performance of the SEMSCA commercial Raman interface operated in a low vacuum SEM. It provides multiscale and multimodal analyses as Raman/EDS, Raman/cathodoluminescence or Raman/STEM (STEM: scanning transmission electron microscopy) as well as Raman spectroscopy on nanomaterials. Since Raman spectroscopy in a SEM can be influenced by several SEM-related phenomena, this paper firstly presents a comparison of this new tool with a conventional micro-Raman spectrometer. Then, some possible artefacts are documented, which are due to the impact of electron beam-induced contamination or cathodoluminescence contribution to the Raman spectra, especially with geological samples. These effects are easily overcome by changing or adapting the Raman spectrometer and the SEM settings and methodology. The deletion of the adverse effect of cathodoluminescence is solved by using a SEM beam shutter during Raman acquisition. In contrast, this interface provides the ability to record the cathodoluminescence (CL) spectrum of a phase. In a second part, this study highlights the interest and efficiency of the coupling in characterizing micrometric phases at the same point. This multimodal approach is illustrated with various issues encountered in geosciences. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Intratibial injection of human multiple myeloma cells in NOD/SCID IL-2Rγ(null) mice mimics human myeloma and serves as a valuable tool for the development of anticancer strategies.

    Directory of Open Access Journals (Sweden)

    Julia Schueler

    Full Text Available BACKGROUND: We systematically analyzed multiple myeloma (MM) cell lines and patient bone marrow cells for their engraftment capacity in immunodeficient mice and validated the response of the resulting xenografts to antimyeloma agents. DESIGN AND METHODS: Using flow cytometry and near-infrared fluorescence in vivo imaging, growth kinetics of MM cell lines L363 and RPMI8226 and patient bone marrow cells were investigated with use of a murine subcutaneous bone implant, intratibial and intravenous approach in NOD/SCID, NOD/SCID treated with CD122 antibody, and NOD/SCID IL-2Rγ(null) mice (NSG). RESULTS: Myeloma growth was significantly increased in the absence of natural killer cell activity (NSG or αCD122-treated NOD/SCID). Comparison of NSG and αCD122-treated NOD/SCID revealed enhanced growth kinetics in the former, especially with respect to metastatic tumor sites, which were exclusively observed therein. In NSG, MM cells were more tumorigenic when injected intratibially than intravenously. In NOD/SCID, in contrast, the use of juvenile long bone implants was superior to intratibial or intravenous cancer cell injection. Using the intratibial NSG model, mice developed typical disease symptoms exclusively when implanted with human MM cell lines or patient-derived bone marrow cells, but not with healthy bone marrow cells nor in mock-injected animals. Bortezomib and dexamethasone delayed myeloma progression in L363- as well as patient-derived MM cell-bearing NSG. Antitumor activity could be quantified via flow cytometry and in vivo imaging analyses. CONCLUSIONS: Our results suggest that the intratibial NSG MM model mimics the clinical situation of the disseminated disease and serves as a valuable tool in the development of novel anticancer strategies.

  9. A combination of troponin T and 12-lead electrocardiography: a valuable tool for early prediction of long-term mortality in patients with chest pain without ST-segment elevation.

    Science.gov (United States)

    Jernberg, Tomas; Lindahl, Bertil

    2002-11-01

    Electrocardiography (ECG) obtained on admission and a troponin T (tn-T) level measured early after admission are simple and accessible methods for predicting outcome in patients with suspected unstable angina or myocardial infarction without persistent ST-elevations. However, there are few studies on the combination of these 2 methods as a means of predicting long-term outcome. ECG was obtained on admission, and a tn-T level was analyzed on admission and after 6 hours in 710 consecutive patients admitted because of chest pain and no ST-elevations. Patients were observed for death over a median time of 40 months. ST-segment depressions > or =0.05 mV were present in 266 patients (37%). These patients had a 9.7-fold increased risk of death compared with patients with normal ECG results. Isolated T-wave inversions or pathological signs other than ST-T changes were present in 196 patients (28%), who had a 4.5-fold increased risk of death compared with patients who had normal ECG results. At 6 hours after admission, 169 patients (24%) had at least 1 sample of tn-T > or =0.10 microg/L, which resulted in a 3.7-fold increased risk of death. In a multivariate analysis, both the ECG on admission and the tn-T level came out as independent predictors of outcome. When these methods were combined, patients could be divided into low- (tn-T level <0.10 microg/L and no ST-segment depression), medium- (tn-T level > or =0.10 microg/L or ST-segment depression), and high-risk groups (tn-T level > or =0.10 microg/L and ST-segment depression). ECG and tn-T level are valuable tools to quickly risk stratify patients with chest pain. The combination of these methods is superior to either one alone.
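
    The combined stratification reduces to counting two binary markers. A hypothetical sketch of the grouping rule described in the abstract (the 0.10 microg/L cutoff comes from the study; the function itself is illustrative, not clinical software):

```python
def risk_group(tnt_ug_per_l, st_depression):
    """Divide chest-pain patients into low-, medium- and high-risk groups
    by counting two binary markers: an early troponin T level at or above
    the 0.10 microg/L cutoff, and ST-segment depression on the admission
    ECG. Illustrative only, not a clinical decision tool."""
    markers = int(tnt_ug_per_l >= 0.10) + int(st_depression)
    return ("low", "medium", "high")[markers]

print(risk_group(0.04, False))   # neither marker
print(risk_group(0.15, False))   # one marker
print(risk_group(0.04, True))    # one marker
print(risk_group(0.15, True))    # both markers
```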

  10. Chemometrics in analytical chemistry-part I: history, experimental design and data analysis tools.

    Science.gov (United States)

    Brereton, Richard G; Jansen, Jeroen; Lopes, João; Marini, Federico; Pomerantsev, Alexey; Rodionova, Oxana; Roger, Jean Michel; Walczak, Beata; Tauler, Romà

    2017-10-01

    Chemometrics has achieved major recognition and progress in the analytical chemistry field. In the first part of this tutorial, major achievements and contributions of chemometrics to some of the more important stages of the analytical process, like experimental design, sampling, and data analysis (including data pretreatment and fusion), are summarised. The tutorial is intended to give a general updated overview of the chemometrics field to further contribute to its dissemination and promotion in analytical chemistry.

  11. Topological data analysis: A promising big data exploration tool in biology, analytical chemistry and physical chemistry.

    Science.gov (United States)

    Offroy, Marc; Duponchel, Ludovic

    2016-03-03

    An important feature of experimental science is that data of various kinds are being produced at an unprecedented rate. This is mainly due to the development of new instrumental concepts and experimental methodologies. It is also clear that the nature of acquired data is significantly different. Indeed, in every area of science, data take the form of ever-bigger tables, where all but a few of the columns (i.e. variables) turn out to be irrelevant to the questions of interest, and further we do not necessarily know which coordinates are the interesting ones. Big data in our laboratories of biology, analytical chemistry or physical chemistry is a future that might be closer than any of us suppose. It is in this sense that new tools have to be developed in order to explore and valorize such data sets. Topological data analysis (TDA) is one of these. It was developed recently by topologists who discovered that topological concepts could be useful for data analysis. The main objective of this paper is to answer the question of why topology is well suited for the analysis of big data sets in many areas, and even more efficient than conventional data analysis methods. Raman analysis of single bacteria provides a good opportunity to demonstrate the potential of TDA for the exploration of various spectroscopic data sets considering different experimental conditions (with high noise level, with/without spectral preprocessing, with wavelength shift, with different spectral resolution, with missing data). Copyright © 2016 Elsevier B.V. All rights reserved.
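
    The simplest topological summary, 0-dimensional persistence, tracks when connected components of a Vietoris-Rips filtration are born and merge. A self-contained sketch using a union-find over sorted pairwise distances (illustrative; real TDA work on spectra would use a dedicated library and higher-dimensional homology):

```python
from itertools import combinations
import math

def persistence_h0(points):
    """0-dimensional persistence of a Vietoris-Rips filtration: process
    pairwise distances in increasing order and record, via union-find,
    the scale at which each connected component merges (dies)."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    edges = sorted((math.dist(p, q), i, j)
                   for (i, p), (j, q) in combinations(enumerate(points), 2))
    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(d)                # a component dies at scale d
    return deaths

# Two well-separated clusters of three points each: four short-lived
# merges inside the clusters, then one merge at the inter-cluster gap.
pts = [(0, 0), (0.1, 0), (0, 0.1), (5, 5), (5.1, 5), (5, 5.1)]
deaths = persistence_h0(pts)
print(deaths)
```

    The long-lived bar (the large death scale) reveals that the data has two clusters, which is the kind of structure TDA extracts without choosing coordinates in advance.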

  12. Antifungal proteins from moulds: analytical tools and potential application to dry-ripened foods.

    Science.gov (United States)

    Delgado, Josué; Owens, Rebecca A; Doyle, Sean; Asensio, Miguel A; Núñez, Félix

    2016-08-01

    Moulds growing on the surface of dry-ripened foods contribute to their sensory qualities, but some of them are able to produce mycotoxins that pose a hazard to consumers. Small cysteine-rich antifungal proteins (AFPs) from moulds are highly stable to pH and proteolysis and exhibit a broad inhibition spectrum against filamentous fungi, providing new opportunities to control hazardous moulds in fermented foods. The analytical tools for characterizing the cellular targets and affected pathways are reviewed. Strategies currently employed to study these mechanisms of action include 'omics' approaches that have come to the forefront in recent years, developing in tandem with genome sequencing of relevant organisms. These techniques contribute to a better understanding of the response of moulds against AFPs, allowing the design of complementary strategies to maximize or overcome the limitations of using AFPs on foods. AFPs alter chitin biosynthesis, and some fungi react by inducing the cell wall integrity (CWI) pathway. However, moulds able to increase chitin content at the cell wall, by increasing proteins in either the CWI or the calmodulin-calcineurin signalling pathway, will resist AFPs. Similarly, AFPs increase the intracellular levels of reactive oxygen species (ROS), and moulds that increase the G-protein complex β subunit CpcB and/or enzymes to efficiently produce glutathione may evade apoptosis. Unknown aspects that need to be addressed include the interaction with mycotoxin production by less sensitive toxigenic moulds. However, significant steps have been taken to encourage the use of AFPs in intermediate-moisture foods, particularly for mould-ripened cheese and meat products.

  13. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    Science.gov (United States)

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  14. Automation of analytical processes. A tool for higher efficiency and safety

    International Nuclear Information System (INIS)

    Groll, P.

    1976-01-01

    The analytical laboratory of a radiochemical facility is usually faced with the fact that numerous analyses of a similar type must be routinely carried out. Automation of such routine analytical procedures helps in increasing the efficiency and safety of the work. A review of the requirements for automation and its advantages is given and demonstrated on three examples. (author)

  15. Recovering valuable shale oils, etc

    Energy Technology Data Exchange (ETDEWEB)

    Engler, C

    1922-09-26

    A process is described for the recovery of valuable shale oils or tars, characterized in that the oil shale is heated to about 300°C, or to a temperature not substantially exceeding this, and then is treated with a solvent with utilization of this heat.

  16. Analytical tools for thermal infrared engineering: a thermal sensor simulation package

    Science.gov (United States)

    Jaggi, Sandeep

    1992-09-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration. To perform system design trade-offs, analysis, and establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package, called 'Analytical Tools for Thermal InfraRed Engineering' (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system level information such as SNR, NER, NETD, etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters. In addition, ATTIRE can be used as a tutorial for understanding the distribution of thermal flux or solar irradiance over selected bandwidths of the spectrum. This spectrally distributed incident flux can then be analyzed as it propagates through the subsystems that constitute the entire sensor. ATTIRE provides a variety of functions ranging from plotting black-body curves for varying bandwidths and computing the integral flux, to performing transfer function analysis of the sensor system. The package runs from a menu-driven interface in a PC-DOS environment. Each sub-system of the sensor is represented by windows and icons. A user-friendly mouse-controlled point-and-click interface allows the user to simulate various aspects of a sensor. The package can simulate a theoretical sensor system. Trade-off studies can be easily done by changing the appropriate parameters and monitoring the effect of the system performance. The package can provide plots of system performance versus any system parameter. A parameter (such as the entrance aperture of the optics) could be varied and its effect on another parameter (e.g., NETD) can be plotted. A third parameter (e.g., the
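
    One of the functions mentioned above, computing the integral black-body flux over a selected bandwidth, can be sketched directly from Planck's law. This is an independent illustration, not ATTIRE code; the 8-14 micron band and 300 K scene temperature are example values:

```python
import math

H = 6.62607015e-34    # Planck constant, J s
C = 2.99792458e8      # speed of light, m/s
KB = 1.380649e-23     # Boltzmann constant, J/K

def band_exitance(T, lam1, lam2, n=2000):
    """Blackbody exitance at temperature T (K) integrated over the
    wavelength band [lam1, lam2] in metres (trapezoidal rule, W/m^2)."""
    def planck(lam):  # spectral exitance M_lambda, W/m^2 per metre
        return (2 * math.pi * H * C**2 / lam**5
                / (math.exp(H * C / (lam * KB * T)) - 1.0))
    step = (lam2 - lam1) / n
    vals = [planck(lam1 + k * step) for k in range(n + 1)]
    return step * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

# A 300 K scene in the 8-14 micron thermal window, compared with the
# Stefan-Boltzmann total over all wavelengths:
inband = band_exitance(300.0, 8e-6, 14e-6)
total = 5.670374419e-8 * 300.0**4
print(round(inband, 1), round(total, 1), round(inband / total, 3))
```

    Roughly a third of the total exitance of a room-temperature scene falls in the 8-14 micron window, which is why thermal imagers favour that band.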

  17. Analytic solution to leading order coupled DGLAP evolution equations: A new perturbative QCD tool

    International Nuclear Information System (INIS)

    Block, Martin M.; Durand, Loyal; Ha, Phuoc; McKay, Douglas W.

    2011-01-01

    We have analytically solved the LO perturbative QCD singlet DGLAP equations [V. N. Gribov and L. N. Lipatov, Sov. J. Nucl. Phys. 15, 438 (1972)] [G. Altarelli and G. Parisi, Nucl. Phys. B126, 298 (1977)] [Y. L. Dokshitzer, Sov. Phys. JETP 46, 641 (1977)] using Laplace transform techniques. Newly developed, highly accurate numerical inverse Laplace transform algorithms [M. M. Block, Eur. Phys. J. C 65, 1 (2010)] [M. M. Block, Eur. Phys. J. C 68, 683 (2010)] allow us to write fully decoupled solutions for the singlet structure function F_s(x,Q^2) and the gluon distribution G(x,Q^2) as F_s(x,Q^2) = F_s(F_s0(x_0), G_0(x_0)) and G(x,Q^2) = G(F_s0(x_0), G_0(x_0)), where the x_0 are the Bjorken x values at Q_0^2. Here F_s and G are known functions, found using LO DGLAP splitting functions, of the initial boundary conditions F_s0(x) ≡ F_s(x,Q_0^2) and G_0(x) ≡ G(x,Q_0^2), i.e., the chosen starting functions at the virtuality Q_0^2. For both G(x) and F_s(x), we are able to either devolve or evolve each separately and rapidly, with very high numerical accuracy (a computational fractional precision of O(10^-9)). Armed with this powerful new tool in the perturbative QCD arsenal, we compare our numerical results from the above equations with the published MSTW2008 and CTEQ6L LO gluon and singlet F_s distributions [A. D. Martin, W. J. Stirling, R. S. Thorne, and G. Watt, Eur. Phys. J. C 63, 189 (2009)], starting from their initial values at Q_0^2 = 1 GeV^2 and 1.69 GeV^2, respectively, using their choice of α_s(Q^2). This allows an important independent check on the accuracies of their evolution codes and, therefore, the computational accuracies of their published parton distributions. Our method completely decouples the two LO distributions, at the same time guaranteeing that both G and F_s satisfy the singlet coupled DGLAP equations. It also allows one to easily obtain the effects of the starting functions on the evolved gluon and singlet structure functions, as functions of both Q

  18. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    Science.gov (United States)

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human diseases database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery® embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics
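
    Gene set enrichment scoring of this kind is commonly based on the hypergeometric upper tail: how surprising is the overlap between a query gene list and an annotated set? A minimal sketch of that standard test (illustrative; GeneAnalytics uses its own proprietary evidence-based scoring, not necessarily this statistic):

```python
from math import comb

def enrichment_p(universe, set_size, query_size, overlap):
    """Hypergeometric upper-tail p-value: probability of drawing at least
    `overlap` members of an annotated gene set when sampling `query_size`
    genes at random from a universe of `universe` genes."""
    total = comb(universe, query_size)
    tail = sum(comb(set_size, k) * comb(universe - set_size, query_size - k)
               for k in range(overlap, min(set_size, query_size) + 1))
    return tail / total

# 20,000-gene universe, a 100-gene pathway, a 50-gene query, 10 shared
# (expected overlap by chance is only 0.25 genes):
p = enrichment_p(20000, 100, 50, 10)
print(f"p = {p:.2e}")
```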

  19. Effect of Micro Electrical Discharge Machining Process Conditions on Tool Wear Characteristics: Results of an Analytic Study

    DEFF Research Database (Denmark)

    Puthumana, Govindan; P., Rajeev

    2016-01-01

    Micro electrical discharge machining is one of the established techniques to manufacture high aspect ratio features on electrically conductive materials. This paper presents the results and inferences of an analytical study for estimating the effect of process conditions on tool electrode wear characteristics in the micro-EDM process. A new approach is proposed with two novel factors anticipated to directly control the material removal mechanism from the tool electrode: a discharge energy factor (DEf) and a dielectric flushing factor (DFf). The results showed that the correlation between the tool wear rate (TWR) and these factors is poor. Thus, the individual effects of each factor on TWR are analyzed. The factors selected for the study of individual effects are pulse on-time, discharge peak current, gap voltage and gap flushing pressure. The tool wear rate decreases linearly with an increase in the pulse on...

  20. The Rack-Gear Tool Generation Modelling. Non-Analytical Method Developed in CATIA, Using the Relative Generating Trajectories Method

    Science.gov (United States)

    Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.

    2016-11-01

    The modelling of a curl of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measuring as a coordinate matrix, has as its goal the determination of the generation quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the generating geometrical error as a base of the total error. The generation modelling allows highlighting the potential errors of the generating tool, in order to correct its profile prior to using the tool in the machining process. A method developed in CATIA is proposed, based on a new method, namely the method of "relative generating trajectories". The analytical foundation is presented, as well as applications to known models of rack-gear type tools used on Maag teething machines.

  1. Learning analytics as a tool for closing the assessment loop in higher education

    OpenAIRE

    Karen D. Mattingly; Margaret C. Rice; Zane L. Berge

    2012-01-01

    This paper examines learning and academic analytics and its relevance to distance education in undergraduate and graduate programs as it impacts students and teaching faculty, and also academic institutions. The focus is to explore the measurement, collection, analysis, and reporting of data as predictors of student success and drivers of departmental process and program curriculum. Learning and academic analytics in higher education is used to predict student success by examining how and wha...

  2. EPIPOI: A user-friendly analytical tool for the extraction and visualization of temporal parameters from epidemiological time series

    Directory of Open Access Journals (Sweden)

    Alonso Wladimir J

    2012-11-01

    Full Text Available Abstract Background: There is an increasing need for processing and understanding relevant information generated by the systematic collection of public health data over time. However, the analysis of those time series usually requires advanced modeling techniques, which are not necessarily mastered by staff, technicians and researchers working in public health and epidemiology. Here a user-friendly tool, EPIPOI, is presented that facilitates the exploration and extraction of parameters describing trends, seasonality and anomalies that characterize epidemiological processes. It also enables the inspection of those parameters across geographic regions. Although the visual exploration and extraction of relevant parameters from time series data is crucial in epidemiological research, until now it had been largely restricted to specialists. Methods: EPIPOI is freely available software developed in Matlab (The MathWorks Inc.) that runs on both PC and Mac computers. Its friendly interface guides users intuitively through useful comparative analyses, including the comparison of spatial patterns in temporal parameters. Results: EPIPOI is able to handle complex analyses in an accessible way. A prototype has already been used to assist researchers in a variety of contexts, from didactic use in public health workshops to serving as the main analytical tool in published research. Conclusions: EPIPOI can assist public health officials and students to explore time series data using a broad range of sophisticated analytical and visualization tools. It also provides an analytical environment where even advanced users can benefit from a higher degree of control over model assumptions, such as those associated with detecting disease outbreaks and pandemics.
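
    The kinds of parameters EPIPOI extracts, trend, seasonal amplitude and peak timing, can be illustrated with a harmonic regression on a synthetic monthly series (a sketch of the general technique, not EPIPOI's Matlab implementation):

```python
import numpy as np

def harmonic_fit(series, period=12):
    """Least-squares fit of a linear trend plus one seasonal harmonic,
    returning (trend slope, seasonal amplitude, peak timing in months)."""
    t = np.arange(len(series), dtype=float)
    w = 2.0 * np.pi * t / period
    X = np.column_stack([np.ones_like(t), t, np.cos(w), np.sin(w)])
    _, slope, a, b = np.linalg.lstsq(X, series, rcond=None)[0]
    amplitude = np.hypot(a, b)
    peak = (np.arctan2(b, a) * period / (2.0 * np.pi)) % period
    return slope, amplitude, peak

# Ten years of synthetic monthly incidence: upward trend, peak in month 1.
t = np.arange(120)
series = 50.0 + 0.2 * t + 10.0 * np.cos(2.0 * np.pi * (t - 1) / 12.0)
slope, amp, peak = harmonic_fit(series)
print(round(slope, 3), round(amp, 2), round(peak, 2))
```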

  3. Nexusing Charcoal in South Mozambique: A Proposal To Integrate the Nexus Charcoal-Food-Water Analysis With a Participatory Analytical and Systemic Tool

    Directory of Open Access Journals (Sweden)

    Ricardo Martins

    2018-06-01

    Full Text Available Nexus analysis identifies and explores the synergies and trade-offs between energy, food and water systems, considered as interdependent systems interacting with contextual drivers (e.g., climate change, poverty). The Nexus is, thus, a valuable analytical and policy-design-supporting tool to address the widely discussed links between bioenergy, food and water. In fact, the Nexus provides a more integrative and broad approach than the single isolated system approach that characterizes many bioenergy analyses and policies of the last decades. In particular, for the South of Mozambique, charcoal production, food insecurity and water scarcity have been related in separate studies, and thus it would be expected that Nexus analysis has the potential to provide the basis for integrated policies and strategies focused on charcoal as a development factor. However, to date there is no Nexus analysis focused on charcoal in Mozambique, nor is there an assessment of the comprehensiveness and relevance of Nexus analysis when applied to charcoal energy systems. To address these gaps, this work applies the Nexus to the charcoal-food-water system in Mozambique, integrating national, regional and international studies analysing the isolated, or pairs of, systems. This integration results in a novel Nexus analysis graphic for the charcoal-food-water relationship. Then, to assess the comprehensiveness and depth of analysis, this Nexus analysis is critically compared with the 2MBio-A, a systems analytical and design framework based on a design tool specifically developed for bioenergy (the 2MBio). The results reveal that Nexus analysis is "blind" to specific fundamental social, ecological and socio-historical dynamics of charcoal energy systems. The critical comparison also suggests the need to integrate the high-level systems analysis of Nexus with non-deterministic, non-prescriptive participatory analysis tools, like the 2MBio-A, as a means to

  4. Cross learning synergies between Operation Management content and the use of generic analytic tools

    Directory of Open Access Journals (Sweden)

    Frederic Marimon

    2017-06-01

    By presenting both objectives simultaneously, students are found to be more motivated to work deeply towards both. Students know that the theoretical content will be put into practice through certain tools, which strengthens their interest in the conceptual issues of the chapter. In turn, because students know that they will use a generic tool in a known context, their interest in these tools is reinforced. The result is a cross-learning synergy.

  5. Visualization and analytics tools for infectious disease epidemiology: a systematic review.

    Science.gov (United States)

    Carroll, Lauren N; Au, Alan P; Detwiler, Landon Todd; Fu, Tsung-Chieh; Painter, Ian S; Abernethy, Neil F

    2014-10-01

    A myriad of new tools and algorithms have been developed to help public health professionals analyze and visualize the complex data used in infectious disease control. To better understand approaches to meet these users' information needs, we conducted a systematic literature review focused on the landscape of infectious disease visualization tools for public health professionals, with a special emphasis on geographic information systems (GIS), molecular epidemiology, and social network analysis. The objectives of this review are to: (1) identify public health user needs and preferences for infectious disease information visualization tools; (2) identify existing infectious disease information visualization tools and characterize their architecture and features; (3) identify commonalities among approaches applied to different data types; and (4) describe tool usability evaluation efforts and barriers to the adoption of such tools. We identified articles published in English from January 1, 1980 to June 30, 2013 from five bibliographic databases. Articles with a primary focus on infectious disease visualization tools, needs of public health users, or usability of information visualizations were included in the review. A total of 88 articles met our inclusion criteria. Users were found to have diverse needs, preferences and uses for infectious disease visualization tools, and the existing tools are correspondingly diverse. The architecture of the tools was inconsistently described, and few tools in the review discussed the incorporation of usability studies or plans for dissemination. Many studies identified concerns regarding data sharing, confidentiality and quality. Existing tools offer a range of features and functions that allow users to explore, analyze, and visualize their data, but the tools are often for siloed applications. Commonly cited barriers to widespread adoption included lack of organizational support, access issues, and misconceptions about tool

  6. Auxiliary fields as a tool for computing analytical solutions of the Schroedinger equation

    International Nuclear Information System (INIS)

    Silvestre-Brac, Bernard; Semay, Claude; Buisseret, Fabien

    2008-01-01

    We propose a new method to obtain approximate solutions of the Schroedinger equation for an arbitrary potential that possesses bound states. This method, relying on the auxiliary field technique, allows analytical solutions to be found in many cases. It offers a convenient way to study the qualitative features of the energy spectrum of bound states in any potential. In particular, we illustrate our method by solving the case of central potentials with power-law and logarithmic forms. For these types of potentials, we propose very accurate analytical energy formulae which greatly improve the corresponding formulae that can be found in the literature.

  7. Auxiliary fields as a tool for computing analytical solutions of the Schroedinger equation

    Energy Technology Data Exchange (ETDEWEB)

    Silvestre-Brac, Bernard [LPSC Universite Joseph Fourier, Grenoble 1, CNRS/IN2P3, Institut Polytechnique de Grenoble, Avenue des Martyrs 53, F-38026 Grenoble-Cedex (France); Semay, Claude; Buisseret, Fabien [Groupe de Physique Nucleaire Theorique, Universite de Mons-Hainaut, Academie universitaire Wallonie-Bruxelles, Place du Parc 20, B-7000 Mons (Belgium)], E-mail: silvestre@lpsc.in2p3.fr, E-mail: claude.semay@umh.ac.be, E-mail: fabien.buisseret@umh.ac.be

    2008-07-11

    We propose a new method to obtain approximate solutions of the Schroedinger equation for an arbitrary potential that possesses bound states. This method, relying on the auxiliary field technique, allows analytical solutions to be found in many cases. It offers a convenient way to study the qualitative features of the energy spectrum of bound states in any potential. In particular, we illustrate our method by solving the case of central potentials with power-law and logarithmic forms. For these types of potentials, we propose very accurate analytical energy formulae which greatly improve the corresponding formulae that can be found in the literature.

  8. Monte Carlo simulation: tool for the calibration in analytical determination of radionuclides

    International Nuclear Information System (INIS)

    Gonzalez, Jorge A. Carrazana; Ferrera, Eduardo A. Capote; Gomez, Isis M. Fernandez; Castro, Gloria V. Rodriguez; Ricardo, Niury Martinez

    2013-01-01

    This work shows how the traceability of analytical determinations is established using this calibration method. It highlights the advantages offered by Monte Carlo simulation for applying corrections for differences in the chemical composition, density and height of the samples analyzed. Likewise, the results obtained by the LVRA in two exercises organized by the International Atomic Energy Agency (IAEA) are presented. In these exercises (an intercomparison and a proficiency test), all reported analytical results were obtained from efficiency calibrations by Monte Carlo simulation using the DETEFF program
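    The geometric core of such Monte Carlo efficiency calibrations can be illustrated with a toy sketch (purely illustrative: the cone geometry, half-angle, and event count below are assumptions, and real codes such as DETEFF model the full detector response and sample matrix, not just solid-angle geometry):

    ```python
    import math
    import random

    def geometric_efficiency(n_events=100_000, half_angle_deg=30.0, seed=1):
        """Toy Monte Carlo estimate of geometric efficiency: the fraction of
        isotropically emitted photons falling within a cone of the given
        half-angle about the source-detector axis."""
        random.seed(seed)
        cos_cut = math.cos(math.radians(half_angle_deg))
        # Isotropic emission: cos(theta) is uniform on [-1, 1].
        hits = sum(1 for _ in range(n_events)
                   if random.uniform(-1.0, 1.0) >= cos_cut)
        return hits / n_events
    ```

    For this idealized geometry the estimate converges to the analytical value (1 - cos θ)/2, which is how such simulations are typically sanity-checked before adding detector physics.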

  9. Preparing valuable hydrocarbons by hydrogenation

    Energy Technology Data Exchange (ETDEWEB)

    Pier, M

    1930-08-22

    A process is described for the preparation of valuable hydrocarbons by treatment of carbonaceous materials, such as coal, tars, mineral oils, and their distillation and conversion products, and for the refining of liquid hydrocarbon mixtures, at raised temperature and under pressure, preferably in the presence of catalysts, by the use of hydrogen-containing gases obtained by distilling solid combustibles. The process is characterized in that the purification of the hydrogen-containing gases, for the practically complete removal of oxygen, is accomplished by heating at ordinary or higher pressure in the presence of a catalyst containing silver and oxides of metals of group VI of the periodic system.

  10. Energy threat to valuable land

    International Nuclear Information System (INIS)

    Caufield, C.

    1982-01-01

    Having considered the varying estimates of future UK energy requirements which have been made, the impact on the environment arising from the use of valuable sites for energy production is examined. It is shown that energy installations of all kinds clash with areas of natural beauty or ecological importance. As an example, a recent investigation of potential sites for nuclear power stations found that most of them were on or next to sites of special scientific interest, and other areas officially designated to be regarded as special or to be protected in some way. (U.K.)

  11. The Efficacy of Violence Prediction: A Meta-Analytic Comparison of Nine Risk Assessment Tools

    Science.gov (United States)

    Yang, Min; Wong, Stephen C. P.; Coid, Jeremy

    2010-01-01

    Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their…

  12. Online Programs as Tools to Improve Parenting: A meta-analytic review

    NARCIS (Netherlands)

    Prof. Dr. Jo J.M.A Hermanns; Prof. Dr. Ruben R.G. Fukkink; dr. Christa C.C. Nieuwboer

    2013-01-01

    Background. A number of parenting programs, aimed at improving parenting competencies, have recently been adapted or designed with the use of online technologies. Although web-based services have been claimed to hold promise for parent support, a meta-analytic review of online parenting interventions

  13. Online programs as tools to improve parenting: A meta-analytic review

    NARCIS (Netherlands)

    Nieuwboer, C.C.; Fukkink, R.G.; Hermanns, J.M.A.

    2013-01-01

    Background: A number of parenting programs, aimed at improving parenting competencies, have recently been adapted or designed with the use of online technologies. Although web-based services have been claimed to hold promise for parent support, a meta-analytic review of online parenting

  14. Moving standard deviation and moving sum of outliers as quality tools for monitoring analytical precision.

    Science.gov (United States)

    Liu, Jiakai; Tan, Chin Hon; Badrick, Tony; Loh, Tze Ping

    2018-02-01

    An increase in analytical imprecision (expressed as CVa) can introduce additional variability (i.e. noise) into patient results, which poses a challenge to the optimal management of patients. Relatively little work has been done to address the need for continuous monitoring of analytical imprecision. Through numerical simulations, we describe the use of the moving standard deviation (movSD) and a recently described moving sum of outliers (movSO) of patient results as means for detecting increased analytical imprecision, and compare their performance against internal quality control (QC) and average of normal (AoN) approaches. The power to detect an increase in CVa is suboptimal under routine internal QC procedures. The AoN technique almost always had the highest average number of patient results affected before error detection (ANPed), indicating that it generally had the worst capability for detecting an increased CVa. On the other hand, the movSD and movSO approaches were able to detect an increased CVa at significantly lower ANPed, particularly for measurands that displayed a relatively small ratio of biological variation to CVa. CONCLUSION: The movSD and movSO approaches are effective in detecting an increase in CVa for high-risk measurands with small biological variation. Their performance is relatively poor when the biological variation is large. However, the clinical risk of an increase in analytical imprecision is attenuated for these measurands, as the increased imprecision adds only marginally to the total variation and is less likely to impact clinical care. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
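    The movSD idea can be sketched as a rolling standard deviation over consecutive patient results, flagged when it exceeds a multiple of the stable-period analytical SD (the window size, control limit, and baseline SD below are arbitrary illustrative choices, not values or rules from the study):

    ```python
    from collections import deque
    from statistics import stdev

    def moving_sd_monitor(results, window=50, limit=2.0, baseline_sd=1.0):
        """Return the index of the first patient result at which the moving
        standard deviation exceeds `limit` times `baseline_sd`, or None if
        no increase in imprecision is detected."""
        buf = deque(maxlen=window)
        for i, x in enumerate(results):
            buf.append(x)
            # Only evaluate once the window is full, to avoid noisy early SDs.
            if len(buf) == window and stdev(buf) > limit * baseline_sd:
                return i
        return None
    ```

    A stable result stream should return None, while a stream whose scatter widens partway through should trigger shortly after the change point; in practice the window length trades detection speed against false alarms, which is what the study's ANPed metric quantifies.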

  15. Visualizing the Geography of the Diseases of China: Western Disease Maps from Analytical Tools to Tools of Empire, Sovereignty, and Public Health Propaganda, 1878-1929.

    Science.gov (United States)

    Hanson, Marta

    2017-09-01

    Argument This article analyzes for the first time the earliest western maps of diseases in China spanning fifty years from the late 1870s to the end of the 1920s. The 24 featured disease maps present a visual history of the major transformations in modern medicine from medical geography to laboratory medicine wrought on Chinese soil. These medical transformations occurred within new political formations from the Qing dynasty (1644-1911) to colonialism in East Asia (Hong Kong, Taiwan, Manchuria, Korea) and hypercolonialism within China (Tianjin, Shanghai, Amoy) as well as the new Republican Chinese nation state (1912-49). As a subgenre of persuasive graphics, physicians marshaled disease maps for various rhetorical functions within these different political contexts. Disease maps in China changed from being mostly analytical tools to functioning as tools of empire, national sovereignty, and public health propaganda legitimating new medical concepts, public health interventions, and political structures governing over human and non-human populations.

  16. A Review on the Design Structure Matrix as an Analytical Tool for Product Development Management

    OpenAIRE

    Mokudai, Takefumi

    2006-01-01

    This article reviews fundamental concepts and analytical techniques of design structure matrix (DSM) as well as recent development of DSM studies. The DSM is a matrix representation of relationships between components of a complex system, such as products, development organizations and processes. Depending on targets of analysis, there are four basic types of DSM: Component-based DSM, Team-based DSM, Task-based DSM, and Parameter-based DSM. There are two streams of recent DSM studies: 1) ...

  17. Earth Science Data Analytics: Bridging Tools and Techniques with the Co-Analysis of Large, Heterogeneous Datasets

    Science.gov (United States)

    Kempler, Steve; Mathews, Tiffany

    2016-01-01

    The continuum of ever-evolving data management systems affords great opportunities to the enhancement of knowledge and facilitation of science research. To take advantage of these opportunities, it is essential to understand and develop methods that enable data relationships to be examined and the information to be manipulated. This presentation describes the efforts of the Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster to understand, define, and facilitate the implementation of ESDA to advance science research. As a result of the void of Earth science data analytics publication material, the cluster has defined ESDA along with 10 goals to set the framework for a common understanding of tools and techniques that are available and still needed to support ESDA.

  18. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: an anticipated analytical tool for food safety.

    Science.gov (United States)

    Hervás, Miriam; López, Miguel Angel; Escarpa, Alberto

    2009-10-27

    In this work, electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme has been based on a direct competitive immunoassay method in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as enzymatic label. Amperometric detection has been achieved through the addition of hydrogen peroxide substrate and hydroquinone as mediator. Analytical performance of the electrochemical immunoassay has been evaluated by analysis of maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 microg L(-1) and EC(50) 0.079 microg L(-1) were obtained allowing the assessment of the detection of zearalenone mycotoxin. In addition, an excellent accuracy with a high recovery yield ranging between 95 and 108% has been obtained. The analytical features have shown the proposed electrochemical immunoassay to be a very powerful and timely screening tool for the food safety scene.

  19. Learning analytics as a tool for closing the assessment loop in higher education

    Directory of Open Access Journals (Sweden)

    Karen D. Mattingly

    2012-09-01

    Full Text Available This paper examines learning and academic analytics and its relevance to distance education in undergraduate and graduate programs as it impacts students and teaching faculty, and also academic institutions. The focus is to explore the measurement, collection, analysis, and reporting of data as predictors of student success and drivers of departmental process and program curriculum. Learning and academic analytics in higher education is used to predict student success by examining how and what students learn and how success is supported by academic programs and institutions. The paper examines what is being done to support students, whether or not it is effective, and if not why, and what educators can do. The paper also examines how these data can be used to create new metrics and inform a continuous cycle of improvement. It presents examples of working models from a sample of institutions of higher education: The Graduate School of Medicine at the University of Wollongong, the University of Michigan, Purdue University, and the University of Maryland, Baltimore County. Finally, the paper identifies considerations and recommendations for using analytics and offer suggestions for future research.

  20. Potential of Near-Infrared Chemical Imaging as Process Analytical Technology Tool for Continuous Freeze-Drying.

    Science.gov (United States)

    Brouckaert, Davinia; De Meyer, Laurens; Vanbillemont, Brecht; Van Bockstal, Pieter-Jan; Lammens, Joris; Mortier, Séverine; Corver, Jos; Vervaet, Chris; Nopens, Ingmar; De Beer, Thomas

    2018-04-03

    Near-infrared chemical imaging (NIR-CI) is an emerging tool for process monitoring because it combines the chemical selectivity of vibrational spectroscopy with spatial information. Whereas traditional near-infrared spectroscopy is an attractive technique for water content determination and solid-state investigation of lyophilized products, chemical imaging opens up possibilities for assessing the homogeneity of these critical quality attributes (CQAs) throughout the entire product. In this contribution, we aim to evaluate NIR-CI as a process analytical technology (PAT) tool for at-line inspection of continuously freeze-dried pharmaceutical unit doses based on spin freezing. The chemical images of freeze-dried mannitol samples were resolved via multivariate curve resolution, allowing us to visualize the distribution of mannitol solid forms throughout the entire cake. Second, a mannitol-sucrose formulation was lyophilized with variable drying times for inducing changes in water content. Analyzing the corresponding chemical images via principal component analysis, vial-to-vial variations as well as within-vial inhomogeneity in water content could be detected. Furthermore, a partial least-squares regression model was constructed for quantifying the water content in each pixel of the chemical images. It was hence concluded that NIR-CI is inherently a most promising PAT tool for continuously monitoring freeze-dried samples. Although some practicalities are still to be solved, this analytical technique could be applied in-line for CQA evaluation and for detecting the drying end point.

  1. Prognostic risk estimates of patients with multiple sclerosis and their physicians: comparison to an online analytical risk counseling tool.

    Directory of Open Access Journals (Sweden)

    Christoph Heesen

    Full Text Available BACKGROUND: Prognostic counseling in multiple sclerosis (MS is difficult because of the high variability of disease progression. Simultaneously, patients and physicians are increasingly confronted with making treatment decisions at an early stage, which requires taking individual prognoses into account to strike a good balance between benefits and harms of treatments. It is therefore important to understand how patients and physicians estimate prognostic risk, and whether and how these estimates can be improved. An online analytical processing (OLAP tool based on pooled data from placebo cohorts of clinical trials offers short-term prognostic estimates that can be used for individual risk counseling. OBJECTIVE: The aim of this study was to clarify if personalized prognostic information as presented by the OLAP tool is considered useful and meaningful by patients. Furthermore, we used the OLAP tool to evaluate patients' and physicians' risk estimates. Within this evaluation process we assessed short-time prognostic risk estimates of patients with MS (final n = 110 and their physicians (n = 6 and compared them with the estimates of OLAP. RESULTS: Patients rated the OLAP tool as understandable and acceptable, but to be only of moderate interest. It turned out that patients, physicians, and the OLAP tool ranked patients similarly regarding their risk of disease progression. Both patients' and physicians' estimates correlated most strongly with those disease covariates that the OLAP tool's estimates also correlated with most strongly. Exposure to the OLAP tool did not change patients' risk estimates. CONCLUSION: While the OLAP tool was rated understandable and acceptable, it was only of modest interest and did not change patients' prognostic estimates. The results suggest, however, that patients had some idea regarding their prognosis and which factors were most important in this regard. Future work with OLAP should assess long-term prognostic

  2. Prognostic risk estimates of patients with multiple sclerosis and their physicians: comparison to an online analytical risk counseling tool.

    Science.gov (United States)

    Heesen, Christoph; Gaissmaier, Wolfgang; Nguyen, Franziska; Stellmann, Jan-Patrick; Kasper, Jürgen; Köpke, Sascha; Lederer, Christian; Neuhaus, Anneke; Daumer, Martin

    2013-01-01

    Prognostic counseling in multiple sclerosis (MS) is difficult because of the high variability of disease progression. Simultaneously, patients and physicians are increasingly confronted with making treatment decisions at an early stage, which requires taking individual prognoses into account to strike a good balance between benefits and harms of treatments. It is therefore important to understand how patients and physicians estimate prognostic risk, and whether and how these estimates can be improved. An online analytical processing (OLAP) tool based on pooled data from placebo cohorts of clinical trials offers short-term prognostic estimates that can be used for individual risk counseling. The aim of this study was to clarify if personalized prognostic information as presented by the OLAP tool is considered useful and meaningful by patients. Furthermore, we used the OLAP tool to evaluate patients' and physicians' risk estimates. Within this evaluation process we assessed short-term prognostic risk estimates of patients with MS (final n = 110) and their physicians (n = 6) and compared them with the estimates of OLAP. Patients rated the OLAP tool as understandable and acceptable, but to be only of moderate interest. It turned out that patients, physicians, and the OLAP tool ranked patients similarly regarding their risk of disease progression. Both patients' and physicians' estimates correlated most strongly with those disease covariates that the OLAP tool's estimates also correlated with most strongly. Exposure to the OLAP tool did not change patients' risk estimates. While the OLAP tool was rated understandable and acceptable, it was only of modest interest and did not change patients' prognostic estimates. The results suggest, however, that patients had some idea regarding their prognosis and which factors were most important in this regard. Future work with OLAP should assess long-term prognostic estimates and clarify its usefulness for patients and physicians.

  3. Data mining and business analytics with R

    CERN Document Server

    Ledolter, Johannes

    2013-01-01

    Collecting, analyzing, and extracting valuable information from a large amount of data requires easily accessible, robust, computational and analytical tools. Data Mining and Business Analytics with R utilizes the open source software R for the analysis, exploration, and simplification of large high-dimensional data sets. As a result, readers are provided with the needed guidance to model and interpret complicated data and become adept at building powerful models for prediction and classification. Highlighting both underlying concepts and practical computational skills, Data Mining

  4. Program to develop analytical tools for environmental and safety assessment of nuclear material shipping container systems

    International Nuclear Information System (INIS)

    Butler, T.A.

    1978-11-01

    This paper describes a program for developing analytical techniques to evaluate the response of nuclear material shipping containers to severe accidents. Both lumped-mass and finite element techniques are employed to predict shipping container and shipping container-carrier response to impact. The general impact problem is computationally expensive because of its nonlinear, three-dimensional nature. This expense is minimized by using approximate models to parametrically identify critical cases before more exact analyses are performed. The computer codes developed for solving the problem are being experimentally substantiated with test data from full-scale and scale-model container drop tests. 6 figures, 1 table

  5. MVT a most valuable theorem

    CERN Document Server

    Smorynski, Craig

    2017-01-01

    This book is about the rise and supposed fall of the mean value theorem. It discusses the evolution of the theorem and the concepts behind it, how the theorem relates to other fundamental results in calculus, and modern re-evaluations of its role in the standard calculus course. The mean value theorem is one of the central results of calculus. It was called “the fundamental theorem of the differential calculus” because of its power to provide simple and rigorous proofs of basic results encountered in a first-year course in calculus. In mathematical terms, the book is a thorough treatment of this theorem and some related results in the field; in historical terms, it is not a history of calculus or mathematics, but a case study in both. MVT: A Most Valuable Theorem is aimed at those who teach calculus, especially those setting out to do so for the first time. It is also accessible to anyone who has finished the first semester of the standard course in the subject and will be of interest to undergraduate mat...

  6. Room temperature phosphorescence in the liquid state as a tool in analytical chemistry

    International Nuclear Information System (INIS)

    Kuijt, Jacobus; Ariese, Freek; Brinkman, Udo A.Th.; Gooijer, Cees

    2003-01-01

    A wide-ranging overview of room temperature phosphorescence in the liquid state (RTPL) is presented, with a focus on recent developments. RTPL techniques like micelle-stabilized (MS)-RTP, cyclodextrin-induced (CD)-RTP, and heavy atom-induced (HAI)-RTP are discussed. These techniques are mainly applied in the stand-alone format, but coupling with some separation techniques appears to be feasible. Applications of direct, sensitized and quenched phosphorescence are also discussed. As regards sensitized and quenched RTP, emphasis is on the coupling with liquid chromatography (LC) and capillary electrophoresis (CE), but stand-alone applications are also reported. Further, the application of RTPL in immunoassays and in RTP optosensing - the optical sensing of analytes based on RTP - is reviewed. Next to the application of RTPL in quantitative analysis, its use for the structural probing of protein conformations and for time-resolved microscopy of labelled biomolecules is discussed. Finally, an overview is presented of the various analytical techniques which are based on the closely related phenomenon of long-lived lanthanide luminescence. The paper closes with a short evaluation of the state-of-the-art in RTP and a discussion on future perspectives.

  7. Heat as a groundwater tracer in shallow and deep heterogeneous media: Analytical solution, spreadsheet tool, and field applications

    Science.gov (United States)

    Kurylyk, Barret L.; Irvine, Dylan J.; Carey, Sean K.; Briggs, Martin A.; Werkema, Dale D.; Bonham, Mariah

    2017-01-01

    Groundwater flow advects heat, and thus, the deviation of subsurface temperatures from an expected conduction‐dominated regime can be analysed to estimate vertical water fluxes. A number of analytical approaches have been proposed for using heat as a groundwater tracer, and these have typically assumed a homogeneous medium. However, heterogeneous thermal properties are ubiquitous in subsurface environments, both at the scale of geologic strata and at finer scales in streambeds. Herein, we apply the analytical solution of Shan and Bodvarsson (2004), developed for estimating vertical water fluxes in layered systems, in 2 new environments distinct from previous vadose zone applications. The utility of the solution for studying groundwater‐surface water exchange is demonstrated using temperature data collected from an upwelling streambed with sediment layers, and a simple sensitivity analysis using these data indicates the solution is relatively robust. Also, a deeper temperature profile recorded in a borehole in South Australia is analysed to estimate deeper water fluxes. The analytical solution is able to match observed thermal gradients, including the change in slope at sediment interfaces. Results indicate that not accounting for layering can yield errors in the magnitude and even direction of the inferred Darcy fluxes. A simple automated spreadsheet tool (Flux‐LM) is presented to allow users to input temperature and layer data and solve the inverse problem to estimate groundwater flux rates from shallow (e.g., regimes.
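    For the homogeneous (single-layer) limiting case that underlies such heat-tracer methods, the classic steady-state solution of Bredehoeft and Papadopulos (1965) can be sketched as follows; the thermal parameter values are illustrative defaults, and the layered Shan and Bodvarsson solution applied in the study generalizes this by enforcing temperature and heat-flux continuity at layer interfaces:

    ```python
    import math

    def bp_temperature(z, L, T0, TL, q, rho_c_f=4.18e6, lam=2.0):
        """Steady 1-D temperature at depth z in a homogeneous column of
        length L with boundary temperatures T0 (top) and TL (bottom),
        vertical Darcy flux q (m/s, positive downward), volumetric heat
        capacity of water rho_c_f (J/m^3/K), and thermal conductivity
        lam (W/m/K)."""
        beta = rho_c_f * q * L / lam  # thermal Peclet number
        if abs(beta) < 1e-12:
            # Pure conduction: linear profile between the boundaries.
            return T0 + (TL - T0) * z / L
        return T0 + (TL - T0) * math.expm1(beta * z / L) / math.expm1(beta)
    ```

    Fitting the curvature of an observed profile to this expression yields q, which is the inverse problem the Flux-LM spreadsheet tool automates for the layered case.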

  8. Analytical tools for managing rock fall hazards in Australian coal mine roadways

    Energy Technology Data Exchange (ETDEWEB)

    Ross Seedsman; Nick Gordon; Naj Aziz [University of Wollongong (Australia)

    2009-03-15

    This report provides a reference source for the design of ground control measures in coal mine roadways using analytical methods. Collapse models are provided for roof and rib. The roof models recognise that different collapse modes can apply in different stress fields - high, intermediate, and zero compressive stresses. The rib models draw analogies to rock slope stability and also the impact of high vertical stresses. Methods for determining support or reinforcement requirements are provided. Suspension of collapsed masses is identified as the basis for roof support in both very high and zero compressive stress regimes. Reinforcement of bedding discontinuities is advocated for intermediate compressive stresses. For the ribs, restraint of coal blocks defined by pre-existing joints or by mining induced fractures is required.

  9. saltPAD: A New Analytical Tool for Monitoring Salt Iodization in Low Resource Settings

    Directory of Open Access Journals (Sweden)

    Nicholas M. Myers

    2016-03-01

    Full Text Available We created a paper test card that measures a common iodizing agent, iodate, in salt. To test the analytical metrics, usability, and robustness of the paper test card when it is used in low resource settings, the South African Medical Research Council and GroundWork performed independent validation studies of the device. The accuracy and precision metrics from both studies were comparable. In the SAMRC study, more than 90% of the test results (n = 1704) were correctly classified as corresponding to adequately or inadequately iodized salt. The cards are suitable for market and household surveys to determine whether salt is adequately iodized. Further development of the cards will improve their utility for monitoring salt iodization during production.

  10. PLS2 regression as a tool for selection of optimal analytical modality

    DEFF Research Database (Denmark)

    Madsen, Michael; Esbensen, Kim

    Intelligent use of modern process analysers allows process technicians and engineers to look deep into the dynamic behaviour of production systems. This opens up a plurality of new possibilities with respect to process optimisation. Oftentimes, several instruments representing different technologies and price classes are able to decipher relevant process information simultaneously. The question then is: how to choose between available technologies without compromising the quality and usability of the data. We apply PLS2 modelling to quantify the relative merits of competing, or complementing, analytical modalities. We here present results from a feasibility study, where Fourier Transform Near InfraRed (FT-NIR), Fourier Transform Mid InfraRed (FT-MIR), and Raman laser spectroscopy were applied to the same set of samples obtained from a pilot-scale beer brewing process. Quantitative PLS1 models

  11. Accept or Decline? An Analytics-Based Decision Tool for Kidney Offer Evaluation.

    Science.gov (United States)

    Bertsimas, Dimitris; Kung, Jerry; Trichakis, Nikolaos; Wojciechowski, David; Vagefi, Parsia A

    2017-12-01

    When a deceased-donor kidney is offered to a waitlisted candidate, the decision to accept or decline the organ relies primarily upon a practitioner's experience and intuition. Such decisions must achieve a delicate balance between estimating the immediate benefit of transplantation and the potential for future higher-quality offers. However, the current experience-based paradigm lacks scientific rigor and is subject to the inaccuracies that plague anecdotal decision-making. A data-driven analytics-based model was developed to predict whether a patient will receive an offer for a deceased-donor kidney at Kidney Donor Profile Index thresholds of 0.2, 0.4, and 0.6, and at timeframes of 3, 6, and 12 months. The model accounted for Organ Procurement Organization, blood group, wait time, DR antigens, and prior offer history to provide accurate and personalized predictions. Performance was evaluated on data sets spanning various lengths of time to understand the adaptability of the method. Using United Network for Organ Sharing match-run data from March 2007 to June 2013, out-of-sample area under the receiver operating characteristic curve was approximately 0.87 for all Kidney Donor Profile Index thresholds and timeframes considered for the 10 most populous Organ Procurement Organizations. As more data becomes available, area under the receiver operating characteristic curve values increase and subsequently level off. The development of a data-driven analytics-based model may assist transplant practitioners and candidates during the complex decision of whether to accept or forgo a current kidney offer in anticipation of a future high-quality offer. The latter holds promise to facilitate timely transplantation and optimize the efficiency of allocation.
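
    The out-of-sample discrimination reported above (AUC around 0.87) can be illustrated with a minimal rank-based AUC computation. This is a sketch only: the labels and scores below are invented, not UNOS match-run data, and the paper's actual model features are not reproduced.

```python
# Illustrative only: computes ROC AUC via the rank-based (Mann-Whitney)
# formulation, AUC = P(score of random positive > score of random negative),
# counting ties as 1/2. Data below are hypothetical.

def roc_auc(labels, scores):
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical outcomes: 1 = candidate received a KDPI <= 0.4 offer within
# 6 months; score = model-predicted probability of that event.
labels = [1, 1, 1, 0, 0, 0, 0, 1]
scores = [0.9, 0.8, 0.4, 0.3, 0.2, 0.5, 0.1, 0.7]
print(round(roc_auc(labels, scores), 3))  # 0.938
```
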

  12. Innovative analytical tools to characterize prebiotic carbohydrates of functional food interest.

    Science.gov (United States)

    Corradini, Claudio; Lantano, Claudia; Cavazza, Antonella

    2013-05-01

Functional foods are one of the most interesting areas of research and innovation in the food industry. A functional food or functional ingredient is considered to be any food or food component that provides health benefits beyond basic nutrition. Recently, consumers have shown interest in natural bioactive compounds as functional ingredients in the diet owing to their various beneficial effects for health. Water-soluble fibers and nondigestible oligosaccharides and polysaccharides can be defined as functional food ingredients. Fructooligosaccharides (FOS) and inulin are resistant to direct metabolism by the host and reach the caecocolon, where they are used by selected groups of beneficial bacteria. Furthermore, they are able to improve physical and structural properties of food, such as hydration, oil-holding capacity, viscosity, texture, sensory characteristics, and shelf-life. This article reviews major innovative analytical developments to screen and identify FOS, inulins, and the most employed nonstarch carbohydrates added or naturally present in functional food formulations. High-performance anion-exchange chromatography with pulsed electrochemical detection (HPAEC-PED) is one of the most employed analytical techniques for the characterization of those molecules. Mass spectrometry is also of great help, in particular matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS), which is able to provide extensive information regarding the molecular weight and length profiles of oligosaccharides and polysaccharides. Moreover, MALDI-TOF-MS in combination with HPAEC-PED has been shown to be of great value for the complementary information it can provide. Some other techniques, such as NMR spectroscopy, are also discussed, with relevant examples of recent applications. A number of articles have appeared in the literature in recent years regarding the analysis of inulin, FOS, and other carbohydrates of interest in the field and

  13. Review: Andrew Bennett & Jeffrey T. Checkel (Eds. (2015. Process Tracing: From Metaphor to Analytic Tool

    Directory of Open Access Journals (Sweden)

    Felix Anderl

    2015-09-01

Full Text Available In this review, I argue that this textbook edited by BENNETT and CHECKEL is exceptionally valuable in at least four aspects. First, with regards to form, the editors provide a paragon of how an edited volume should look: well-connected articles "speak to" and build on each other. The contributors refer to and grapple with the theoretical framework of the editors who, in turn, give heed to the conclusions of the contributors. Second, the book is packed with examples from research practice. These are not only named but thoroughly discussed and evaluated for their methodological potential in all chapters. Third, the book aims at improving and popularizing process tracing, but does not shy away from systematically considering the potential weaknesses of the approach. Fourth, the book combines and bridges various approaches to (mostly qualitative) methods and still manages to provide abstract and easily accessible standards for making "good" process tracing. As such, it is a must-read for scholars working with qualitative methods. However, BENNETT and CHECKEL struggle with fulfilling their promise of bridging positivist and interpretive approaches, for while they do indeed take the latter into account, their general research framework remains largely unchanged by these considerations. On these grounds, I argue that, especially for scholars in the positivist camp, the book can function as a "how-to" guide for designing and implementing research. Although this may not apply equally to interpretive researchers, the book is still a treasure chest for them, providing countless conceptual clarifications and potential pitfalls of process tracing practice. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1503187

  14. An analytical method on the surface residual stress for the cutting tool orientation

    Science.gov (United States)

    Li, Yueen; Zhao, Jun; Wang, Wei

    2010-03-01

In the experiments of this paper, the residual stress was measured for 8 cutting tool orientations while machining H13 die steel by high-speed milling (HSM). The measured data show that the residual stress varies periodically with the rake angle (β) and side rake angle (θ); further study found that the cutting tool orientation is closely related to the residual stress. Since the machined surface residual stress originates from the cutting force and the axial force, a simple tool-workpiece force model can be obtained, from which a residual stress model is deduced that makes it feasible to calculate the magnitude of the residual stress. Almost all the measured residual stresses are compressive, and their magnitude and direction can be confirmed from the input data for H13 on HSM. As a result, the residual stress model is the key to optimizing the rake angle (β) and side rake angle (θ) in theory, and with this theory more of the cutting mechanism can be explained.

  15. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    Science.gov (United States)

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study investigated first the homogenization kinetics of the different systems to focus then on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry to monitor the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was equally sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for development and production of pharmaceutical suspensions in the future. Copyright © 2011 Wiley-Liss, Inc.
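
    The characteristic homogenization times mentioned above come from fitting ultrasound-velocity curves with the authors' heuristic model, which is not reproduced here. As a stand-in, the sketch below assumes a simple first-order approach to a plateau, v(t) = v_inf - (v_inf - v0)·exp(-t/tau), and recovers the time constant by grid search; all numbers are invented.

```python
import math

# Hypothetical stand-in model for ultrasound velocity during homogenization:
#   v(t) = v_inf - (v_inf - v0) * exp(-t / tau)

def v_model(t, v0, v_inf, tau):
    return v_inf - (v_inf - v0) * math.exp(-t / tau)

def fit_tau(times, velocities, v0, v_inf, taus):
    """Grid-search the time constant minimizing the squared error."""
    def sse(tau):
        return sum((v - v_model(t, v0, v_inf, tau)) ** 2
                   for t, v in zip(times, velocities))
    return min(taus, key=sse)

# Synthetic data (m/s vs minutes) generated with a known tau = 12 min
times = list(range(0, 61, 5))
data = [v_model(t, 1480.0, 1495.0, 12.0) for t in times]
tau_hat = fit_tau(times, data, 1480.0, 1495.0,
                  [t / 10 for t in range(10, 300)])
print(tau_hat)  # 12.0
```
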

  16. Tools for the quantitative analysis of sedimentation boundaries detected by fluorescence optical analytical ultracentrifugation.

    Directory of Open Access Journals (Sweden)

    Huaying Zhao

    Full Text Available Fluorescence optical detection in sedimentation velocity analytical ultracentrifugation allows the study of macromolecules at nanomolar concentrations and below. This has significant promise, for example, for the study of systems of high-affinity protein interactions. Here we describe adaptations of the direct boundary modeling analysis approach implemented in the software SEDFIT that were developed to accommodate unique characteristics of the confocal fluorescence detection system. These include spatial gradients of signal intensity due to scanner movements out of the plane of rotation, temporal intensity drifts due to instability of the laser and fluorophores, and masking of the finite excitation and detection cone by the sample holder. In an extensive series of experiments with enhanced green fluorescent protein ranging from low nanomolar to low micromolar concentrations, we show that the experimental data provide sufficient information to determine the parameters required for first-order approximation of the impact of these effects on the recorded data. Systematic deviations of fluorescence optical sedimentation velocity data analyzed using conventional sedimentation models developed for absorbance and interference optics are largely removed after these adaptations, resulting in excellent fits that highlight the high precision of fluorescence sedimentation velocity data, thus allowing a more detailed quantitative interpretation of the signal boundaries that is otherwise not possible for this system.
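
    A first-order sketch of the kind of corrections described above (not SEDFIT's actual implementation): model the recorded fluorescence as the true signal multiplied by a linear spatial gain and a linear temporal drift, then divide both factors out. All parameter names and values here are hypothetical.

```python
# Remove a first-order spatial intensity gradient (slope a, about a reference
# radius r0 in cm) and a linear temporal drift (slope b per unit scan time).

def corrected(signal, r, t, a, b, r0=6.0):
    return signal / ((1.0 + a * (r - r0)) * (1.0 + b * t))

# Hypothetical: a true plateau of 1000 counts distorted by a 2%/cm spatial
# gradient and a 1%/min drift is recovered after correction.
true = 1000.0
r, t, a, b = 6.5, 10.0, 0.02, 0.01
recorded = true * (1.0 + a * (r - 6.0)) * (1.0 + b * t)
print(round(corrected(recorded, r, t, a, b), 6))  # 1000.0
```
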

  17. Thermal Lens Spectroscopy as a 'new' analytical tool for actinide determination in nuclear reprocessing processes

    International Nuclear Information System (INIS)

    Canto, Fabrice; Couston, Laurent; Magnaldo, Alastair; Broquin, Jean-Emmanuel; Signoret, Philippe

    2008-01-01

Thermal Lens Spectroscopy (TLS) consists of measuring the effects induced by the relaxation of molecules excited by photons. The CEA already worked on TLS twenty years ago, but technological limitations impeded its development at the time. Now, the need for sensitive analytical methods that use very low sample volumes (for example, traces of Np in the COEX™ process), together with the drive to reduce nuclear waste, encourages us to revisit this method thanks to improvements in optoelectronic technologies. We can also imagine coupling TLS with micro-fluidic technologies, decreasing the cost of experiments significantly. Generally two laser beams are used for TLS: one for the selective excitation by molecular absorption (inducing the thermal lens) and one for probing the thermal lens. They can be coupled in different geometries, collinear or perpendicular, depending on the application and on the laser mode. Many possibilities for detecting the thermal lens signal have also been studied: interferometry, direct intensity variations, deflection, etc. In this paper, one geometrical configuration and two measurements have been theoretically evaluated. For single photodiode detection (z-scan) the limit of detection is calculated to be near 5×10^-6 mol·L^-1 for Np(IV) in dodecane. (authors)
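
    The abstract quotes a theoretical detection limit near 5×10^-6 mol/L for Np(IV). The paper's derivation is photothermal, but a generic way to turn a calibration slope and blank noise into a limit of detection is the 3-sigma criterion, sketched below with made-up numbers.

```python
# 3-sigma detection limit: LOD = 3 * sigma_blank / calibration slope.

def limit_of_detection(sigma_blank, slope):
    return 3.0 * sigma_blank / slope

# Hypothetical numbers: blank noise of 1e-4 signal units and a calibration
# slope of 60 signal units per mol/L give an LOD of 5e-6 mol/L.
lod = limit_of_detection(1e-4, 60.0)
print(f"{lod:.1e} mol/L")
```
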

  18. External beam milli-PIXE as analytical tool for Neolithic obsidian provenance studies

    Energy Technology Data Exchange (ETDEWEB)

Constantinescu, B.; Cristea-Stan, D. [National Institute for Nuclear Physics and Engineering Horia Hulubei, Bucharest-Magurele (Romania); Kovács, I.; Szõkefalvi-Nagy, Z. [Wigner Research Centre for Physics, Institute for Particle and Nuclear Physics, Budapest (Hungary)

    2013-07-01

Full text: Obsidian is the most important archaeological material used for tools and weapons before the appearance of metals. Its geological sources are limited and concentrated in a few geographical zones: Armenia, Eastern Anatolia, the Italian Lipari and Sardinia islands, the Greek Melos and Yali islands, and the Hungarian and Slovak Tokaj Mountains. Due to this fact, in the Mesolithic and Neolithic periods obsidian was the first archaeological material intensively traded, even over long distances. To determine the geological provenance of obsidian and to identify the prehistoric long-range trade routes and possible population migrations, elemental concentration ratios can help a lot, since each geological source has its 'fingerprints'. In this work the external milli-PIXE technique was applied for elemental concentration ratio determinations in some Neolithic tools found in Transylvania and in the Iron Gates region near the Danube, and on a few relevant geological samples (Slovak Tokaj Mountains, Lipari, Armenia). In Transylvania (the North-Western part of Romania, a region surrounded by the Carpathian Mountains), Neolithic obsidian tools were discovered mainly in three regions: North-West - Oradea (near the border with Hungary, Slovakia and Ukraine), Centre - Cluj, and Southwest - Banat (near the border with Serbia). A special case is the Iron Gates - Mesolithic and Early Neolithic sites, directly related to the appearance of agriculture replacing the Mesolithic economy based on hunting and fishing. Three long-distance trade routes could be considered: from the Caucasus Mountains via the North of the Black Sea, from the Greek islands or Asia Minor via the ex-Yugoslavia area or via Greece-Bulgaria, or from Central Europe - the Tokaj Mountains - in the case of obsidian. As provenance 'fingerprints', we focused on Ti to Mn, and Rb-Sr-Y-Zr ratios. The measurements were performed at the external milli-PIXE beam-line of the 5MV VdG accelerator of the Wigner RCP. Proton energy of 3MeV and beam currents in the range of 1 ±1 D

  19. External beam milli-PIXE as analytical tool for Neolithic obsidian provenance studies

    International Nuclear Information System (INIS)

    Constantinescu, B.; Cristea-Stan, D.; Kovács, I.; Szõkefalvi-Nagy, Z.

    2013-01-01

Full text: Obsidian is the most important archaeological material used for tools and weapons before the appearance of metals. Its geological sources are limited and concentrated in a few geographical zones: Armenia, Eastern Anatolia, the Italian Lipari and Sardinia islands, the Greek Melos and Yali islands, and the Hungarian and Slovak Tokaj Mountains. Due to this fact, in the Mesolithic and Neolithic periods obsidian was the first archaeological material intensively traded, even over long distances. To determine the geological provenance of obsidian and to identify the prehistoric long-range trade routes and possible population migrations, elemental concentration ratios can help a lot, since each geological source has its 'fingerprints'. In this work the external milli-PIXE technique was applied for elemental concentration ratio determinations in some Neolithic tools found in Transylvania and in the Iron Gates region near the Danube, and on a few relevant geological samples (Slovak Tokaj Mountains, Lipari, Armenia). In Transylvania (the North-Western part of Romania, a region surrounded by the Carpathian Mountains), Neolithic obsidian tools were discovered mainly in three regions: North-West - Oradea (near the border with Hungary, Slovakia and Ukraine), Centre - Cluj, and Southwest - Banat (near the border with Serbia). A special case is the Iron Gates - Mesolithic and Early Neolithic sites, directly related to the appearance of agriculture replacing the Mesolithic economy based on hunting and fishing. Three long-distance trade routes could be considered: from the Caucasus Mountains via the North of the Black Sea, from the Greek islands or Asia Minor via the ex-Yugoslavia area or via Greece-Bulgaria, or from Central Europe - the Tokaj Mountains - in the case of obsidian. As provenance 'fingerprints', we focused on Ti to Mn, and Rb-Sr-Y-Zr ratios. The measurements were performed at the external milli-PIXE beam-line of the 5MV VdG accelerator of the Wigner RCP. Proton energy of 3MeV and beam currents in the range of 1 ±1 D
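
    A minimal sketch of the provenance logic the two records above describe: each geological source has characteristic elemental ratios, and an artefact is assigned to the source whose ratio vector is closest. The ratio values below are invented placeholders, not measured fingerprints of these sources.

```python
import math

# Hypothetical (Ti/Mn, Rb/Zr) fingerprints per source -- illustrative only.
SOURCES = {
    "Tokaj": (3.2, 1.8),
    "Lipari": (5.1, 0.9),
    "Armenia": (2.0, 2.5),
}

def nearest_source(ti_mn, rb_zr):
    """Assign an artefact to the source with the closest ratio vector."""
    def dist(src):
        s_ti, s_rb = SOURCES[src]
        return math.hypot(ti_mn - s_ti, rb_zr - s_rb)
    return min(SOURCES, key=dist)

print(nearest_source(3.0, 1.7))  # Tokaj
```
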

  20. An analytical tool to support the pedestrianisation process: The case of via Roma, Cagliari

    Directory of Open Access Journals (Sweden)

    Alfonso Annunziata

    2018-04-01

Full Text Available The article focuses on the case of the modification of an urban road network: the transformation of a portion of an important distributor road in the urban area of Cagliari into a pedestrian space. By means of this case study the article aims to point out how pedestrianisation interventions have not been completely defined within a theoretical system that clearly establishes modes and conditions of implementation. This lack of theorization has led to the common understanding of pedestrianisation as a good operation in and of itself and, as such, exportable and meant to produce the same effects everywhere (Bianchetti, 2016). This analysis uses the fundamental conditions of hierarchy as a tool to assess to what extent the modification of the road network articulation has resulted in conditions of lesser inter-connectivity, legibility and functionality. In this perspective the article proposes a system of criteria, founded on the principles of hierarchy, meant to be a theoretical support for processes of pedestrianisation.

  1. Puzzle test: A tool for non-analytical clinical reasoning assessment.

    Science.gov (United States)

    Monajemi, Alireza; Yaghmaei, Minoo

    2016-01-01

Most contemporary clinical reasoning tests typically assess non-automatic thinking. Therefore, a test is needed to measure automatic reasoning or pattern recognition, which has been largely neglected in clinical reasoning tests. The Puzzle Test (PT) is dedicated to assessing automatic clinical reasoning in routine situations. This test was first introduced in 2009 by Monajemi et al. in the Olympiad for Medical Sciences Students. PT is an item format that has gained acceptance in medical education, but no detailed guidelines exist for this test's format, construction and scoring. In this article, a format is described and the steps to prepare and administer valid and reliable PTs are presented. PT examines a specific clinical reasoning task: pattern recognition. PT does not replace other clinical reasoning assessment tools. However, it complements them in strategies for assessing comprehensive clinical reasoning.

  2. Thermodynamics and structure of liquid surfaces investigated directly with surface analytical tools

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Gunther [Flinders Univ., Adelaide, SA (Australia). Centre for NanoScale Science and Technology; Morgner, Harald [Leipzig Univ. (Germany). Wilhelm Ostwald Inst. for Physical and Theoretical Chemistry

    2017-06-15

Measuring directly the composition, the distribution of constituents as a function of depth, and the orientation of molecules at liquid surfaces is essential for determining the physicochemical properties of liquid surfaces. While the experimental tools that have been developed for analyzing solid surfaces can in principle be applied to liquid surfaces, it turned out that they had to be adjusted to the particular challenges imposed by liquid samples, e.g. the unavoidable vapor pressure and the mobility of the constituting atoms/molecules. In the present work it is shown how electron spectroscopy and ion scattering spectroscopy have been used for analyzing liquid surfaces. The emphasis of this review is on using the structural information gained for determining the physicochemical properties of liquid surfaces. (copyright 2017 by WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  3. Interactive network analytical tool for instantaneous bespoke interrogation of food safety notifications.

    Science.gov (United States)

    Nepusz, Tamás; Petróczi, Andrea; Naughton, Declan P

    2012-01-01

The globalization of food supply necessitates continued advances in regulatory control measures to ensure that citizens enjoy safe and adequate nutrition. The aim of this study was to extend previous reports on network analysis relating to food notifications by including an optional filter by type of notification and, in cases of contamination, by type of contaminant in the notified foodstuff. A filter function has been applied to enable processing of selected notifications by contaminant or type of notification to i) capture complexity, ii) analyze trends, and iii) identify patterns of reporting activities between countries. The program rapidly assesses nations' roles as transgressor and/or detector for each category of contaminant and for the key class of border rejection. In the open access demonstration version, the majority of notifications in the Rapid Alert System for Food and Feed were categorized by contaminant type as mycotoxin (50.4%), heavy metals (10.9%) or bacteria (20.3%). Examples are given demonstrating how network analytical approaches complement, and in some cases supersede, descriptive statistics such as frequency counts, which may give limited or potentially misleading information. One key feature is that network analysis takes the relationship between transgressor and detector countries, along with number of reports and impact, simultaneously into consideration. Furthermore, the indices that complement the network maps and reflect each country's transgressor and detector activities allow comparisons to be made between (transgressing vs. detecting) as well as within (e.g. transgressing) activities. This further development of the network analysis approach to food safety contributes to a better understanding of the complexity of the effort ensuring food is safe for consumption in the European Union. The unique patterns of the interplay between detectors and transgressors, instantly revealed by our approach, could supplement the intelligence
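
    The transgressor/detector bookkeeping described above can be sketched as a tiny edge list with per-country weighted degree counts and an optional contaminant filter. The notification records below are invented, not RASFF data.

```python
from collections import Counter

# Each notification is an edge: (detecting country, transgressing country,
# contaminant category). Hypothetical records for illustration.
notifications = [
    ("DE", "TR", "mycotoxin"),
    ("DE", "CN", "heavy metals"),
    ("IT", "TR", "mycotoxin"),
    ("NL", "TR", "bacteria"),
    ("IT", "IN", "mycotoxin"),
]

def activity_indices(records, contaminant=None):
    """Detector and transgressor counts, optionally filtered by contaminant."""
    kept = [r for r in records if contaminant in (None, r[2])]
    detectors = Counter(d for d, _, _ in kept)
    transgressors = Counter(t for _, t, _ in kept)
    return detectors, transgressors

det, trans = activity_indices(notifications, contaminant="mycotoxin")
print(trans.most_common(1))  # [('TR', 2)]
```
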

  4. Interactive network analytical tool for instantaneous bespoke interrogation of food safety notifications.

    Directory of Open Access Journals (Sweden)

    Tamás Nepusz

Full Text Available The globalization of food supply necessitates continued advances in regulatory control measures to ensure that citizens enjoy safe and adequate nutrition. The aim of this study was to extend previous reports on network analysis relating to food notifications by including an optional filter by type of notification and, in cases of contamination, by type of contaminant in the notified foodstuff. A filter function has been applied to enable processing of selected notifications by contaminant or type of notification to (i) capture complexity, (ii) analyze trends, and (iii) identify patterns of reporting activities between countries. The program rapidly assesses nations' roles as transgressor and/or detector for each category of contaminant and for the key class of border rejection. In the open access demonstration version, the majority of notifications in the Rapid Alert System for Food and Feed were categorized by contaminant type as mycotoxin (50.4%), heavy metals (10.9%) or bacteria (20.3%). Examples are given demonstrating how network analytical approaches complement, and in some cases supersede, descriptive statistics such as frequency counts, which may give limited or potentially misleading information. One key feature is that network analysis takes the relationship between transgressor and detector countries, along with number of reports and impact, simultaneously into consideration. Furthermore, the indices that complement the network maps and reflect each country's transgressor and detector activities allow comparisons to be made between (transgressing vs. detecting) as well as within (e.g. transgressing) activities. This further development of the network analysis approach to food safety contributes to a better understanding of the complexity of the effort ensuring food is safe for consumption in the European Union. The unique patterns of the interplay between detectors and transgressors, instantly revealed by our approach, could supplement the

  5. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation

    Science.gov (United States)

    Bergeron, Dominic; Tremblay, A.-M. S.

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ2 with respect to α , and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.
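
    Point (2) above locates the entropy weight α at the crossover between the noise-fitting regime (χ² flat at small α) and the information-losing regime (χ² growing at large α). A minimal sketch of that diagnostic, on a synthetic χ²(α) curve invented for illustration, picks α at the maximum curvature of log χ² vs log α.

```python
import math

def crossover_alpha(alphas, chi2s):
    """Alpha at the kink of log10(chi2) vs log10(alpha), found via the
    discrete second difference as a curvature proxy (uniform log grid)."""
    y = [math.log10(c) for c in chi2s]
    curv = [y[i - 1] - 2 * y[i] + y[i + 1] for i in range(1, len(y) - 1)]
    return alphas[1 + curv.index(max(curv))]

# Synthetic chi2(alpha): a plateau at small alpha, then growth ~ alpha.
alphas = [10 ** (k / 2) for k in range(-8, 9)]   # 1e-4 ... 1e4
chi2s = [100.0 + a for a in alphas]
print(crossover_alpha(alphas, chi2s))  # 100.0
```
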

  6. Evaluation of Heat Flux Measurement as a New Process Analytical Technology Monitoring Tool in Freeze Drying.

    Science.gov (United States)

    Vollrath, Ilona; Pauli, Victoria; Friess, Wolfgang; Freitag, Angelika; Hawe, Andrea; Winter, Gerhard

    2017-05-01

This study investigates the suitability of heat flux measurement as a new technique for monitoring product temperature and critical end points during freeze drying. The heat flux sensor is tightly mounted on the shelf and measures non-invasively (no contact with the product) the heat transferred from shelf to vial. Heat flux data were compared to comparative pressure measurement, thermocouple readings, and Karl Fischer titration as current state of the art monitoring techniques. The whole freeze drying process including freezing (both by ramp freezing and controlled nucleation) and primary and secondary drying was considered. We found that direct measurement of the transferred heat enables more insights into thermodynamics of the freezing process. Furthermore, a vial heat transfer coefficient can be calculated from heat flux data, which ultimately provides a non-invasive method to monitor product temperature throughout primary drying. The end point of primary drying determined by heat flux measurements was in accordance with the one defined by thermocouples. During secondary drying, heat flux measurements could not indicate the progress of drying in the way that monitoring the residual moisture content does. In conclusion, heat flux measurements are a promising new non-invasive tool for lyophilization process monitoring and development using energy transfer as a control parameter. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
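
    A minimal sketch of the vial heat transfer coefficient idea described above, assuming the simple steady-state relation q = Kv·(Ts − Tp) with q the heat flux per unit area: once Kv is calibrated, the relation can be inverted to follow product temperature non-invasively. All numbers below are illustrative, not from the study.

```python
def kv_from_flux(q, t_shelf, t_product):
    """Vial heat transfer coefficient Kv in W/(m^2*K) from flux q in W/m^2."""
    return q / (t_shelf - t_product)

def product_temp(q, t_shelf, kv):
    """Invert q = Kv * (Ts - Tp) to monitor Tp during primary drying."""
    return t_shelf - q / kv

# Hypothetical calibration point and a later in-process flux reading:
kv = kv_from_flux(q=75.0, t_shelf=-10.0, t_product=-35.0)   # 3.0 W/(m^2*K)
print(product_temp(q=60.0, t_shelf=-10.0, kv=kv))           # -30.0 degC
```
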

  7. DR-Integrator: a new analytic tool for integrating DNA copy number and gene expression data.

    Science.gov (United States)

    Salari, Keyan; Tibshirani, Robert; Pollack, Jonathan R

    2010-02-01

    DNA copy number alterations (CNA) frequently underlie gene expression changes by increasing or decreasing gene dosage. However, only a subset of genes with altered dosage exhibit concordant changes in gene expression. This subset is likely to be enriched for oncogenes and tumor suppressor genes, and can be identified by integrating these two layers of genome-scale data. We introduce DNA/RNA-Integrator (DR-Integrator), a statistical software tool to perform integrative analyses on paired DNA copy number and gene expression data. DR-Integrator identifies genes with significant correlations between DNA copy number and gene expression, and implements a supervised analysis that captures genes with significant alterations in both DNA copy number and gene expression between two sample classes. DR-Integrator is freely available for non-commercial use from the Pollack Lab at http://pollacklab.stanford.edu/ and can be downloaded as a plug-in application to Microsoft Excel and as a package for the R statistical computing environment. The R package is available under the name 'DRI' at http://cran.r-project.org/. An example analysis using DR-Integrator is included as supplemental material. Supplementary data are available at Bioinformatics online.
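
    DR-Integrator's first analysis correlates DNA copy number with expression per gene; the sketch below reimplements just that core idea as a plain Pearson correlation (the tool itself adds significance estimation and a supervised two-class analysis, omitted here). The sample values are invented.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical gene: log2 copy-number ratios and log2 expression across six
# tumor samples -- a dosage-driven gene shows a strong positive r.
copy_number = [-0.8, -0.2, 0.0, 0.4, 1.1, 1.6]
expression = [-1.0, -0.1, 0.2, 0.5, 1.4, 2.0]
print(round(pearson(copy_number, expression), 3))
```
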

  8. Demonstration of FBRM as process analytical technology tool for dewatering processes via CST correlation.

    Science.gov (United States)

    Cobbledick, Jeffrey; Nguyen, Alexander; Latulippe, David R

    2014-07-01

    The current challenges associated with the design and operation of net-energy positive wastewater treatment plants demand sophisticated approaches for the monitoring of polymer-induced flocculation. In anaerobic digestion (AD) processes, the dewaterability of the sludge is typically assessed from off-line lab-bench tests - the capillary suction time (CST) test is one of the most common. Focused beam reflectance measurement (FBRM) is a promising technique for real-time monitoring of critical performance attributes in large scale processes and is ideally suited for dewatering applications. The flocculation performance of twenty-four cationic polymers, that spanned a range of polymer size and charge properties, was measured using both the FBRM and CST tests. Analysis of the data revealed a decreasing monotonic trend; the samples that had the highest percent removal of particles less than 50 microns in size as determined by FBRM had the lowest CST values. A subset of the best performing polymers was used to evaluate the effects of dosage amount and digestate sources on dewatering performance. The results from this work show that FBRM is a powerful tool that can be used for optimization and on-line monitoring of dewatering processes. Copyright © 2014 Elsevier Ltd. All rights reserved.
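
    The decreasing monotonic trend described above (higher FBRM removal of sub-50 micron particles, lower CST) is naturally summarized by a Spearman rank correlation. The polymer data below are invented for illustration; the sketch assumes no tied values.

```python
def rank(values):
    """0-based ranks (no tie handling, for illustration only)."""
    order = sorted(range(len(values)), key=values.__getitem__)
    r = [0] * len(values)
    for rank_i, idx in enumerate(order):
        r[idx] = rank_i
    return r

def spearman(x, y):
    """Spearman rho via the classic 1 - 6*sum(d^2)/(n(n^2-1)) formula."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

percent_removal = [85.0, 72.0, 60.0, 45.0, 30.0]  # FBRM, % fines removed
cst_seconds = [12.0, 15.0, 22.0, 31.0, 48.0]      # capillary suction time
print(spearman(percent_removal, cst_seconds))     # -1.0, perfectly inverse
```
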

  9. Analytical tool for the periodic safety analysis of NPP according to the PSA guideline. Vol. 1

    International Nuclear Information System (INIS)

    Balfanz, H.P.; Boehme, E.; Musekamp, W.; Hussels, U.; Becker, G.; Behr, H.; Luettgert, H.

    1994-01-01

    The SAIS (Safety Analysis and Information System) programme system is based on an integrated data base, which consists of a plant-data part and a PSA-related data part. Using SAIS, analyses can be performed by special tools, which are connected directly to the data base. Two main editors, RISA+ and DEDIT, are used for data base management. The data base is accessed via different types of pages, which are displayed on a computer screen; these pages are called data sheets. Sets of input and output data sheets were implemented, such as system or component data sheets, fault trees or event trees. All input information, models and results needed for updated PSA results (Living PSA) can be stored in the SAIS. The programme system contains the editor KVIEW, which guarantees consistency of the stored data, e.g. with respect to names and codes of components and events. The information contained in the data base is called up by a standardized user guide programme, called Page Editor. (Reference NPP: Brunsbuettel). (orig./HP) [de

  10. Evaluation and Selection Process of Suppliers Through an Analytical Framework: Empirical Evidence of an Evaluation Tool

    Directory of Open Access Journals (Sweden)

    Imeri Shpend

    2015-09-01

    Full Text Available The supplier selection process is very important to companies, as selecting the right suppliers that fit a company's strategic needs brings drastic savings. Therefore, this paper addresses the key area of supplier evaluation from the supplier review perspective. The purpose was to identify the most important criteria for supplier evaluation and to develop an evaluation tool based on the surveyed criteria. The research was conducted through a structured questionnaire, and the sample focused on small to medium-sized companies (SMEs) in Greece. In total, eighty companies participated in the survey, answering the full questionnaire, which asked whether these companies utilize supplier evaluation criteria and which criteria, if any, are applied. The main statistical instrument used in the study is Principal Component Analysis (PCA). The research has shown that the main criteria are: the attitude of the vendor towards the customer, supplier delivery time, product quality and price. Conclusions are made on the suitability and usefulness of supplier evaluation criteria and on the way they are applied in enterprises.
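
    One simple way to turn the surveyed criteria (vendor attitude, delivery time, product quality, price) into an evaluation tool is a weighted-sum score. The weights and ratings below are invented for illustration; they are not the study's PCA results:

    ```python
    def score_supplier(ratings, weights):
        """Weighted-sum supplier score; ratings and weights keyed by criterion."""
        total = sum(weights.values())
        return sum(ratings[c] * w for c, w in weights.items()) / total

    # Criteria found most important in the survey; the weights are illustrative.
    weights = {"attitude": 0.3, "delivery": 0.25, "quality": 0.25, "price": 0.2}
    a = score_supplier({"attitude": 4, "delivery": 5, "quality": 4, "price": 3}, weights)
    b = score_supplier({"attitude": 3, "delivery": 3, "quality": 5, "price": 5}, weights)
    print(a > b)  # → True
    ```

    A real tool would derive the weights from the data (e.g. from the PCA loadings) rather than assign them by hand.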

  11. ProteoLens: a visual analytic tool for multi-scale database-driven biological network data mining.

    Science.gov (United States)

    Huan, Tianxiao; Sivachenko, Andrey Y; Harrison, Scott H; Chen, Jake Y

    2008-08-12

    New systems biology studies require researchers to understand how the interplay among myriads of biomolecular entities is orchestrated in order to achieve high-level cellular and physiological functions. Many software tools have been developed in the past decade to help researchers visually navigate large networks of biomolecular interactions with built-in template-based query capabilities. To further advance researchers' ability to interrogate global physiological states of cells through multi-scale visual network explorations, new visualization software tools still need to be developed to empower the analysis. A robust visual data analysis platform driven by database management systems, performing bi-directional data processing-to-visualization with declarative querying capabilities, is needed. We developed ProteoLens as a Java-based visual analytic software tool for creating, annotating and exploring multi-scale biological networks. It supports direct database connectivity to either Oracle or PostgreSQL database tables/views, on which SQL statements using both Data Definition Language (DDL) and Data Manipulation Language (DML) may be specified. The robust query languages embedded directly within the visualization software help users bring their network data into a visualization context for annotation and exploration. ProteoLens supports graph/network data in the standard Graph Modeling Language (GML) format, which enables interoperation with a wide range of other visual layout tools. The architectural design of ProteoLens de-couples complex network data visualization tasks into two distinct phases: 1) creating network data association rules, which are mapping rules between network node IDs or edge IDs and data attributes such as functional annotations, expression levels, scores, synonyms, descriptions etc.; 2) applying network data association rules to build the network and perform the visual annotation of graph nodes and edges

  12. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: An anticipated analytical tool for food safety

    International Nuclear Information System (INIS)

    Hervas, Miriam; Lopez, Miguel Angel; Escarpa, Alberto

    2009-01-01

    In this work, an electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme is based on a direct competitive immunoassay method in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as the enzymatic label. Amperometric detection has been achieved through the addition of hydrogen peroxide substrate and hydroquinone as mediator. The analytical performance of the electrochemical immunoassay has been evaluated by analysis of maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 μg L⁻¹ and an EC50 of 0.079 μg L⁻¹ were obtained, allowing the detection of the zearalenone mycotoxin to be assessed. In addition, excellent accuracy, with a high recovery yield ranging between 95 and 108%, has been obtained. The analytical features show the proposed electrochemical immunoassay to be a very powerful and timely screening tool for the food safety scene.
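
    Detection limits like the one quoted here are often derived from a blank-plus-three-standard-deviations criterion applied to a local calibration. The sketch below is a generic illustration with invented signals and a hypothetical linear slope, not the paper's data:

    ```python
    import statistics

    def lod_blank3sd(blank_signals, slope, intercept):
        """LOD as the concentration whose predicted signal sits 3 SDs below
        the blank mean; in a competitive immunoassay the signal falls with
        concentration (signal = intercept + slope * conc, slope < 0)."""
        mean_b = statistics.mean(blank_signals)
        sd_b = statistics.stdev(blank_signals)
        return (mean_b - 3 * sd_b - intercept) / slope

    # Illustrative numbers only, not the paper's raw data.
    blanks = [100.0, 99.0, 101.0, 100.0]   # blank amperometric signals
    lod = lod_blank3sd(blanks, slope=-400.0, intercept=100.0)
    print(round(lod, 4))  # → 0.0061 (µg/L in this toy calibration)
    ```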

  13. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by a Butyrylcholinesterase Activity Assay.

    Science.gov (United States)

    Pohanka, Miroslav

    2015-06-11

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated using Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests; some poisonings, etc.). It can be concluded that the assay is expected to be of practical applicability because of the results' relevance.
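
    The RGB step, averaging each color channel over the photographed strip region, can be sketched as follows. The pixel values are invented; relating channel shifts to BChE activity would require the calibration described in the paper:

    ```python
    def channel_means(pixels):
        """Mean R, G, B over a list of (r, g, b) pixels from the strip photo."""
        n = len(pixels)
        return tuple(sum(p[i] for p in pixels) / n for i in range(3))

    # Invented pixels from a strip turning indigo blue: the blue channel is
    # assumed high relative to red as the colored product forms.
    strip = [(120, 60, 200), (118, 62, 198), (122, 58, 202)]
    r, g, b = channel_means(strip)
    print(r, g, b)  # → 120.0 60.0 200.0
    ```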

  14. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by a Butyrylcholinesterase Activity Assay

    Directory of Open Access Journals (Sweden)

    Miroslav Pohanka

    2015-06-01

    Full Text Available Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated using Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests; some poisonings, etc.). It can be concluded that the assay is expected to be of practical applicability because of the results' relevance.

  15. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: An anticipated analytical tool for food safety

    Energy Technology Data Exchange (ETDEWEB)

    Hervas, Miriam; Lopez, Miguel Angel [Departamento Quimica Analitica, Universidad de Alcala, Ctra. Madrid-Barcelona, Km. 33600, E-28871 Alcala de Henares, Madrid (Spain); Escarpa, Alberto, E-mail: alberto.escarpa@uah.es [Departamento Quimica Analitica, Universidad de Alcala, Ctra. Madrid-Barcelona, Km. 33600, E-28871 Alcala de Henares, Madrid (Spain)

    2009-10-27

    In this work, electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme has been based on a direct competitive immunoassay method in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as enzymatic label. Amperometric detection has been achieved through the addition of hydrogen peroxide substrate and hydroquinone as mediator. Analytical performance of the electrochemical immunoassay has been evaluated by analysis of maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 μg L⁻¹ and an EC50 of 0.079 μg L⁻¹ were obtained allowing the assessment of the detection of zearalenone mycotoxin. In addition, an excellent accuracy with a high recovery yield ranging between 95 and 108% has been obtained. The analytical features have shown the proposed electrochemical immunoassay to be a very powerful and timely screening tool for the food safety scene.

  16. Real-time imaging as an emerging process analytical technology tool for monitoring of fluid bed coating process.

    Science.gov (United States)

    Naidu, Venkata Ramana; Deshpande, Rucha S; Syed, Moinuddin R; Wakte, Pravin S

    2018-07-01

    A direct imaging system (Eyecon™) was used as a Process Analytical Technology (PAT) tool to monitor a fluid bed coating process. Eyecon™ generated real-time on-screen images and particle size and shape information for two identically manufactured laboratory-scale batches. Eyecon™ measures particle size increases with an accuracy of ±1 μm for particles in the size range of 50-3000 μm, and it captured data every 2 s during the entire process. The moving average of the D90 particle size values recorded by Eyecon™ was calculated every 30 min to derive the radial coating thickness of the coated particles. After completion of the coating process, the radial coating thickness was found to be 11.3 and 9.11 μm, with standard deviations of ±0.68 and ±1.8 μm for Batch 1 and Batch 2, respectively. The coating thickness was also correlated with percent weight build-up by gel permeation chromatography (GPC) and dissolution. GPC indicated weight build-up of 10.6% and 9.27% for Batch 1 and Batch 2, respectively. In conclusion, a weight build-up of 10% can be correlated with a 10 ± 2 μm increase in the coating thickness of pellets, indicating the potential applicability of real-time imaging as an endpoint determination tool for fluid bed coating processes.
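
    The radial-thickness arithmetic behind these numbers is simple: smooth the D90 trace, then halve the diameter growth, since coating builds up on both sides of a pellet. A sketch with invented D90 values, where a 3-point window stands in for the 30-min moving average:

    ```python
    def moving_average(xs, window):
        """Simple trailing moving average over a fixed window."""
        return [sum(xs[i - window + 1:i + 1]) / window
                for i in range(window - 1, len(xs))]

    def radial_coating_thickness(d90_start, d90_end):
        """Diameter growth split across both sides of the pellet."""
        return (d90_end - d90_start) / 2.0

    d90 = [500.0, 502.0, 504.0, 506.0, 508.0, 510.0]  # µm, invented trend
    smooth = moving_average(d90, 3)
    print(radial_coating_thickness(smooth[0], smooth[-1]))  # → 3.0
    ```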

  17. Rapid process development of chromatographic process using direct analysis in real time mass spectrometry as a process analytical technology tool.

    Science.gov (United States)

    Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin

    2014-06-01

    The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time have caused some resistance to QbD implementation. To show a possible solution, this work proposes a rapid process development method that uses direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool, taking the chromatographic process of Ginkgo biloba L. as an example. The breakthrough curves were rapidly determined by DART-MS at-line. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and by HPLC. Based on the PAT tool, the impacts of process parameters on the adsorption capacity were discovered rapidly: the adsorption capacity decreased as the flow rate increased. This work has shown the feasibility and advantages of integrating PAT into QbD implementation for rapid process development. Copyright © 2014 Elsevier B.V. All rights reserved.
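
    Validating an at-line PAT signal against the reference method, as done here for DART-MS versus HPLC, amounts to a linear calibration plus a correlation check. A least-squares sketch with invented intensities and concentrations:

    ```python
    def linear_fit(xs, ys):
        """Ordinary least squares: ys ≈ a*xs + b."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
        return a, my - a * mx

    # At-line DART-MS peak intensities vs. HPLC reference concentrations;
    # the values are invented for illustration.
    dart = [100.0, 200.0, 300.0, 400.0]   # arbitrary intensity units
    hplc = [0.5, 1.0, 1.5, 2.0]           # mg/mL ginkgolide A (toy values)
    a, b = linear_fit(dart, hplc)
    print(a, round(b, 10))  # → 0.005 0.0
    ```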

  18. Born analytical or adopted over time? A study investigating if new analytical tools can ensure the survival of market-oriented startups.

    OpenAIRE

    Skogen, Hege Janson; De la Cruz, Kai

    2017-01-01

    Masteroppgave (MSc) in Master of Science in Strategic Marketing Management - Handelshøyskolen BI, 2017. This study investigates whether the prevalence of technological advances within quantitative analytics moderates the effect market orientation has on firm performance, and whether startups can take advantage of the potential opportunities to ensure their own survival. For this purpose, the authors review previous literature in market orientation, startups, marketing analytics, an...

  19. Measurement of very low amounts of arsenic in soils and waters: is ICP-MS the indispensable analytical tool?

    Science.gov (United States)

    López-García, Ignacio; Marín-Hernández, Juan Jose; Perez-Sirvent, Carmen; Hernandez-Cordoba, Manuel

    2017-04-01

    The toxicity of arsenic and its wide distribution in nature need no further emphasis, and the value of reliable analytical tools for arsenic determination at very low levels is clear. Leaving aside atomic fluorescence spectrometers specifically designed for this purpose, the task is currently carried out using inductively coupled plasma mass spectrometry (ICP-MS), a powerful but expensive technique that is not available in all laboratories. However, as the recent literature clearly shows, a similar or even better analytical performance for the determination of several elements can be achieved by replacing the ICP-MS instrument with an AAS spectrometer (which is commonly present in any laboratory and involves low acquisition and maintenance costs) provided that a simple microextraction step is used to preconcentrate the sample. This communication reports the optimization of, and the results obtained with, a new analytical procedure based on this idea and focused on the determination of very low concentrations of arsenic in waters and extracts from soils and sediments. The procedure is based on a micro-solid phase extraction process for the separation and preconcentration of arsenic that uses magnetic particles covered with silver nanoparticles functionalized with the sodium salt of 2-mercaptoethane-sulphonate (MESNa). This composite is easily obtained in the laboratory. After the sample is treated with a small amount (only a few milligrams) of the magnetic material, the solid phase is separated by means of a magnetic field and then introduced into an electrothermal atomizer (ETAAS) for arsenic determination. The preconcentration factor is close to 200, with a detection limit below 0.1 µg L⁻¹ arsenic. Speciation of As(III) and As(V) can be achieved by means of two extractions carried out at different acidity. The results for total arsenic were verified using certified reference materials. The authors are grateful to the Comunidad Autonóma de la
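
    A preconcentration factor close to 200 is essentially the sample-to-measured-phase volume ratio, scaled by the extraction recovery. The volumes below are illustrative assumptions, not the published procedure:

    ```python
    def preconcentration_factor(sample_ul, eluate_ul, recovery=1.0):
        """Enrichment from micro-solid phase extraction: volume ratio
        scaled by the fraction of analyte actually recovered."""
        return recovery * sample_ul / eluate_ul

    # Hypothetical example: a 10 mL sample concentrated into a 50 µL phase.
    print(preconcentration_factor(10000, 50))  # → 200.0
    ```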

  20. The Application of State-of-the-Art Analytic Tools (Biosensors and Spectroscopy) in Beverage and Food Fermentation Process Monitoring

    Directory of Open Access Journals (Sweden)

    Shaneel Chandra

    2017-09-01

    Full Text Available The production of several agricultural products and foods is linked with fermentation. Traditional methods used to control and monitor the quality of the products and processes are based on simple chemical analysis. However, these methods are time-consuming and do not provide enough relevant information to track the chemical changes during the process. Commonly used methods in the agriculture and food industries to monitor fermentation are based on simple or single-point sensors, where only one parameter is measured (e.g., temperature or density). These sensors are read several times per day and are often the only source of data from which the conditions and rate of fermentation are monitored. In the modern food industry, an ideal method to control and monitor the fermentation process should enable a direct, rapid, precise, and accurate determination of several target compounds, with minimal to no sample preparation or reagent consumption. Here, state-of-the-art advancements in the application of sensors and analytical tools to monitor beverage and food fermentation processes are discussed.

  1. High pleural fluid adenosine deaminase levels: A valuable tool for ...

    African Journals Online (AJOL)

    To determine the positive predictive value (PPV) of FADA, the frequent causes of FPs in our laboratory and the demographic characteristics of tuberculous pleural effusions (TPEs) and non-tuberculous pleural effusions (NTPEs). Methods. High FADA results generated in the past year were extracted with corresponding TB ...

  2. Social Media: Valuable Tools in Today’s Operational Environment

    Science.gov (United States)

    2011-05-04

    Lee Odden, "Best and Worst Practices Social Media Marketing," Top Rank® Online Market Blog, entry posted 12 February 2009, http://www.toprankblog.com...1001 (accessed 30 March 2011). Odden, Lee. "Best and Worst Practices Social Media Marketing." Top Rank® Online Market Blog, entry posted 12

  3. Commentary on "Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning"

    Science.gov (United States)

    Teplovs, Chris

    2015-01-01

    This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…

  4. Life cycle assessment as an analytical tool in strategic environmental assessment. Lessons learned from a case study on municipal energy planning in Sweden

    International Nuclear Information System (INIS)

    Björklund, Anna

    2012-01-01

    Life cycle assessment (LCA) is explored as an analytical tool in strategic environmental assessment (SEA), illustrated by a case where a previously developed SEA process was applied to municipal energy planning in Sweden. The process integrated decision-making tools for scenario planning, public participation and environmental assessment. This article describes the use of LCA for environmental assessment in this context, with a focus on methodology and practical experiences. While LCA provides a systematic framework for the environmental assessment and a wider systems perspective than what is required in SEA, LCA cannot address all aspects of environmental impact required, and therefore needs to be complemented by other tools. The integration of LCA with tools for public participation and scenario planning posed certain methodological challenges, but provided an innovative approach to designing the scope of the environmental assessment and defining and assessing alternatives. - Research highlights: ► LCA was explored as an analytical tool in an SEA process for municipal energy planning. ► The process also integrated LCA with scenario planning and public participation. ► Benefits of using LCA were a systematic framework and wider systems perspective. ► Integration of tools required some methodological challenges to be solved. ► This proved an innovative approach to define alternatives and scope of assessment.

  5. Development of reliable analytical tools for evaluating the influence of reductive winemaking on the quality of Lugana wines.

    Science.gov (United States)

    Mattivi, Fulvio; Fedrizzi, Bruno; Zenato, Alberto; Tiefenthaler, Paolo; Tempesta, Silvano; Perenzoni, Daniele; Cantarella, Paolo; Simeoni, Federico; Vrhovsek, Urska

    2012-06-30

    This paper presents the development of important analytical tools, namely sensitive and rapid methods for analysing reduced and oxidised glutathione (GSH and GSSG), hydroxycinnamic acids (HCA), bound thiols (GSH-3MH and Cys-3MH) and free thiols (3MH and 3MHA), and their first application to evaluate the effect of reductive winemaking on the composition of Lugana juices and wines. Lugana is a traditional white wine from the Lake Garda region (Italy), produced using a local grape variety, Trebbiano di Lugana. An innovative winemaking procedure based on preliminary cooling of the grape berries followed by crushing in an inert environment was implemented and explored on a winery scale. The effects of these procedures on hydroxycinnamic acids, GSH, GSSG, free and bound thiols and flavanol content were investigated. The juices and wines produced using different protocols were examined, and wines aged in tanks for 1, 2 and 3 months were analysed. Particularly interesting was the high level of GSH found in Lugana grapes, which can act as a natural antioxidant and be preserved in must and young wines, thus reducing the need for exogenous antioxidants. Moreover, it was clear that polyphenol concentrations (hydroxycinnamic acids and catechins) were strongly influenced by winemaking and pressing conditions, which required fine tuning of pressing. Above-threshold levels of 3-mercaptohexan-1-ol (3MH) and 3-mercaptohexyl acetate (3MHA) were found in the wines and changed according to the winemaking procedure applied; interestingly, their evolution during the first three months also varied depending on the procedure adopted. Organic synthesis of the cysteine and glutathione conjugates was carried out, and juices and wines were subjected to LC-MS/MS analysis. These two molecules appeared to be strongly affected by the winemaking procedure, but did not show any significant change during the first 3 months of post-bottling ageing. This supports the theory …

  6. On-line near infrared spectroscopy as a Process Analytical Technology (PAT) tool to control an industrial seeded API crystallization.

    Science.gov (United States)

    Schaefer, C; Lecomte, C; Clicq, D; Merschaert, A; Norrant, E; Fotiadu, F

    2013-09-01

    The final step of an active pharmaceutical ingredient (API) manufacturing synthesis process consists of a crystallization during which the API and residual solvent contents have to be quantified precisely in order to reach a predefined seeding point. A feasibility study was conducted to demonstrate the suitability of on-line NIR spectroscopy to control this step in line with the new version of the European Medicines Agency (EMA) guideline [1]. A quantitative method was developed at laboratory scale using statistical design of experiments (DOE) and multivariate data analysis such as principal component analysis (PCA) and partial least squares (PLS) regression. NIR models were built to quantify the API in the range of 9-12% (w/w) and the residual methanol in the range of 0-3% (w/w). To improve the predictive ability of the models, the development procedure encompassed outlier elimination, optimum model rank definition, and spectral range and spectral pre-treatment selection. Conventional criteria such as the number of PLS factors, R(2), and root mean square errors of calibration, cross-validation and prediction (RMSEC, RMSECV, RMSEP) enabled the selection of three model candidates. These models were tested in the industrial pilot plant during three technical campaigns. The results of the most suitable models were evaluated against the chromatographic reference methods. A maximum relative bias of 2.88% was obtained for the API target content. Absolute biases of 0.01 and 0.02% (w/w) were achieved at methanol content levels of 0.10 and 0.13% (w/w), respectively. The repeatability was assessed as sufficient for on-line monitoring of the two analytes. The present feasibility study confirmed the possibility of using on-line NIR spectroscopy as a PAT tool to monitor in real time both the API and the residual methanol contents, in order to control the seeding of an API crystallization at industrial scale. Furthermore, the successful scale-up of the method proved its capability to be …
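
    Model candidates in NIR/PLS work like this are ranked by root-mean-square errors (RMSEC/RMSECV/RMSEP) and bias against the chromatographic reference. A generic sketch with invented reference and predicted values:

    ```python
    import math

    def rmse(actual, predicted):
        """Root mean square error between reference and model values."""
        n = len(actual)
        return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)

    def relative_bias_pct(actual, predicted):
        """Mean signed relative deviation, in percent."""
        n = len(actual)
        return 100.0 * sum((p - a) / a for a, p in zip(actual, predicted)) / n

    ref  = [10.0, 10.5, 11.0, 11.5]   # API content, % w/w, reference method
    pred = [10.1, 10.4, 11.2, 11.4]   # hypothetical NIR/PLS predictions
    print(round(rmse(ref, pred), 3))  # → 0.132
    ```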

  7. Big data analytics from strategic planning to enterprise integration with tools, techniques, NoSQL, and graph

    CERN Document Server

    Loshin, David

    2013-01-01

    Big Data Analytics will assist managers by providing an overview of the drivers for introducing big data technology into the organization and for understanding the types of business problems best suited to big data analytics solutions: understanding the value drivers and benefits, strategic planning, developing a pilot, and eventually planning to integrate back into production within the enterprise. Guides the reader in assessing the opportunities and value proposition. Overview of big data hardware and software architectures. Presents a variety of te…

  8. Real-time particle size analysis using focused beam reflectance measurement as a process analytical technology tool for a continuous granulation-drying-milling process.

    Science.gov (United States)

    Kumar, Vijay; Taylor, Michael K; Mehrotra, Amit; Stagner, William C

    2013-06-01

    Focused beam reflectance measurement (FBRM) was used as a process analytical technology tool to perform inline real-time particle size analysis of a proprietary granulation manufactured using a continuous twin-screw granulation-drying-milling process. A significant relationship between the D20, D50, and D80 length-weighted chord lengths and sieve particle size was observed (p < 0.05).
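
    The D20/D50/D80 values FBRM reports are percentiles of the chord-length distribution. A minimal percentile sketch with invented chord lengths (plain percentiles, ignoring the length-weighting FBRM applies):

    ```python
    def chord_percentile(chords, q):
        """q-th percentile (0-100) of a chord-length sample, by sorting
        and linear interpolation between closest ranks."""
        s = sorted(chords)
        pos = (len(s) - 1) * q / 100.0
        lo = int(pos)
        frac = pos - lo
        hi = min(lo + 1, len(s) - 1)
        return s[lo] + (s[hi] - s[lo]) * frac

    chords = [50, 80, 120, 160, 200, 240, 280, 320, 360, 400]  # µm, invented
    d20, d50, d80 = (chord_percentile(chords, q) for q in (20, 50, 80))
    print(round(d20, 1), round(d50, 1), round(d80, 1))  # → 112.0 220.0 328.0
    ```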

  9. Electrospray ionization and matrix assisted laser desorption/ionization mass spectrometry: powerful analytical tools in recombinant protein chemistry

    DEFF Research Database (Denmark)

    Andersen, Jens S.; Svensson, B; Roepstorff, P

    1996-01-01

    Electrospray ionization and matrix assisted laser desorption/ionization are effective ionization methods for mass spectrometry of biomolecules. Here we describe the capabilities of these methods for peptide and protein characterization in biotechnology. An integrated analytical strategy is presented encompassing protein characterization prior to and after cloning of the corresponding gene.

  10. I-HASTREAM : density-based hierarchical clustering of big data streams and its application to big graph analytics tools

    NARCIS (Netherlands)

    Hassani, M.; Spaus, P.; Cuzzocrea, A.; Seidl, T.

    2016-01-01

    Big data streams are very popular now, stirred up by a plethora of modern applications such as sensor networks, scientific computing tools, Web intelligence, social network analysis and mining tools, and so forth. Here, the main research issue consists in how to effectively and efficiently

  11. Optimization of IC/HPLC as a rapid analytical tool for characterization of total impurities in UO2

    International Nuclear Information System (INIS)

    Kelkar, A.G.; Kapoor, Y.S.; Mahanty, B.N.; Fulzele, A.K.; Mallik, G.K.

    2007-01-01

    The use of ion chromatography in the determination of metallic and non-metallic impurities has been studied and found to be very satisfactory. In the present paper, the total analysis time was monitored in all these experiments and compared with that of conventional analytical techniques. (author)

  12. Methodological framework, analytical tool and database for the assessment of climate change impacts, adaptation and vulnerability in Denmark

    DEFF Research Database (Denmark)

    Kaspersen, Per Skougaard; Halsnæs, Kirsten; Gregg, Jay Sterling

    The project is one of seven initiatives proposed by KFT for 2012. The methodology report includes definitions of major concepts, an outline of an analytical structure, a presentation of models and their applicability, and the results of case studies. The work presented in this report draws on intensive...

  13. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy processes: cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with the options differentiated by the level of process or plant detail: 1) plant level; 2) process-group level; and 3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based on comparisons with the best available reference cases established by reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011 and has been made available for free download from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as companion documentation for use with the BEST-Dairy tool. In addition, we carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to the U.S. have downloaded BEST-Dairy from the LBNL website. It is expected that use of the BEST-Dairy tool will advance understanding of energy and

  14. Switchgrass a valuable biomass crop for energy

    CERN Document Server

    2012-01-01

    The demand for renewable energies is growing steadily, both from policy and from industry seeking environmentally friendly feedstocks. The recent policies enacted by the EU, USA and other industrialized countries foresee an increased interest in the cultivation of energy crops; there is clear evidence that switchgrass is one of the most promising biomass crops for energy production and for the bio-based economy and compounds. Switchgrass: A Valuable Biomass Crop for Energy provides a comprehensive guide to switchgrass in terms of agricultural practices, potential uses and markets, and environmental and social benefits. Considering this potential energy source from its biology, breeding and crop physiology to its growth and management, and on to the economic, social and environmental impacts, Switchgrass: A Valuable Biomass Crop for Energy brings together chapters from a range of experts in the field, including a foreword from Kenneth P. Vogel, to collect and present the environmental benefits and characteristics of this a ...

  15. Vulnerability of particularly valuable areas. Summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

    This report is part of the scientific basis for the management plan for the North Sea and Skagerrak. The report focuses on the vulnerability of particularly valuable areas to petroleum activities, maritime transport, fisheries, land-based and coastal activities and long-range transboundary pollution. A working group with representatives from many different government agencies, headed by the Institute of Marine Research and the Directorate for Nature Management, has been responsible for drawing up the present report on behalf of the Expert Group for the North Sea and Skagerrak. The present report considers the 12 areas that were identified as particularly valuable during an earlier stage of the management plan process on the environment, natural resources and pollution. There are nine areas along the coast and three open sea areas in the North Sea that were identified according to the same predefined criteria as used for the management plans for the Barents Sea-Lofoten area and the Norwegian Sea. The most important criteria for particularly valuable areas are importance for biological production and importance for biodiversity. (Author)

  16. Vulnerability of particularly valuable areas. Summary

    International Nuclear Information System (INIS)

    2012-01-01

    This report is part of the scientific basis for the management plan for the North Sea and Skagerrak. The report focuses on the vulnerability of particularly valuable areas to petroleum activities, maritime transport, fisheries, land-based and coastal activities and long-range transboundary pollution. A working group with representatives from many different government agencies, headed by the Institute of Marine Research and the Directorate for Nature Management, has been responsible for drawing up the present report on behalf of the Expert Group for the North Sea and Skagerrak. The present report considers the 12 areas that were identified as particularly valuable during an earlier stage of the management plan process on the environment, natural resources and pollution. There are nine areas along the coast and three open sea areas in the North Sea that were identified according to the same predefined criteria as used for the management plans for the Barents Sea-Lofoten area and the Norwegian Sea. The most important criteria for particularly valuable areas are importance for biological production and importance for biodiversity. (Author)

  17. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

  18. Evaluation of Four Different Analytical Tools to Determine the Regional Origin of Gastrodia elata and Rehmannia glutinosa on the Basis of Metabolomics Study

    Directory of Open Access Journals (Sweden)

    Dong-Kyu Lee

    2014-05-01

    Full Text Available Chemical profiles of medicinal plants can differ depending on the cultivation environment, which may influence their therapeutic efficacy. Accordingly, the regional origin of medicinal plants should be authenticated for correct evaluation of their medicinal and market values. Metabolomics has been found very useful for discriminating the origin of many plants. Choosing an adequate analytical tool is an essential step, because each tool produces a different chemical profile with a different detection range. In this study, four analytical tools, Fourier transform near-infrared spectroscopy (FT-NIR), 1H-nuclear magnetic resonance spectroscopy (1H-NMR), liquid chromatography-mass spectrometry (LC-MS), and gas chromatography-mass spectrometry (GC-MS), were applied in parallel to the same samples of two popular medicinal plants (Gastrodia elata and Rehmannia glutinosa) cultivated either in Korea or China. The classification abilities of the four discriminant models for each plant were evaluated based on the misclassification rate and Q2 obtained from principal component analysis (PCA) and orthogonal projection to latent structures-discriminant analysis (OPLS-DA), respectively. 1H-NMR and LC-MS, which were the best techniques for G. elata and R. glutinosa, respectively, were generally preferable for origin discrimination over the others. Integrating all the results, 1H-NMR is the most prominent technique for discriminating the origins of the two plants. Nonetheless, this study suggests that preliminary screening is essential to determine the most suitable analytical tool and statistical method, which will ensure the dependability of metabolomics-based discrimination.
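The discrimination workflow described above can be sketched in miniature: build a classifier on samples of known origin and report the misclassification rate on held-out samples. The toy two-feature vectors and the nearest-centroid classifier below are stand-ins; the actual study used full spectral profiles with PCA and OPLS-DA.

```python
# Toy sketch of geographic-origin discrimination. Feature vectors stand in
# for heavily reduced spectral intensities; real metabolomics studies use
# PCA/OPLS-DA on full 1H-NMR or LC-MS profiles.
import math

def centroid(samples):
    """Per-feature mean of a list of feature vectors."""
    return [sum(col) / len(samples) for col in zip(*samples)]

# Training samples of known origin (values are made up).
korea = [[1.0, 0.2], [1.1, 0.3], [0.9, 0.25]]
china = [[0.3, 1.0], [0.4, 1.1], [0.35, 0.9]]
c_korea, c_china = centroid(korea), centroid(china)

def classify(x):
    """Assign a sample to the nearest class centroid."""
    return "Korea" if math.dist(x, c_korea) < math.dist(x, c_china) else "China"

# Held-out validation samples with known labels.
test_set = [([1.05, 0.2], "Korea"), ([0.3, 1.05], "China"), ([0.95, 0.3], "Korea")]
errors = sum(classify(x) != label for x, label in test_set)
print(f"misclassification rate: {errors / len(test_set):.2f}")
```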

  19. Analytical tools for solitons and periodic waves corresponding to phonons on Lennard-Jones lattices in helical proteins

    DEFF Research Database (Denmark)

    D'ovidio, Francesco; Bohr, Henrik; Lindgård, Per-Anker

    2005-01-01

    We study the propagation of solitons along the hydrogen bonds of an alpha helix. Modeling the hydrogen and peptide bonds with Lennard-Jones potentials, we show that the solitons can appear spontaneously and have long lifetimes. Remarkably, even if no explicit solution is known for the Lennard-Jones potential, the solitons can be characterized analytically with good quantitative agreement using formulas for a Toda potential with parameters fitted to the Lennard-Jones potential. We also discuss and show the robustness of the family of periodic solutions called cnoidal waves, corresponding to phonons...

  20. Extensions of the Johnson-Neyman Technique to Linear Models with Curvilinear Effects: Derivations and Analytical Tools

    Science.gov (United States)

    Miller, Jason W.; Stromeyer, William R.; Schwieterman, Matthew A.

    2013-01-01

    The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way…

  1. Analytical tools and methodologies for evaluation of residual life of contacting pressure tubes in the early generation of Indian PHWRs

    International Nuclear Information System (INIS)

    Sinha, S.K.; Madhusoodanan, K.; Rupani, B.B.; Sinha, R.K.

    2002-01-01

    In-service life of a contacting Zircaloy-2 pressure tube (PT) in the earlier generation of Indian PHWRs is limited mainly by accelerated hydrogen pick-up and by the nucleation and growth of hydride blister(s) at the cold spot(s) formed on the outside surface of the pressure tube as a result of its contact with the calandria tube (CT). The development of analytical models for simulating the degradation mechanisms leading to PT-CT contact, and of methodologies for the re-evaluation of safe life under such conditions, forms an important part of our extensive programme for the life management of contacting pressure tubes. Since, after PT-CT contact, the rate of hydrogen pick-up and the nucleation and growth of hydride blisters govern the safe residual life of the pressure tube, two analytical models, (a) a hydrogen pick-up model ('HYCON') and (b) a model for the nucleation and growth of a hydride blister at the contact spot ('BLIST-2D'), have been developed in-house to estimate the extent of degradation. A methodology has also been formulated for evaluating the safe residual life of the contacting channels. This paper gives a brief description of the models and methodologies relevant to contacting Zircaloy-2 pressure tubes. (author)

  2. MICROSCOPY, MICRO-CHEMISTRY AND FTIR AS ANALYTICAL TOOLS FOR IDENTIFYING TRANSPARENT FINISHES – CASE STUDIES FROM ASTRA MUSEUM – SIBIU

    Directory of Open Access Journals (Sweden)

    Maria Cristina TIMAR

    2015-12-01

    Full Text Available Conservation of cultural heritage relies on scientific investigation of artefacts, a key point being identification of the original materials. In this context, besides wood species identification, investigation of finishing layers is of utmost importance for old furniture and any other wooden objects with historic, documentary or artistic value. The present paper refers to a series of micro-destructive investigation methods applied for identification of finishing materials, namely: simple in situ and laboratory physical tests, optical microscopy, micro-chemistry and FTIR-ATR analysis. Small samples of finishing layers were taken from four furniture objects belonging to CNM ASTRA Sibiu and were analysed according to the usual procedures of the laboratories in Sibiu and Brasov. The results showed that physical tests and microscopy are useful for obtaining basic information on the samples' morphology and the possible classes of coating materials, while micro-chemistry revealed, through successive tests, more specific information on the type of finishing materials. FTIR-ATR is a rapid method of identifying coating materials based on available reference samples or spectra. However, this is not always straightforward, and preliminary physical solubility tests are useful to select adequate references, while micro-chemistry tests can complete the FTIR result, especially for those components of the finishing layer present in very small amounts (less than 5%, below the FTIR sensitivity). Corroboration of microscopy, physical and micro-chemistry tests with FTIR can provide more reliable results in terms of finish identification, as well as valuable information for restoration.

  3. The use of selective adsorbents in capillary electrophoresis-mass spectrometry for analyte preconcentration and microreactions: a powerful three-dimensional tool for multiple chemical and biological applications.

    Science.gov (United States)

    Guzman, N A; Stubbs, R J

    2001-10-01

    Much attention has recently been directed to the development and application of online sample preconcentration and microreactions in capillary electrophoresis using selective adsorbents based on chemical or biological specificity. The basic principle involves two interacting chemical or biological systems with high selectivity and affinity for each other. These molecular interactions in nature usually involve noncovalent and reversible chemical processes. Properly bound to a solid support, an "affinity ligand" can selectively adsorb a "target analyte" found in a simple or complex mixture at a wide range of concentrations. As a result, the isolated analyte is enriched and highly purified. When this affinity technique, allowing noncovalent chemical interactions and biochemical reactions to occur, is coupled on-line to high-resolution capillary electrophoresis and mass spectrometry, a powerful source of chemical and biological information is created. This paper describes the concept of biological recognition and affinity interaction on-line with high-resolution separation, the fabrication of an "analyte concentrator-microreactor", optimization of adsorption and desorption conditions, the coupling to mass spectrometry, and various applications of clinical and pharmaceutical interest.

  4. Developing an analytical tool for evaluating EMS system design changes and their impact on cardiac arrest outcomes: combining geographic information systems with register data on survival rates

    Directory of Open Access Journals (Sweden)

    Sund Björn

    2013-02-01

    Full Text Available Abstract Background: Out-of-hospital cardiac arrest (OHCA) is a frequent and acute medical condition that requires immediate care. We estimate survival rates from OHCA in the area of Stockholm by developing an analytical tool for evaluating Emergency Medical Services (EMS) system design changes. The study is also an attempt to validate the proposed model used to generate the outcome measures. Methods and results: This was done by combining a geographic information systems (GIS) simulation of driving times with register data on survival rates. The emergency resources comprised ambulance alone and ambulance plus fire services. The simulation model predicted a baseline survival rate of 3.9 per cent; reducing the ambulance response time by one minute increased survival to 4.6 per cent. Adding the fire services as first responders (dual dispatch) increased survival to 6.2 per cent from the baseline level. The model predictions were validated using empirical data. Conclusion: We have presented an analytical tool that can easily be generalized to other regions or countries. The model can be used to predict outcomes of cardiac arrest prior to investment in EMS design changes that affect the alarm process, e.g. (1) static changes such as trimming the emergency call handling time, or (2) dynamic changes such as the location of emergency resources or which resources should carry a defibrillator.
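The relationship between response time and survival that such a tool exploits is often modeled as an exponential decay in survival probability per minute of delay. The functional form, baseline, and decay rate below are textbook-style assumptions for illustration, not the model actually fitted in this study.

```python
# Illustrative sketch of how EMS response time maps to OHCA survival.
# The exponential form and the ~12%/min decay rate are common
# order-of-magnitude assumptions, not the study's fitted regression.
import math

def survival_prob(response_min, baseline=0.25, decay=0.12):
    """Survival probability as a function of EMS response time (minutes)."""
    return baseline * math.exp(-decay * response_min)

base = survival_prob(10)    # assumed 10-minute baseline response time
faster = survival_prob(9)   # effect of shaving one minute off response
print(f"baseline {base:.3f} -> one minute faster {faster:.3f}")
```

Under a model of this shape, every minute saved yields a multiplicative gain in survival, which is why both trimming call handling time and relocating resources show up as levers in the abstract.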

  5. Leningrad NPP full scope and analytical simulators as tools for MMI improvement and operator support systems development and testing

    International Nuclear Information System (INIS)

    Rakitin, I.D.; Malkin, S.D.; Shalia, V.V.; Fedorov, E.M.; Lebedev, N.N.; Khoudiakov, M.M.

    1999-01-01

    Training Support Center (TSC) created at the Leningrad NPP (LNPP), Sosnovy Bor, Russia, incorporates full-scope and analytical simulators working in parallel with prototypes of expert and interactive systems, providing a new scope of R and D work on MMI improvement for the developer as well as for the user. The paper describes the possibilities for developing, adjusting and testing any new or upgraded Operators' Support System before its installation in the reference unit's Control Room. The simulators model a wide range of accidents and transients and provide, through special software and ETHERNET, data communications with the prototypes of the Operators' Support Systems. As an example, the paper describes the development and adjustment, using the simulators, of two state-of-the-art Operators' Support Systems. These systems have been developed jointly by RRC KI and the LNPP team. (author)

  6. Single-cell MALDI-MS as an analytical tool for studying intrapopulation metabolic heterogeneity of unicellular organisms.

    Science.gov (United States)

    Amantonico, Andrea; Urban, Pawel L; Fagerer, Stephan R; Balabin, Roman M; Zenobi, Renato

    2010-09-01

    Heterogeneity is a characteristic feature of all populations of living organisms. Here we make an attempt to validate a single-cell mass spectrometric method for detection of changes in metabolite levels occurring in populations of unicellular organisms. Selected metabolites involved in central metabolism (ADP, ATP, GTP, and UDP-Glucose) could readily be detected in single cells of Closterium acerosum by means of negative-mode matrix-assisted laser desorption/ionization (MALDI) mass spectrometry (MS). The analytical capabilities of this approach were characterized using standard compounds. The method was then used to study populations of individual cells with different levels of the chosen metabolites. With principal component analysis and support vector machine algorithms, it was possible to achieve a clear separation of individual C. acerosum cells in different metabolic states. This study demonstrates the suitability of mass spectrometric analysis of metabolites in single cells to measure cell-population heterogeneity.
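One way to summarize per-cell nucleotide measurements like those above is the adenylate energy charge, a standard metric of metabolic state. The toy intensities and threshold below are illustrative; the study itself separated cells using PCA and support vector machines on full MALDI-MS spectra.

```python
# Sketch: assigning single cells to metabolic states from measured
# nucleotide levels. The adenylate energy charge is a standard summary
# statistic; the intensities and the 0.7 threshold are illustrative.

def energy_charge(atp, adp, amp):
    """Adenylate energy charge: (ATP + 0.5*ADP) / (ATP + ADP + AMP)."""
    return (atp + 0.5 * adp) / (atp + adp + amp)

cells = {
    "cell_1": (8.0, 1.5, 0.5),   # (ATP, ADP, AMP) relative intensities
    "cell_2": (2.0, 3.0, 3.0),
    "cell_3": (7.5, 2.0, 0.5),
}

for name, (atp, adp, amp) in cells.items():
    ec = energy_charge(atp, adp, amp)
    state = "high-energy" if ec > 0.7 else "low-energy"
    print(name, round(ec, 2), state)
```

Population heterogeneity then shows up as the spread of this (or any similar) per-cell statistic rather than as a single bulk average.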

  7. Embodying resistance : a discourse analytical study of the selfie as political tool within the fourth wave of feminism

    OpenAIRE

    Barbala, Astri Moksnes

    2017-01-01

    This Master's thesis explores whether the selfie can be utilised as a political tool to challenge the stereotypical ideas of femininity and female beauty that currently dominate the visual social media landscape. Focusing on the photo-sharing application Instagram, the emphasis is on how the selfie can position the portrayed subject's body as a site of resistance. By publishing images depicting their non-normative physical appearances, social media-participating feminists ar...

  8. Methodological framework, analytical tool and database for the assessment of climate change impacts, adaptation and vulnerability in Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Skougaard Kaspersen, P.; Halsnaes, K.; Gregg, J.; Drews, M.

    2012-12-15

    In this report we provide recommendations on how more consistent studies and data can be provided, based on available modelling tools and data, for integrated assessment of climate change risks and adaptation options. It is concluded that integrated assessments in this area require the use of a wide range of data and models in order to cover the full chain of elements, including climate modelling, impacts, risks, costs, social issues, and decision making. As an outcome of this activity, a comprehensive data and modelling tool named the Danish Integrated Assessment System (DIAS) has been developed, which may be used by researchers within the field. DIAS has been implemented and tested in a case study on urban flooding caused by extreme precipitation in Aarhus, and this study highlights the usefulness of integrating data, models, and methods from several disciplines into a common framework. DIAS is an attempt to describe such a framework with regard to integrated analysis of climate impacts and adaptation. The final product of the DTU KFT project ''Tool for Vulnerability analysis'' is not a user-friendly climate adaptation tool ready to be used directly by decision makers and consultants for various types of analysis on their own. Rather, the developed methodology and the collected/available data can serve as a starting point for case-specific analyses. For this reason alone, this work should very much be viewed as an attempt to coordinate research, data and model outputs between different research institutes from various disciplines. It is unquestionable that there is a future need to integrate information for areas not yet included, and it is very likely that such efforts will depend on research projects conducted in different climate change adaptation areas and sectors in Denmark. (Author)

  9. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    Energy Technology Data Exchange (ETDEWEB)

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  10. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  11. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase

  12. Polar Bears or People?: How Framing Can Provide a Useful Analytic Tool to Understand & Improve Climate Change Communication in Classrooms

    Science.gov (United States)

    Busch, K. C.

    2014-12-01

    Not only will young adults bear the brunt of climate change's effects, they are also the ones who will be required to take action - to mitigate and to adapt. The Next Generation Science Standards include climate change, ensuring the topic will be covered in U.S. science classrooms in the near future. Additionally, school is a primary source of information about climate change for young adults. The larger question, though, is how the teaching of climate change can be done in such a way as to ascribe agency - a willingness to act - to students. Framing - as both a theory and an analytic method - has been used to understand how language in the media can affect the audience's intention to act. Frames function as a two-way filter, affecting both the message sent and the message received. This study adapted both the theory and the analytic methods of framing, applying them to teachers in the classroom to answer the research question: How do teachers frame climate change in the classroom? To answer this question, twenty-five lessons from seven teachers were analyzed using semiotic discourse analysis methods. It was found that the teachers' frames overlapped to form two distinct discourses: a Science Discourse and a Social Discourse. The Science Discourse, which was dominant, can be summarized as: Climate change is a current scientific problem that will have profound global effects on the Earth's physical systems. The Social Discourse, used much less often, can be summarized as: Climate change is a future social issue because it will have negative impacts at the local level on people. While it may not be surprising that the Science Discourse was most often heard in these science classrooms, it is potentially problematic if it is the only discourse used. The research literature on framing indicates that the frames found in the Science Discourse - global scale, scientific statistics and facts, and impact on the Earth's systems - are not likely to inspire action-taking.

  13. Recovery and utilization of valuable metals from spent nuclear fuel. 3: Mutual separation of valuable metals

    International Nuclear Information System (INIS)

    Kirishima, K.; Shibayama, H.; Nakahira, H.; Shimauchi, H.; Myochin, M.; Wada, Y.; Kawase, K.; Kishimoto, Y.

    1993-01-01

    In the project ''Recovery and Utilization of Valuable Metals from Spent Fuel'', a mutual separation process for valuable metals recovered from spent fuel has been studied using a simulated solution containing Pb, Ru, Rh, Pd and Mo. Pd was separated successfully by the DHS (di-hexyl sulfide) solvent extraction method, while Pb was recovered selectively from the raffinate by neutralization precipitation of the other elements. Rh was roughly separated by washing the precipitate with an alkaline solution and then refined with the chelate resin CS-346. An outline of the mutual separation process flow sheet has been established from the combination of these techniques. The experimental results and the process flow sheet for the mutual separation of valuable metals are presented in this paper.

  14. NIR spectroscopy as a process analytical technology (PAT) tool for monitoring and understanding of a hydrolysis process.

    Science.gov (United States)

    Wu, Zhisheng; Peng, Yanfang; Chen, Wei; Xu, Bing; Ma, Qun; Shi, Xinyuan; Qiao, Yanjiang

    2013-06-01

    The use of near-infrared spectroscopy was investigated as a process analytical technology to monitor the amino acid concentration profile during the hydrolysis of Cornu Bubali. A protocol was followed that included outlier selection using a plot of residuals versus leverage, and calibration models built using interval partial least squares and synergy interval partial least squares (SiPLS). A strategy based on four robust root mean square error of prediction (RMSEP) values was developed to assess the calibration models by means of a desirability index. Furthermore, multivariate quantification limit (MQL) values of the optimum model were determined using two types of error. The SiPLS(3) models for L-proline, L-tyrosine, L-valine, L-phenylalanine and L-lysine provided excellent accuracies, with RMSEP values of 0.0915 mg/mL, 0.1605 mg/mL, 0.0515 mg/mL, 0.0586 mg/mL and 0.0613 mg/mL, respectively. The MQL ranged from 90 ppm to 810 ppm, which confirmed that these models are suitable for most applications. Copyright © 2013 Elsevier Ltd. All rights reserved.
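The RMSEP figure of merit used to rank these calibration models is straightforward to compute over a validation set. The reference and predicted concentrations below are made up for illustration; in the study they come from the interval PLS models built on the NIR spectra.

```python
# Sketch: root mean square error of prediction (RMSEP), the figure of
# merit used to compare NIR calibration models. Values are illustrative.
import math

def rmsep(predicted, reference):
    """Root mean square error of prediction over a validation set."""
    n = len(reference)
    return math.sqrt(sum((p - r) ** 2 for p, r in zip(predicted, reference)) / n)

reference = [0.50, 0.62, 0.71, 0.55]   # mg/mL, assumed validation values
predicted = [0.52, 0.60, 0.75, 0.54]   # mg/mL, assumed model predictions
print(f"RMSEP = {rmsep(predicted, reference):.4f} mg/mL")  # RMSEP = 0.0250 mg/mL
```

Lower RMSEP on independent validation samples is what distinguishes, for example, the 0.0515 mg/mL L-valine model from the 0.1605 mg/mL L-tyrosine model in the abstract.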

  15. Analytical solution of concentric two-pole Halbach cylinders as a preliminary design tool for magnetic refrigeration systems

    Science.gov (United States)

    Fortkamp, F. P.; Lozano, J. A.; Barbosa, J. R.

    2017-12-01

    This work presents a parametric analysis of the performance of nested permanent-magnet Halbach cylinders intended for applications in magnetic refrigeration and heat pumping. An analytical model for the magnetic field generated by the cylinders is used to systematically investigate the influence of their geometric parameters. The proposed configuration generates two poles in the air gap between the cylinders, where active magnetic regenerators are positioned for conversion of magnetic work into cooling capacity or heating power. A sample geometry based on previous designs of magnetic refrigerators is investigated, and the results show that the magnetic field in the air gap oscillates between 0 and approximately 1 T, forming a rectified cosine profile along the circumference of the gap. Calculations of the energy density of the magnets indicate the need to operate parts of the magnet (particularly the inner cylinder) at low energy in order to generate a magnetic profile suitable for a magnetic cooler. In practice, these low-energy regions of the magnet could potentially be replaced by soft ferromagnetic material. A parametric analysis of the air gap height shows that there are optimal values which maximize the magnet efficiency parameter Λcool. Some combinations of cylinder radii resulted in magnetic field changes that were too small for practical purposes. No demagnetization of the cylinders was found for the range of parameters considered.
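The kind of analytical field model referred to above can be sketched for the ideal case: an ideal (k = 2) Halbach dipole cylinder produces a uniform internal field B = B_rem ln(R_out / R_in), and sampling a two-pole configuration around the gap circumference yields the rectified-cosine profile the abstract describes. The remanence and radii below are illustrative, not the paper's geometry, and the profile function is a simplified stand-in for the full nested-cylinder model.

```python
# Sketch of the gap-field profile described in the abstract. An ideal
# Halbach dipole cylinder gives B = B_rem * ln(R_out / R_in); a two-pole
# configuration sampled around the gap circumference resembles a rectified
# cosine. Remanence and radii are illustrative assumptions.
import math

def halbach_dipole_field(b_rem, r_out, r_in):
    """Flux density inside an ideal Halbach dipole cylinder (tesla)."""
    return b_rem * math.log(r_out / r_in)

b_max = halbach_dipole_field(1.3, 0.10, 0.046)  # ~1 T for these assumed radii

def gap_field(theta, poles=2):
    """Simplified two-pole rectified-cosine profile along the gap (tesla)."""
    return b_max * abs(math.cos(poles * theta / 2))

samples = [round(gap_field(math.radians(d)), 2) for d in (0, 45, 90, 135, 180)]
print(samples)  # peaks at 0 and 180 degrees, zero at 90 degrees
```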

  16. Recovering valuable metals from recycled photovoltaic modules.

    Science.gov (United States)

    Yi, Youn Kyu; Kim, Hyun Soo; Tran, Tam; Hong, Sung Kil; Kim, Myong Jun

    2014-07-01

    Recovering valuable metals such as Si, Ag, Cu, and Al has become a pressing issue as end-of-life photovoltaic modules need to be recycled in the near future to meet legislative requirements in most countries. Of major interest is the recovery and recycling of high-purity silicon (> 99.9%) for the production of wafers and semiconductors. The value of Si in crystalline-type photovoltaic modules is estimated to be ~$95/kW at the 2012 metal price. At the current installed capacity of 30 GW/yr, the metal value in the PV modules represents valuable resources that should be recovered in the future. The recycling of end-of-life photovoltaic modules would supply > 88,000 and 207,000 tpa Si by 2040 and 2050, respectively. This represents more than 50% of the required Si for module fabrication. Experimental testwork on crystalline Si modules could recover a > 99.98%-grade Si product by HNO3/NaOH leaching to remove Al, Ag, Ti and other metal ions from the doped Si. A further pyrometallurgical smelting at 1520 °C using a CaO-CaF2-SiO2 slag mixture to scavenge the residual metals after acid leaching could finally produce > 99.998%-grade Si. A process based on HNO3/NaOH leaching and subsequent smelting is proposed for recycling Si from rejected or recycled photovoltaic modules. Implications: The photovoltaic industry is considering options for recycling PV modules to recover metals such as Si, Ag, Cu, Al, and others used in the manufacturing of the PV cells. This is to retain its "green" image and to comply with current legislation in several countries. An evaluation of potential resources made available from PV wastes and the technologies used for processing these materials is therefore of significant importance to the industry. Of interest are the costs of processing and the potential revenues gained from recycling, which should determine the viability of economic recycling of PV modules in the future.

  17. Temperature-controlled micro-TLC: a versatile green chemistry and fast analytical tool for separation and preliminary screening of steroids fraction from biological and environmental samples.

    Science.gov (United States)

    Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J

    2011-11-01

    This paper is a continuation of our previous research focusing on the development of micro-TLC methodology under temperature-controlled conditions. The main goal of the present paper is to demonstrate the separation and detection capability of the micro-TLC technique using simple analytical protocols without multi-step sample pre-purification. One advantage of planar chromatography over its column counterpart is that each TLC run can be performed on a fresh, previously unused stationary phase. Therefore, it is possible to fractionate or separate complex samples characterized by heavy biological matrix loading. In the present studies, components of interest, mainly steroids, were isolated from biological samples such as fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by a freeze-drying method. Low-molecular-mass compounds with polarity ranging from estetrol to progesterone derived from environmental samples (lake water, untreated and treated sewage waters) were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from surface water of the Middle Pomerania region in northern Poland can easily be observed on the obtained micro-TLC chromatograms. This approach can be useful as a simple and inexpensive complementary method for fast control and screening of treated sewage water discharged by municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimization of SPE or HPLC systems working under RP conditions. Furthermore, we also demonstrated that a micro-TLC-based analytical approach can be applied as an effective method for the search for internal standard (IS) substances. Generally, the described methodology can be applied for fast fractionation or screening of the

  18. [Psychopathology and film: a valuable interaction?].

    Science.gov (United States)

    van Duppen, Z; Summa, M; Fuchs, T

    2015-01-01

    Film or film fragments are often used in psychopathology education. However, so far very few articles have discussed the benefits and limitations of using films to explain or illustrate psychopathology. Although numerous films involve psychopathology in varying degrees, it is not clear how we can use films for psychopathology education. We aim to examine the advantages, limitations and possible methods of using film as a means of increasing our knowledge and understanding of psychiatric illnesses. We discuss five examples that illustrate the interaction of film and psychopathology. On the one hand we explain how psychopathological concepts are used in each film, and on the other hand we explain which aspects of each film are valuable aids for teaching psychopathology. The use of film makes it possible to introduce the following topics into a psychopathology teaching programme: holistic psychiatric reasoning, phenomenology and subjective experience, the recognition of psychopathological prototypes, and the importance of context. There is undoubtedly an analogy between the method we have chosen for teaching psychopathology with the help of films and the holistic approach of the psychiatrist and his or her team. We believe psychopathology education can benefit from films and we would recommend our colleagues to use them in this way.

  19. PICKLED PUMPKIN IS VALUABLE FOOD PRODUCT

    Directory of Open Access Journals (Sweden)

    T. A. Sannikova

    2017-01-01

    Full Text Available One of the main directions of food industry development is the production of functional food products. Changes in the structure of the human diet mean that no population group receives the necessary amount of vitamins, macro- and microelements from a routine healthy diet. To solve this problem, foodstuffs enriched with different ingredients can improve biological and nutritional value. The pumpkin is a valuable source of such important substances as carotene and pectin. The addition of garlic and hot pepper during pumpkin pickling enriches the product with carbohydrates, proteins and microelements, which are low or absent in the pumpkin fruit. Therefore, the study of the influence of different quantities of garlic and hot pepper additions on the chemical composition of the finished product is very important. The influence of the plant additions used on the chemical composition of the finished product was determined. It was shown that with increased doses of garlic and hot pepper, compared with the control, the carotene and dry matter content of pickled pumpkin decreased by 1.16%-3.43%, while the pectin content depended on the added component. The highest pectin content, 0.71%, was observed with the addition of 10 g of garlic per 1 kg of raw material, which was 4.1 times higher than the control. With increased addition of hot pepper, pectin accumulation decreased from 0.58% in the control to 0.36% in the 10 g per 1 kg variant.

  20. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    Science.gov (United States)

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, and can be saved as HTML, PDF, or other file formats for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system to investigate gut-brain communication, for example in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. Copyright © 2016 the American Physiological Society.
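
    As a flavour of the spike-train analyses such notebooks typically contain, here is a self-contained sketch of amplitude-threshold spike detection on synthetic data. The parameters (sampling rate, threshold multiplier, spike shape) are illustrative assumptions, not the authors' code or the musk shrew recordings:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic extracellular trace: unit-variance noise plus injected "spikes"
fs = 20_000                     # sampling rate in Hz (illustrative)
trace = rng.normal(0, 1.0, fs)  # 1 s of noise
spike_times = [2000, 7000, 15000]
for t in spike_times:
    trace[t:t + 20] += -8.0 * np.exp(-np.arange(20) / 5.0)  # negative-going spike

# Common amplitude criterion: threshold at a multiple of an MAD-based noise estimate
sigma = np.median(np.abs(trace)) / 0.6745
threshold = -4.5 * sigma

# Detect downward threshold crossings, then enforce a 1 ms refractory gap
crossings = np.flatnonzero((trace[1:] < threshold) & (trace[:-1] >= threshold)) + 1
detected = [int(crossings[0])] if len(crossings) else []
for c in crossings[1:]:
    if c - detected[-1] > fs // 1000:
        detected.append(int(c))

print(detected)  # detections near the injected spike times
```

    A real pipeline would follow detection with waveform extraction, clustering (spike sorting), and the statistical validation the abstract mentions.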

  1. Encyclopedia of analytical surfaces

    CERN Document Server

    Krivoshapko, S N

    2015-01-01

    This encyclopedia presents an all-embracing collection of analytical surface classes. It provides concise definitions and descriptions for more than 500 surfaces and categorizes them in 38 classes of analytical surfaces. All classes are cross-referenced to the original literature in an excellent bibliography. The encyclopedia is of particular interest to structural and civil engineers and serves as a valuable reference for mathematicians.

  2. How valuable is word of mouth?

    Science.gov (United States)

    Kumar, V; Petersen, J Andrew; Leone, Robert P

    2007-10-01

    The customers who buy the most from you are probably not your best marketers. What's more, your best marketers may be worth far more to your company than your most enthusiastic consumers. Those are the conclusions of professors Kumar and Petersen at the University of Connecticut and professor Leone at Ohio State University, who analyzed thousands of customers in research focused on a telecommunications company and a financial services firm. In this article, the authors present a straightforward tool that can be used to calculate both customer lifetime value (CLV), the worth of your customers' purchases, and customer referral value (CRV), the value of their referrals. Knowing both enables you to segment your customers into four constituent parts: those that buy a lot but are poor marketers (which they term Affluents); those that don't buy much but are very strong salespeople for your firm (Advocates); those that do both well (Champions); and those that do neither well (Misers). In a series of one-year experiments, the authors demonstrated the effectiveness of this segmentation approach. Offering purchasing incentives to Advocates, referral incentives to Affluents, and both to Misers, they were able to move significant proportions of all three into the Champions category. Both companies reaped returns on their marketing investments greater than 12-fold, more than double the normal marketing ROI for their industries. The power of this tool is its ability to help marketers decide where to focus their efforts. Rather than waste funds encouraging big spenders to spend slightly more while overlooking the power of customer evangelists who don't buy enough to seem important, you can reap much higher rewards by nudging big spenders to make referrals and urging enthusiastic proponents of your wares to buy a bit more.
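
    The four-quadrant segmentation described above can be sketched in a few lines. The customer figures and the median cut-offs below are illustrative assumptions, not the article's data or its exact CLV/CRV formulas:

```python
from statistics import median

# Hypothetical customers with a purchase-value proxy (CLV) and a
# referral-value proxy (CRV); all figures are illustrative.
customers = {
    "ann":   {"clv": 1200, "crv": 150},
    "ben":   {"clv": 150,  "crv": 900},
    "carol": {"clv": 1100, "crv": 800},
    "dave":  {"clv": 100,  "crv": 120},
}

# Split each dimension at its median to form the four quadrants
clv_cut = median(c["clv"] for c in customers.values())
crv_cut = median(c["crv"] for c in customers.values())

def segment(c):
    high_clv, high_crv = c["clv"] >= clv_cut, c["crv"] >= crv_cut
    if high_clv and high_crv:
        return "Champion"   # buys a lot and refers a lot
    if high_clv:
        return "Affluent"   # buys a lot, weak referrer
    if high_crv:
        return "Advocate"   # strong referrer, buys little
    return "Miser"

segments = {name: segment(c) for name, c in customers.items()}
print(segments)
```

    The marketing actions then follow the quadrant: referral incentives for Affluents, purchase incentives for Advocates, both for Misers.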

  3. Direct analysis in real time mass spectrometry, a process analytical technology tool for real-time process monitoring in botanical drug manufacturing.

    Science.gov (United States)

    Wang, Lu; Zeng, Shanshan; Chen, Teng; Qu, Haibin

    2014-03-01

    A promising process analytical technology (PAT) tool has been introduced for batch process monitoring. Direct analysis in real time mass spectrometry (DART-MS), a means of rapid fingerprint analysis, was applied to a percolation process with multi-constituent substances for an anti-cancer botanical preparation. Fifteen batches were carried out, including ten normal operations and five abnormal batches with artificial variations. The obtained multivariate data were analyzed by a multi-way partial least squares (MPLS) model. Control trajectories were derived from eight normal batches, and the qualification was tested by R² and Q². Accuracy and diagnosis capability of the batch model were then validated by the remaining batches. Assisted by high performance liquid chromatography (HPLC) determination, process faults were explained by the corresponding variable contributions. Furthermore, a batch-level model was developed to compare and assess the model performance. The present study has demonstrated that DART-MS is very promising for process monitoring in botanical manufacturing. Compared with general PAT tools, DART-MS offers a particular account of effective compositions and can potentially be used to improve batch quality and process consistency of samples in complex matrices. Copyright © 2014 Elsevier B.V. All rights reserved.
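
    The control-trajectory idea behind such batch models can be illustrated with a simpler PCA-based residual (Q) chart on unfolded batch fingerprints. This is a sketch on synthetic data under stated assumptions, not the authors' MPLS/DART-MS model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-in for batch fingerprint data: 8 normal batches, each
# unfolded to one row (time points x variables flattened).
n_time, n_vars = 20, 5
base = rng.normal(0, 1, n_time * n_vars)
normal = base + rng.normal(0, 0.05, (8, n_time * n_vars))  # batch-to-batch noise
faulty = base + rng.normal(0, 0.05, n_time * n_vars)
faulty[30:40] += 1.5                                       # artificial process deviation

# PCA model of normal operation (mean-centred, 2 components via SVD)
mu = normal.mean(axis=0)
Xc = normal - mu
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:2].T  # loadings

def q_statistic(x):
    """Squared prediction error of x against the PCA model of normal batches."""
    r = (x - mu) - P @ (P.T @ (x - mu))
    return float(r @ r)

q_normal = [q_statistic(x) for x in normal]
limit = 3 * max(q_normal)           # crude empirical control limit
print(q_statistic(faulty) > limit)  # the deviating batch exceeds the limit
```

    An MPLS model additionally uses the known batch maturity/quality variables, but the monitoring logic (project a new batch, flag large residuals) is the same.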

  4. Health Heritage© a web-based tool for the collection and assessment of family health history: initial user experience and analytic validity.

    Science.gov (United States)

    Cohn, W F; Ropka, M E; Pelletier, S L; Barrett, J R; Kinzie, M B; Harrison, M B; Liu, Z; Miesfeldt, S; Tucker, A L; Worrall, B B; Gibson, J; Mullins, I M; Elward, K S; Franko, J; Guterbock, T M; Knaus, W A

    2010-01-01

    A detailed family health history is currently the most potentially useful tool for diagnosis and risk assessment in clinical genetics. We developed and evaluated the usability and analytic validity of a patient-driven web-based family health history collection and analysis tool. Health Heritage© guides users through the collection of their family health history by relative, generates a pedigree, and completes risk assessment, stratification, and recommendations for 89 conditions. We compared the performance of Health Heritage to that of Usual Care using a nonrandomized cohort trial of 109 volunteers. We contrasted the completeness and sensitivity of family health history collection and risk assessments derived from Health Heritage and Usual Care to those obtained by genetic counselors and genetic assessment teams. Nearly half (42%) of the Health Heritage participants reported discovery of health risks; 63% found the information easy to understand and 56% indicated it would change their health behavior. Health Heritage consistently outperformed Usual Care in the completeness and accuracy of family health history collection, identifying 60% of the elevated risk conditions specified by the genetic team versus 24% identified by Usual Care. Health Heritage also had greater sensitivity than Usual Care when comparing the identification of risks. These results suggest a strong role for automated family health history collection and risk assessment and underscore the potential of these data to serve as the foundation for comprehensive, cost-effective personalized genomic medicine. Copyright © 2010 S. Karger AG, Basel.

  5. CorRECTreatment: a web-based decision support tool for rectal cancer treatment that uses the analytic hierarchy process and decision tree.

    Science.gov (United States)

    Suner, A; Karakülah, G; Dicle, O; Sökmen, S; Çelikoğlu, C C

    2015-01-01

    The selection of appropriate rectal cancer treatment is a complex multi-criteria decision making process, in which clinical decision support systems might be used to assist and enrich physicians' decision making. The objective of the study was to develop a web-based clinical decision support tool for physicians in the selection of potentially beneficial treatment options for patients with rectal cancer. The decision support model, developed in our previous study by combining the Analytic Hierarchy Process (AHP) method, which determines the priority of criteria, with a decision tree formed using these priorities, was updated and applied to data from 388 patients collected retrospectively. The updated decision model contained 8 and 10 criteria in the first and second steps, respectively. A web-based decision support tool named corRECTreatment was then developed, and the compatibility of the treatment recommendations by expert opinion and by the decision support tool was examined for consistency. Two surgeons were requested to recommend a treatment and an overall survival value for the treatment for 20 different cases that we selected and turned into scenarios among the most common and rare treatment options in the patient data set. In the AHP analyses of the criteria, it was found that the matrices generated for both decision steps were consistent (consistency ratio < 0.1). When the recommendations of the tool were compared with the decisions of the experts, the consistency value for the most frequent cases was found to be 80% for the first decision step and 100% for the second decision step. Similarly, for rare cases consistency was 50% for the first decision step and 80% for the second decision step. The decision model and corRECTreatment, developed by applying these methods to real patient data, are expected to provide potential users with decision support in rectal cancer treatment processes and to facilitate projections about treatment options.
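
    The AHP step referred to above, deriving criterion priorities from a pairwise comparison matrix and checking the consistency ratio, can be sketched as follows. The judgements below are hypothetical, not the study's criteria; the usual rule of thumb is to accept matrices with CR < 0.1:

```python
import numpy as np

# Hypothetical 3x3 pairwise comparison matrix: criterion A is judged
# 3x as important as B and 5x as important as C; B is 2x C.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priorities: principal eigenvector, normalised to sum to 1
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()

# Consistency index and ratio (RI = 0.58 is Saaty's random index for n = 3)
n = A.shape[0]
lam_max = eigvals.real[k]
ci = (lam_max - n) / (n - 1)
cr = ci / 0.58

print(np.round(w, 3), round(cr, 4))
```

    The resulting weights would then feed the decision tree that ranks treatment options.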

  6. Physical-based analytical model of flexible a-IGZO TFTs accounting for both charge injection and transport

    NARCIS (Netherlands)

    Ghittorelli, M.; Torricelli, F.; Steen, J.L. van der; Garripoli, C.; Tripathi, A.; Gelinck, G.H.; Cantatore, E.; Kovacs-Vajna, Z.M.

    2016-01-01

    Here we show a new physical-based analytical model of a-IGZO TFTs. TFTs scaling from L=200 μm to L=15 μm and fabricated on plastic foil are accurately reproduced with a unique set of parameters. The model is used to design a zero-VGS inverter. It is a valuable tool for circuit design and technology

  7. Physical-based analytical model of flexible a-IGZO TFTs accounting for both charge injection and transport

    NARCIS (Netherlands)

    Ghittorelli, M.; Torricelli, F.; van der Steen, J.-L.; Garripoli, C.; Tripathi, A.K.; Gelinck, G.; Cantatore, E.; Kovács-Vajna, Z.M.

    2015-01-01

    Here we show a new physical-based analytical model of a-IGZO TFTs. TFTs scaling from L=200 μm to L=15 μm and fabricated on plastic foil are accurately reproduced with a unique set of parameters. The model is used to design a zero- VGS inverter. It is a valuable tool for circuit design and technology

  8. Analytical tools in accelerator physics

    Energy Technology Data Exchange (ETDEWEB)

    Litvinenko, V.N.

    2010-09-01

    This paper is a sub-set of my lectures presented in the Accelerator Physics course (USPAS, Santa Rosa, California, January 14-25, 2008). It is based on notes I wrote during the period from 1976 to 1979 in Novosibirsk. Only a few copies (in Russian) were distributed to my colleagues at the Novosibirsk Institute of Nuclear Physics. The goal of these notes is a complete description starting from an arbitrary reference orbit, with explicit expressions for the 4-potential and the accelerator Hamiltonian, and finishing with the parameterization in action and angle variables. To a large degree the notes follow the logic developed in Theory of Cyclic Particle Accelerators by A.A. Kolomensky and A.N. Lebedev [Kolomensky], but go beyond the book in a number of directions. One unusual feature is the use of matrix functions and the Sylvester formula for calculating matrices of arbitrary elements. Teaching the USPAS course motivated me to translate a significant part of my notes into English. I also included some introductory material following Classical Theory of Fields by L.D. Landau and E.M. Lifshitz [Landau]. A large number of short notes covering various techniques are placed in the Appendices.

  9. Analytic tools for Feynman integrals

    Energy Technology Data Exchange (ETDEWEB)

    Smirnov, Vladimir A. [Moscow State Univ. (Russian Federation). Skobeltsyn Inst. of Nuclear Physics

    2012-07-01

    The most powerful methods of evaluating Feynman integrals are presented, and the reader will be able to apply them in practice. The book contains numerous examples. The goal of this book is to describe the most powerful methods for evaluating multiloop Feynman integrals that are currently used in practice. This book supersedes the author's previous Springer book ''Evaluating Feynman Integrals'' and its textbook version ''Feynman Integral Calculus.'' Since the publication of these two books, powerful new methods have arisen and conventional methods have been improved on in essential ways. A further qualitative change is the fact that most of the methods and the corresponding algorithms have now been implemented in computer codes which are often public. In comparison to the two previous books, three new chapters have been added: One is on sector decomposition, while the second describes a new method by Lee. The third new chapter concerns the asymptotic expansions of Feynman integrals in momenta and masses, which were described in detail in another Springer book, ''Applied Asymptotic Expansions in Momenta and Masses,'' by the author. This chapter describes, on the basis of papers that appeared after the publication of said book, how to algorithmically discover the regions relevant to a given limit within the strategy of expansion by regions. In addition, the chapters on the method of Mellin-Barnes representation and on the method of integration by parts have been substantially rewritten, with an emphasis on the corresponding algorithms and computer codes.

  10. Analytic tools for Feynman integrals

    International Nuclear Information System (INIS)

    Smirnov, Vladimir A.

    2012-01-01

    The most powerful methods of evaluating Feynman integrals are presented, and the reader will be able to apply them in practice. The book contains numerous examples. The goal of this book is to describe the most powerful methods for evaluating multiloop Feynman integrals that are currently used in practice. This book supersedes the author's previous Springer book ''Evaluating Feynman Integrals'' and its textbook version ''Feynman Integral Calculus.'' Since the publication of these two books, powerful new methods have arisen and conventional methods have been improved on in essential ways. A further qualitative change is the fact that most of the methods and the corresponding algorithms have now been implemented in computer codes which are often public. In comparison to the two previous books, three new chapters have been added: One is on sector decomposition, while the second describes a new method by Lee. The third new chapter concerns the asymptotic expansions of Feynman integrals in momenta and masses, which were described in detail in another Springer book, ''Applied Asymptotic Expansions in Momenta and Masses,'' by the author. This chapter describes, on the basis of papers that appeared after the publication of said book, how to algorithmically discover the regions relevant to a given limit within the strategy of expansion by regions. In addition, the chapters on the method of Mellin-Barnes representation and on the method of integration by parts have been substantially rewritten, with an emphasis on the corresponding algorithms and computer codes.

  11. Analytical tools in accelerator physics

    International Nuclear Information System (INIS)

    Litvinenko, V.N.

    2010-01-01

    This paper is a sub-set of my lectures presented in the Accelerator Physics course (USPAS, Santa Rosa, California, January 14-25, 2008). It is based on notes I wrote during the period from 1976 to 1979 in Novosibirsk. Only a few copies (in Russian) were distributed to my colleagues at the Novosibirsk Institute of Nuclear Physics. The goal of these notes is a complete description starting from an arbitrary reference orbit, with explicit expressions for the 4-potential and the accelerator Hamiltonian, and finishing with the parameterization in action and angle variables. To a large degree the notes follow the logic developed in Theory of Cyclic Particle Accelerators by A.A. Kolomensky and A.N. Lebedev (Kolomensky), but go beyond the book in a number of directions. One unusual feature is the use of matrix functions and the Sylvester formula for calculating matrices of arbitrary elements. Teaching the USPAS course motivated me to translate a significant part of my notes into English. I also included some introductory material following Classical Theory of Fields by L.D. Landau and E.M. Lifshitz (Landau). A large number of short notes covering various techniques are placed in the Appendices.

  12. Analytic Tools for Feynman Integrals

    CERN Document Server

    Smirnov, Vladimir A

    2012-01-01

    The goal of this book is to describe the most powerful methods for evaluating multiloop Feynman integrals that are currently used in practice.  This book supersedes the author’s previous Springer book “Evaluating Feynman Integrals” and its textbook version “Feynman Integral Calculus.” Since the publication of these two books, powerful new methods have arisen and conventional methods have been improved on in essential ways. A further qualitative change is the fact that most of the methods and the corresponding algorithms have now been implemented in computer codes which are often public. In comparison to the two previous books, three new chapters have been added:  One is on sector decomposition, while the second describes a new method by Lee. The third new chapter concerns the asymptotic expansions of Feynman integrals in momenta and masses, which were described in detail in another Springer book, “Applied Asymptotic Expansions in Momenta and Masses,” by the author. This chapter describes, on t...

  13. Performance Marketing with Google Analytics Strategies and Techniques for Maximizing Online ROI

    CERN Document Server

    Tonkin, Sebastian

    2010-01-01

    An unparalleled author trio shares valuable advice for using Google Analytics to achieve your business goals. Google Analytics is a free tool used by millions of Web site owners across the globe to track how visitors interact with their Web sites, where they arrive from, and which visitors drive the most revenue and sales leads. This book offers clear explanations of practical applications drawn from the real world. The author trio of Google Analytics veterans starts with a broad explanation of performance marketing and gets progressively more specific, closing with step-by-step analysis and a

  14. Development of a new analytical tool for assessing the mutagen 2-methyl-1,4-dinitro-pyrrole in meat products by LC-ESI-MS/MS.

    Science.gov (United States)

    Molognoni, Luciano; Daguer, Heitor; de Sá Ploêncio, Leandro Antunes; Yotsuyanagi, Suzana Eri; da Silva Correa Lemos, Ana Lucia; Joussef, Antonio Carlos; De Dea Lindner, Juliano

    2018-08-01

    The use of sorbate and nitrite in meat processing may lead to the formation of 2-methyl-1,4-dinitro-pyrrole (DNMP), a mutagenic compound. This work was aimed at developing and validating an analytical method for the quantitation of DNMP by liquid chromatography-tandem mass spectrometry. Full validation was performed in accordance with Commission Decision 2002/657/EC and method applicability was checked in several samples of meat products. A simple procedure, with low-temperature partitioning solid-liquid extraction, was developed. Nitrosation during the extraction was monitored by the N-nitroso-DL-pipecolic acid content. Chromatographic separation was achieved in 8 min with di-isopropyl-3-aminopropyl silane bound to hydroxylated silica as the stationary phase. Samples of bacon and cooked sausage yielded the highest concentrations of DNMP (68 ± 3 and 50 ± 3 μg kg⁻¹, respectively). The developed method proved to be a reliable, selective, and sensitive tool for DNMP measurements in meat products. Copyright © 2018 Elsevier B.V. All rights reserved.
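
    Validation of this kind typically reports detection and quantitation limits derived from the calibration curve. A sketch of the common ICH-style calculation on hypothetical calibration data (the spiking levels and responses below are illustrative, not the paper's validation figures):

```python
import numpy as np

# Hypothetical DNMP calibration data: spiked level (ug/kg) vs. peak-area ratio
levels = np.array([10.0, 25.0, 50.0, 75.0, 100.0])
ratios = np.array([0.021, 0.053, 0.101, 0.152, 0.198])

# Ordinary least-squares calibration line: ratio = slope * level + intercept
slope, intercept = np.polyfit(levels, ratios, 1)

# Residual standard deviation of the fit (n - 2 degrees of freedom)
resid = ratios - (slope * levels + intercept)
s = np.sqrt(np.sum(resid**2) / (len(levels) - 2))

# ICH-style limits of detection and quantitation from the calibration curve
lod = 3.3 * s / slope
loq = 10 * s / slope
print(round(lod, 1), round(loq, 1))
```

    In practice these curve-based limits are cross-checked against signal-to-noise estimates on spiked matrix samples.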

  15. The use of generalised audit software by internal audit functions in a developing country: The purpose of the use of generalised audit software as a data analytics tool

    Directory of Open Access Journals (Sweden)

    D.P. van der Nest

    2017-11-01

    Full Text Available This article explores the purpose of the use of generalised audit software as a data analytics tool by internal audit functions in the locally controlled banking industry of South Africa. The evolution of the traditional internal audit methodology of collecting audit evidence through the conduct of interviews, the completion of questionnaires, and the testing of controls on a sample basis is long overdue, and such practice in the present technological, data-driven era will soon render an internal audit function obsolete. The research results indicate that respondents are utilising GAS for a variety of purposes, but that its frequency of use is not yet optimal and there is still much room for improvement for tests-of-controls purposes. The top five purposes for which the respondents make use of GAS often to always during separate internal audit engagements are: (1) to identify transactions with specific characteristics or control criteria for tests of control purposes; (2) to conduct full population analysis; (3) to identify account balances over a certain amount; (4) to identify and report on the frequency of occurrence of risks or of specific events; and (5) to obtain audit evidence about control effectiveness.
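
    The GAS purposes listed above map naturally onto full-population queries. A pandas sketch over a hypothetical general-ledger extract (the data, column names, and thresholds are illustrative, not from the study or any specific GAS product):

```python
import pandas as pd

# Hypothetical general-ledger extract; GAS runs such tests over the
# full population rather than a sample.
ledger = pd.DataFrame({
    "doc_no":   [1001, 1002, 1003, 1003, 1005],
    "user":     ["anna", "ben", "anna", "anna", "cara"],
    "amount":   [950.0, 12000.0, 430.0, 430.0, 50000.0],
    "approved": [True, True, False, False, True],
})

# (1) transactions breaching a control criterion: unapproved postings
unapproved = ledger[~ledger["approved"]]

# (3) account balances over a certain amount
large = ledger[ledger["amount"] > 10000]

# (4) frequency of occurrence of specific events, per user
per_user = ledger["user"].value_counts()

# duplicate document numbers: a classic full-population (2) test
dupes = ledger[ledger.duplicated("doc_no", keep=False)]

print(len(unapproved), len(large), per_user["anna"], sorted(dupes["doc_no"].unique()))
```

    Evidence about control effectiveness, purpose (5), comes from running such queries across the whole period and investigating every exception rather than a sample.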

  16. Development of simulation tools for improvement of measurement accuracy and efficiency in ultrasonic testing. Part 2. Development of fast simulator based on analytical approach

    International Nuclear Information System (INIS)

    Yamada, Hisao; Fukutomi, Hiroyuki; Lin, Shan; Ogata, Takashi

    2008-01-01

    CRIEPI developed a high-speed simulation method to predict B-scope images for crack-like defects under ultrasonic testing. The method is based on the geometrical theory of diffraction (GTD) to follow ultrasonic waves transmitted from the angle probe, and uses reciprocity relations to find analytical equations for the echoes received by the probe. The tip and mirror echoes from a slit with an arbitrary angle in the through-thickness direction of the test article and an arbitrary depth can be calculated by this method. The main objective of the study was to develop a high-speed simulation tool that generates B-scope displays of crack-like defects. This was achieved for simple slits in geometry-change regions by prototype software based on the method. Fairly complete B-scope images of slits could be obtained in about a minute on a current personal computer. The numerical predictions for surface-opening slits were in excellent agreement with the corresponding experimental measurements. (author)

  17. Second International Workshop on Teaching Analytics

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Reimann, Peter; Halb, Wolfgang

    2013-01-01

    Teaching Analytics is conceived as a subfield of learning analytics that focuses on the design, development, evaluation, and education of visual analytics methods and tools for teachers in primary, secondary, and tertiary educational settings. The Second International Workshop on Teaching Analytics (IWTA) 2013 seeks to bring together researchers and practitioners in the fields of education, learning sciences, learning analytics, and visual analytics to investigate the design, development, use, evaluation, and impact of visual analytical methods and tools for teachers' dynamic diagnostic decision...

  18. Permanent forestry plots: a potentially valuable teaching resource in undergraduate biology programs for the Caribbean

    Science.gov (United States)

    H. Valles; C.M.S. Carrington

    2016-01-01

    There has been a recent proposal to change the way that biology is taught and learned in undergraduate biology programs in the USA so that students develop a better understanding of science and the natural world. Here, we use this new, recommended teaching–learning framework to assert that permanent forestry plots could be a valuable tool to help develop biology...

  19. Quality Indicators for Learning Analytics

    Science.gov (United States)

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a means to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  20. New Therapies Offer Valuable Options for Patients with Melanoma

    Science.gov (United States)

    Two phase III clinical trials of new therapies for patients with metastatic melanoma presented in June at the 2011 ASCO conference confirmed that vemurafenib and ipilimumab (Yervoy™) offer valuable new options for the disease.

  1. Semantic Document Image Classification Based on Valuable Text Pattern

    Directory of Open Access Journals (Sweden)

    Hossein Pourghassem

    2011-01-01

    Full Text Available Knowledge extraction from detected document images is a complex problem in the field of information technology. The problem becomes more intricate given that only a negligible percentage of the detected document images are valuable. In this paper, a segmentation-based classification algorithm is used to analyze the document image. In this algorithm, using a two-stage segmentation approach, regions of the image are detected and then classified into document and non-document (pure region) regions in a hierarchical classification. A novel definition of value is proposed to classify document images into valuable or invaluable categories. The proposed algorithm is evaluated on a database consisting of document and non-document images obtained from the Internet. Experimental results show the efficiency of the proposed algorithm in semantic document image classification, with an accuracy of 98.8% on the valuable/invaluable document image classification problem.

  2. Valuable Internet Advertising and Customer Satisfaction Cycle(VIACSC)

    OpenAIRE

    Muhammad Awais; Tanzila Samin; Muhammad Bilal

    2012-01-01

    Nowadays it is very important for business persons to attract their target customers to their products through valuable modes of promotion and communication. The increasing use of the World Wide Web has completely changed the scenario of the business sector. Customized products and services, customer preferences, and the @ and dot-com craze have elevated the importance of internet advertising. This research paper investigates valuable internet advertising, which will help to enhance the value of intern...

  3. Monte Carlo simulation: tool for the calibration in analytical determination of radionuclides; Simulacion Monte Carlo: herramienta para la calibracion en determinaciones analiticas de radionucleidos

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, Jorge A. Carrazana; Ferrera, Eduardo A. Capote; Gomez, Isis M. Fernandez; Castro, Gloria V. Rodriguez; Ricardo, Niury Martinez, E-mail: cphr@cphr.edu.cu [Centro de Proteccion e Higiene de las Radiaciones (CPHR), La Habana (Cuba)

    2013-07-01

    This work shows how the traceability of analytical determinations is established using this calibration method. It highlights the advantages offered by Monte Carlo simulation for applying corrections for differences in chemical composition, density and height of the samples analyzed. Likewise, the results obtained by the LVRA in two exercises organized by the International Atomic Energy Agency (IAEA) are presented. In these exercises (an intercomparison and a proficiency test), all reported analytical results were obtained from efficiency calibrations by Monte Carlo simulation using the DETEFF program.

  4. Google analytics integrations

    CERN Document Server

    Waisberg, Daniel

    2015-01-01

    A roadmap for turning Google Analytics into a centralized marketing analysis platform With Google Analytics Integrations, expert author Daniel Waisberg shows you how to gain a more meaningful, complete view of customers that can drive growth opportunities. This in-depth guide shows not only how to use Google Analytics, but also how to turn this powerful data collection and analysis tool into a central marketing analysis platform for your company. Taking a hands-on approach, this resource explores the integration and analysis of a host of common data sources, including Google AdWords, AdSens

  5. Chloride present in biological samples as a tool for enhancement of sensitivity in capillary zone electrophoretic analysis of anionic trace analytes

    Czech Academy of Sciences Publication Activity Database

    Křivánková, Ludmila; Pantůčková, Pavla; Gebauer, Petr; Boček, Petr; Caslavska, J.; Thormann, W.

    2003-01-01

    Roč. 24, č. 3 (2003), s. 505-517 ISSN 0173-0835 R&D Projects: GA ČR GA203/02/0023; GA ČR GA203/01/0401; GA AV ČR IAA4031103 Institutional research plan: CEZ:AV0Z4031919 Keywords : acetoacetate * capillary zone electrophoresis * chloride stacking effects Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 4.040, year: 2003

  6. Field Trips as Valuable Learning Experiences in Geography Courses

    Science.gov (United States)

    Krakowka, Amy Richmond

    2012-01-01

    Field trips have been acknowledged as valuable learning experiences in geography. This article uses Kolb's (1984) experiential learning model to discuss how students learn and how field trips can help enhance learning. Using Kolb's experiential learning theory as a guide in the design of field trips helps ensure that field trips contribute to…

  7. an assessment of timber trees producing valuable fruits and seeds ...

    African Journals Online (AJOL)

    User

    It is observed that most of the timber trees producing valuable fruits and seeds have low ... sector of the economy by providing major raw materials (saw logs, ... the trees also produce industrial raw materials like latex, ... villagers while avoiding some of the ecological costs of ...

  8. Ravens reconcile after aggressive conflicts with valuable partners.

    Science.gov (United States)

    Fraser, Orlaith N; Bugnyar, Thomas

    2011-03-25

    Reconciliation, a post-conflict affiliative interaction between former opponents, is an important mechanism for reducing the costs of aggressive conflict in primates and some other mammals as it may repair the opponents' relationship and reduce post-conflict distress. Opponents who share a valuable relationship are expected to be more likely to reconcile as for such partners the benefits of relationship repair should outweigh the risk of renewed aggression. In birds, however, post-conflict behavior has thus far been marked by an apparent absence of reconciliation, suggested to result either from differing avian and mammalian strategies or because birds may not share valuable relationships with partners with whom they engage in aggressive conflict. Here, we demonstrate the occurrence of reconciliation in a group of captive subadult ravens (Corvus corax) and show that it is more likely to occur after conflicts between partners who share a valuable relationship. Furthermore, former opponents were less likely to engage in renewed aggression following reconciliation, suggesting that reconciliation repairs damage caused to their relationship by the preceding conflict. Our findings suggest not only that primate-like valuable relationships exist outside the pair bond in birds, but that such partners may employ the same mechanisms in birds as in primates to ensure that the benefits afforded by their relationships are maintained even when conflicts of interest escalate into aggression. These results provide further support for a convergent evolution of social strategies in avian and mammalian species.

  9. Salt Lakes of the African Rift System: A Valuable Research ...

    African Journals Online (AJOL)

    Salt Lakes of the African Rift System: A Valuable Research Opportunity for Insight into Nature's Concentrated Multi-Electrolyte Science. JYN Philip, DMS Mosha. Abstract. The Tanzanian rift system salt lakes present significant cultural, ecological, recreational and economic value. Beyond the wealth of minerals, resources ...

  10. Analytic trigonometry

    CERN Document Server

    Bruce, William J; Maxwell, E A; Sneddon, I N

    1963-01-01

    Analytic Trigonometry details the fundamental concepts and underlying principle of analytic geometry. The title aims to address the shortcomings in the instruction of trigonometry by considering basic theories of learning and pedagogy. The text first covers the essential elements from elementary algebra, plane geometry, and analytic geometry. Next, the selection tackles the trigonometric functions of angles in general, basic identities, and solutions of equations. The text also deals with the trigonometric functions of real numbers. The fifth chapter details the inverse trigonometric functions

  11. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    Science.gov (United States)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics when combined with Big Data technologies and predictive techniques have been shown to be valuable in detecting mission critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor is simulated and we use big data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site. This study builds on a big data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determine practical analytic, visualization, and predictive technologies.
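The alerting idea described above, flag a sensor once its degradation trend passes a critical threshold, can be sketched minimally. The rolling window, threshold, and health metric below are assumptions for illustration, not the study's actual parameters:

```python
import numpy as np

def degradation_alerts(readings, window=10, threshold=0.8):
    """Return indices where the rolling mean of a sensor health metric
    has dropped below a critical threshold (simple trend-based alert)."""
    readings = np.asarray(readings, dtype=float)
    kernel = np.ones(window) / window
    # Trailing-window rolling mean; entry j covers readings[j:j+window].
    trend = np.convolve(readings, kernel, mode="valid")
    return [j + window - 1 for j, v in enumerate(trend) if v < threshold]

# Hypothetical health metric drifting downward, standing in for the
# study's notional sensor data.
metric = [1.0 - 0.01 * t for t in range(50)]
alerts = degradation_alerts(metric, window=10, threshold=0.8)
# The first alert fires once the 10-sample mean crosses 0.8.
```

A production system would replace the rolling mean with the trained predictive models the paper describes, but the alerting contract stays the same.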

  12. Analytical Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — The Analytical Lab specializes in Oil and Hydraulic Fluid Analysis, Identification of Unknown Materials, Engineering Investigations, Qualification Testing (to support...

  13. IR spectroscopy together with multivariate data analysis as a process analytical tool for in-line monitoring of crystallization process and solid-state analysis of crystalline product

    DEFF Research Database (Denmark)

    Pöllänen, Kati; Häkkinen, Antti; Reinikainen, Satu-Pia

    2005-01-01

    Crystalline product should exist in the optimal polymorphic form, so a robust and reliable method for polymorph characterization is of great importance. In this work, infrared (IR) spectroscopy is applied for monitoring of the crystallization process in situ. The results show that attenuated total reflection Fourier transform infrared (ATR-FTIR) spectroscopy provides valuable information on the process, which can be utilized for more controlled crystallization processes. Diffuse reflectance Fourier transform infrared (DRIFT-IR) is applied for polymorphic characterization of the crystalline product, using X-ray powder diffraction (XRPD) as a reference technique. In order to fully utilize DRIFT, multivariate techniques are needed, e.g., multivariate statistical process control (MSPC), principal component analysis (PCA) and partial least squares (PLS). The results demonstrate that multivariate...
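The multivariate tools named in this record (PCA in particular) reduce, computationally, to projecting mean-centered spectra onto their leading components. A minimal sketch with synthetic spectra (illustrative data, not the paper's measurements):

```python
import numpy as np

def pca_scores(spectra, n_components=2):
    """Project mean-centered spectra onto their leading principal
    components via SVD, as done before building MSPC-style charts."""
    X = np.asarray(spectra, dtype=float)
    Xc = X - X.mean(axis=0)            # center each spectral channel
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T    # per-sample scores

# Synthetic "spectra": 20 samples x 100 channels, one dominant
# spectral component plus small noise.
rng = np.random.default_rng(0)
base = np.sin(np.linspace(0.0, 3.0, 100))
X = np.outer(rng.normal(size=20), base) + 0.01 * rng.normal(size=(20, 100))
scores = pca_scores(X, n_components=2)
```

In an MSPC setting, the scores (and residuals) of each new in-line spectrum would be charted against control limits derived from normal-operation batches.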

  14. HPLC-ElCD: a useful tool for the pursuit of novel analytical strategies for the detection of antioxidant secondary metabolites

    Directory of Open Access Journals (Sweden)

    Castro-Gamboa Ian

    2003-01-01

    Full Text Available Living cells are continuously exposed to a variety of challenges that exert oxidative stress and are directly related to senescence and the onset of various pathological conditions such as coronary heart disease, rheumatoid arthritis and cancer. Nevertheless, living organisms have developed a complex antioxidant network to counteract reactive species that are detrimental to life. With the aim of bio-prospecting plant species from the Brazilian Cerrado and Atlantic Forest, we have established a methodology to detect secondary antioxidant metabolites in crude extracts and fractions obtained from plant species. Combining HPLC with an electrochemical detector allowed us to detect micromolecules that showed antioxidant activities in Chimarrhis turbinata (DC.) leaf extracts. Comparison with purified flavonoid standards led us to identify the compounds in their natural matrices, giving valuable information on their antioxidant capacity.

  15. Comparison of high-resolution ultrasonic resonator technology and Raman spectroscopy as novel process analytical tools for drug quantification in self-emulsifying drug delivery systems.

    Science.gov (United States)

    Stillhart, Cordula; Kuentz, Martin

    2012-02-05

    Self-emulsifying drug delivery systems (SEDDS) are complex mixtures in which drug quantification can become a challenging task. Thus, a general need exists for novel analytical methods and a particular interest lies in techniques with the potential for process monitoring. This article compares Raman spectroscopy with high-resolution ultrasonic resonator technology (URT) for drug quantification in SEDDS. The model drugs fenofibrate, indomethacin, and probucol were quantitatively assayed in different self-emulsifying formulations. We measured ultrasound velocity and attenuation in the bulk formulation containing drug at different concentrations. The formulations were also studied by Raman spectroscopy. We used both an in-line immersion probe for the bulk formulation and a multi-fiber sensor for measuring through hard-gelatin capsules that were filled with SEDDS. Each method was assessed by calculating the relative standard error of prediction (RSEP) as well as the limit of quantification (LOQ) and the mean recovery. Raman spectroscopy led to excellent calibration models for the bulk formulation as well as the capsules. The RSEP depended on the SEDDS type with values of 1.5-3.8%, while LOQ was between 0.04 and 0.35% (w/w) for drug quantification in the bulk. Similarly, the analysis of the capsules led to RSEP of 1.9-6.5% and LOQ of 0.01-0.41% (w/w). On the other hand, ultrasound attenuation resulted in RSEP of 2.3-4.4% and LOQ of 0.1-0.6% (w/w). Moreover, ultrasound velocity provided an interesting analytical response in cases where the drug strongly affected the density or compressibility of the SEDDS. We conclude that ultrasonic resonator technology and Raman spectroscopy constitute suitable methods for drug quantification in SEDDS, which is promising for their use as process analytical technologies. Copyright © 2011 Elsevier B.V. All rights reserved.
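The RSEP figures quoted in this record can be reproduced in spirit with a short calculation. A common definition is RSEP(%) = 100 · sqrt(Σ(ŷ−y)² / Σy²); since the abstract does not spell out its exact formula, both the definition and the concentration data below are assumptions for illustration:

```python
import math

def rsep_percent(predicted, actual):
    """Relative standard error of prediction, using the common definition
    RSEP(%) = 100 * sqrt(sum((yhat - y)^2) / sum(y^2)).
    (Assumed definition; the paper does not state its exact formula.)"""
    num = sum((p - a) ** 2 for p, a in zip(predicted, actual))
    den = sum(a ** 2 for a in actual)
    return 100.0 * math.sqrt(num / den)

# Hypothetical drug concentrations (% w/w) and calibration-model predictions.
actual = [1.0, 2.0, 3.0, 4.0]
predicted = [1.02, 1.96, 3.05, 3.98]
rsep = rsep_percent(predicted, actual)  # roughly 1.3%, within the paper's range
```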

  16. Version 1.00 programmer's tools used in constructing the INEL RML/analytical radiochemistry sample tracking database and its user interface

    International Nuclear Information System (INIS)

    Femec, D.A.

    1995-09-01

    This report describes two code-generating tools used to speed design and implementation of relational databases and user interfaces: CREATE-SCHEMA and BUILD-SCREEN. CREATE-SCHEMA produces the SQL commands that actually create and define the database. BUILD-SCREEN takes templates for data entry screens and generates the screen management system routine calls to display the desired screen. Both tools also generate the related FORTRAN declaration statements and precompiled SQL calls. Included with this report is the source code for a number of FORTRAN routines and functions used by the user interface. This code is broadly applicable to a number of different databases
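In the same spirit as CREATE-SCHEMA, a toy generator that emits the SQL commands creating a table can be sketched. The table spec below is hypothetical, not the INEL schema, and the real tool additionally generated FORTRAN declaration statements and precompiled SQL calls:

```python
def create_table_sql(table, columns):
    """Emit a CREATE TABLE statement from a simple (name, type) column
    spec, in the spirit of the report's CREATE-SCHEMA tool (toy example)."""
    cols = ",\n  ".join(f"{name} {sqltype}" for name, sqltype in columns)
    return f"CREATE TABLE {table} (\n  {cols}\n);"

# Hypothetical sample-tracking table spec.
ddl = create_table_sql(
    "sample_tracking",
    [("sample_id", "INTEGER PRIMARY KEY"),
     ("received", "DATE"),
     ("analysis", "VARCHAR(40)")],
)
```

Driving both the DDL and the user-interface screens from one spec is what keeps the database and its data-entry forms from drifting apart.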

  17. Explanatory analytics in OLAP

    NARCIS (Netherlands)

    Caron, E.A.M.; Daniëls, H.A.M.

    2013-01-01

    In this paper the authors describe a method to integrate explanatory business analytics in OLAP information systems. This method supports the discovery of exceptional values in OLAP data and the explanation of such values by giving their underlying causes. OLAP applications offer a support tool for

  18. Risk assessment using Analytical Hierarchy Process - Development and evaluation of a new computer-based tool; Riskvaerdering med Analytical Hierarchy Process - Utveckling och utprovning av ett nytt datorbaserat verktyg

    Energy Technology Data Exchange (ETDEWEB)

    Ritchey, Tom (Swedish Defence Research Agency, Stockholm (Sweden))

    2008-11-15

    Risk analysis concerning the management of contaminated areas involves comparing and evaluating the relationship between ecological, technical, economic and other factors, in order to determine a reasonable level of remediation. Risk analysis of this kind is a relatively new phenomenon. In order to develop methodology in this area, the Sustainable Remediation program contributes both to comprehensive risk analysis projects and to projects concentrating on specific aspects of remediation risk analysis. In the project described in this report, the Swedish Defence Research Agency (FOI) was given a grant by the Sustainable Remediation program to apply the Analytic Hierarchy Process (AHP) in order to develop a computer-aided instrument to support remediation risk analysis. AHP is one of several so-called multi-criteria decision support methods. These methods are applied in order to systematically compare and evaluate different solutions or measures when there are many different goal criteria involved. Such criteria can be both quantitative and qualitative. The project has resulted in the development of a computer-aided instrument which can be employed to give better structure, consistency and traceability to risk analyses for the remediation of contaminated areas. The project was carried out in two phases with two different working groups. The first phase involved the development of a generic base model for remediation risk analysis. This was performed by a 'development group'. The second phase entailed the testing of the generic model in a specific, on-going remediation project. This was performed by a 'test group'. The remediation project in question concerned the decontamination of a closed-down sawmill in Vaeckelsaang, in the Swedish municipality of Tingsryd.
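Computationally, the AHP step described in this record reduces to extracting priority weights from a pairwise comparison matrix. A minimal sketch using the geometric-mean (row) approximation common in AHP practice; the matrix values are hypothetical, not taken from the Sustainable Remediation project:

```python
import numpy as np

def ahp_weights(M):
    """Priority weights from an AHP pairwise comparison matrix, using the
    geometric-mean row approximation to the principal eigenvector."""
    M = np.asarray(M, dtype=float)
    g = M.prod(axis=1) ** (1.0 / M.shape[0])  # row geometric means
    return g / g.sum()                        # normalize to sum to 1

# Hypothetical 3-criterion comparison (say, ecological vs. technical vs.
# economic): criterion 1 judged 3x as important as 2, 5x as important as 3.
M = [[1.0,       3.0, 5.0],
     [1.0 / 3.0, 1.0, 2.0],
     [1.0 / 5.0, 1.0 / 2.0, 1.0]]
w = ahp_weights(M)  # descending weights, dominated by criterion 1
```

A full AHP tool would also compute Saaty's consistency ratio to flag contradictory judgments before the weights are used.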

  19. Comparison of Raman, NIR, and ATR FTIR spectroscopy as analytical tools for in-line monitoring of CO2 concentration in an amine gas treating process

    NARCIS (Netherlands)

    Kachko, A.; Ham, L.V. van der; Bardow, A.; Vlugt, T.J.H.; Goetheer, E.L.V.

    2016-01-01

    Chemical absorption of CO2 using aqueous amine-based solvents is one of the common approaches to control acidic gases emissions to the atmosphere. Improvement in the efficiency of industrial processes requires precise monitoring tools that fit with the specific application. Process monitoring using

  20. Helicase-dependent isothermal amplification: a novel tool in the development of molecular-based analytical systems for rapid pathogen detection.

    Science.gov (United States)

    Barreda-García, Susana; Miranda-Castro, Rebeca; de-Los-Santos-Álvarez, Noemí; Miranda-Ordieres, Arturo J; Lobo-Castañón, María Jesús

    2018-01-01

    Highly sensitive testing of nucleic acids is essential to improve the detection of pathogens, which pose a major threat to public health worldwide. Currently available molecular assays, mainly based on PCR, have limited utility in point-of-need control or resource-limited settings. Consequently, there is a strong interest in developing cost-effective, robust, and portable platforms for early detection of these harmful microorganisms. Since its description in 2004, isothermal helicase-dependent amplification (HDA) has been successfully applied in the development of novel molecular-based technologies for rapid, sensitive, and selective detection of viruses and bacteria. In this review, we highlight relevant analytical systems using this simple nucleic acid amplification methodology, which takes place at a constant temperature and is readily compatible with microfluidic technologies. Different strategies for monitoring HDA amplification products are described. In addition, we present technological advances for integrating sample preparation, HDA amplification, and detection. Future perspectives and challenges toward point-of-need use, not only for clinical diagnosis but also in food safety testing and environmental monitoring, are also discussed. Graphical Abstract: Expanding the analytical toolbox for the detection of DNA sequences specific to pathogens with isothermal helicase-dependent amplification (HDA).

  1. Analytical tools for determination of new oral antidiabetic drugs, glitazones, gliptins, gliflozins and glinides, in bulk materials, pharmaceuticals and biological samples

    Directory of Open Access Journals (Sweden)

    Gumieniczek Anna

    2016-01-01

    Full Text Available The review presents analytical methods for the determination of new oral drugs for the treatment of type 2 diabetes mellitus (T2DM), focusing on peroxisome proliferator-activated receptor gamma agonists (glitazones), dipeptidyl peptidase 4 inhibitors (gliptins) and sodium/glucose co-transporter 2 inhibitors (gliflozins). Drugs derived from prandial glucose regulators, such as glinides, are considered because they are present in some new therapeutic options. The review presents analytical procedures suitable for determination of the drugs in bulk substances, pharmaceuticals and biological samples, including HPLC-UV, HPLC/LC-MS, TLC/HPTLC, CE/CE-MS, spectrophotometric (UV/VIS), spectrofluorimetric and electrochemical methods, taken from the literature of the past ten years (2006-2016). Some new procedures for extraction, separation and detection of the drugs, including solid phase extraction with molecularly imprinted polymers (SPE-MIP), liquid phase microextraction using porous hollow fibers (HP-LPME), HILIC chromatography, micellar mobile phases, ion mobility spectrometry (IMS) and isotopically labeled internal standards, are discussed.

  2. Which energy mix for the UK (United Kingdom)? An evolutive descriptive mapping with the integrated GAIA (graphical analysis for interactive aid)–AHP (analytic hierarchy process) visualization tool

    International Nuclear Information System (INIS)

    Ishizaka, Alessio; Siraj, Sajid; Nemery, Philippe

    2016-01-01

    Although Multi-Criteria Decision Making methods have been extensively used in energy planning, their descriptive use has rarely been considered. In this paper, we add an evolutionary description phase as an extension to the AHP (analytic hierarchy process) method that helps policy makers gain insights into their decision problems. The proposed extension has been implemented in open-source software that allows users to visualize differences of opinion within a decision process, and also the evolution of preferences over time. The method was tested in a two-phase experiment to understand the evolution of opinions on energy sources. Participants were asked to provide their preferences for different energy sources for the next twenty years for the United Kingdom. They were first asked to compare the options intuitively without using any structured approach, and then were given three months to compare the same set of options after collecting detailed information on the technical, economic, environmental and social impacts created by each of the selected energy sources. The proposed visualization method allows us to quickly discover preference directions, as well as the changes in preferences from the first to the second phase. The proposed tool can help policy makers better understand energy planning problems, leading towards better planning and decisions in the energy sector. - Highlights: • We introduce a descriptive visual analysis tool for the analytic hierarchy process. • The method has been implemented as an open-source preference elicitation tool. • We analyse user preferences in the energy sector using this method. • The tool also provides a way to visualize temporal preferences changes. • The main negative temporal shift in the ranking was found for the nuclear energy.

  3. Valuable human capital: the aging health care worker.

    Science.gov (United States)

    Collins, Sandra K; Collins, Kevin S

    2006-01-01

    With the workforce growing older and the supply of younger workers diminishing, it is critical for health care managers to understand the factors necessary to capitalize on their vintage employees. Retaining this segment of the workforce has a multitude of benefits including the preservation of valuable intellectual capital, which is necessary to ensure that health care organizations maintain their competitive advantage in the consumer-driven market. Retaining the aging employee is possible if health care managers learn the motivators and training differences associated with this category of the workforce. These employees should be considered a valuable resource of human capital because without their extensive expertise, intense loyalty and work ethic, and superior customer service skills, health care organizations could suffer severe economic repercussions in the near future.

  4. VALUABLE AND ORIENTATION FOUNDATIONS OF EDUCATIONAL SYSTEM OF THE COUNTRY

    OpenAIRE

    Vladimir I. Zagvyazinsky

    2016-01-01

    The aim of the investigation is to show that, under modern market conditions, it is necessary to preserve the humanistic value orientations of domestic education and not allow it to slip into a position of utilitarian, quickly achievable, but not long-term benefits. Theoretical significance. The author emphasizes the value of forming an ideal – the harmonious development of the personality – and of collectivist principles for disclosing the potential of each school student, a student, a...

  5. The use of a quartz crystal microbalance as an analytical tool to monitor particle/surface and particle/particle interactions under dry ambient and pressurized conditions: a study using common inhaler components.

    Science.gov (United States)

    Turner, N W; Bloxham, M; Piletsky, S A; Whitcombe, M J; Chianella, I

    2016-12-19

    Metered dose inhalers (MDI) and multidose powder inhalers (MPDI) are commonly used for the treatment of chronic obstructive pulmonary diseases and asthma. Currently, analytical tools to monitor particle/particle and particle/surface interaction within MDI and MPDI at the macro-scale do not exist. A simple tool capable of measuring such interactions would ultimately enable quality control of MDI and MPDI, producing remarkable benefits for the pharmaceutical industry and the users of inhalers. In this paper, we have investigated whether a quartz crystal microbalance (QCM) could become such a tool. A QCM was used to measure particle/particle and particle/surface interactions on the macroscale, by adding small amounts of MPDI components, in powder form, into a gas stream. The subsequent interactions with materials on the surface of the QCM sensor were analyzed. Following this, the sensor was used to measure fluticasone propionate, a typical MDI active ingredient, in a pressurized gas system to assess its interactions with different surfaces under conditions mimicking the manufacturing process. In both types of experiments the QCM was capable of discriminating interactions of different components and surfaces. The results have demonstrated that the QCM is a suitable platform for monitoring macro-scale interactions and could possibly become a tool for quality control of inhalers.
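For context on how a QCM turns its raw signal into mass information: the standard Sauerbrey relation, Δf = −2f₀²Δm / (A·√(ρ_q·μ_q)), applies to rigid thin films on the crystal. Particle loading as studied in this paper is more complex, so the sketch below is illustrative only:

```python
import math

def sauerbrey_df(delta_m_g, f0_hz=5.0e6, area_cm2=1.0):
    """Frequency shift (Hz) for a rigid, thin, uniform mass load on an
    AT-cut quartz crystal: df = -2*f0^2*dm / (A*sqrt(rho_q*mu_q)).
    Illustrative only; particle loading is not a rigid thin film."""
    rho_q = 2.648      # quartz density, g/cm^3
    mu_q = 2.947e11    # quartz shear modulus, g/(cm*s^2)
    return -2.0 * f0_hz**2 * delta_m_g / (area_cm2 * math.sqrt(rho_q * mu_q))

# 100 ng deposited on a 1 cm^2, 5 MHz crystal shifts f by a few Hz.
df = sauerbrey_df(100e-9)
```

The negative sign encodes the key intuition: added mass lowers the resonant frequency, which is what lets the QCM resolve nanogram-scale deposition.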

  6. Fiscal 1997 research report. International energy use rationalization project (Analytical tool research project for energy consumption efficiency improvement in Asia); 1997 nendo kokusai energy shiyo gorika nado taisaku jigyo chosa hokokusho. Asia energy shohi koritsuka bunseki tool chosa jigyo (honpen)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-05-01

    Efforts have been under way to prepare inter-industry relations tables and energy data for four Asian countries, namely, China, Taiwan, Singapore and Malaysia, and a tool for energy consumption efficiency analysis has been developed and improved. In Chapter 1, energy supply and demand in the above-named four countries is reviewed on the basis of recent economic situations in these countries. In Chapter 2, bilateral inter-industry relations tables usable under the project are employed for the analysis of the economic status of each of the countries and energy transactions between them, and a method is described of converting the tables into one-nation inter-industry relations tables which meet the need of this project. In Chapter 3, national characteristics reflected on the respective energy input tables are described, and a method is shown of converting a nationally characterized unit energy table into a common unit energy input table for registration with a database. In Chapter 4, the constitution of the Asian energy consumption efficiency improvement analyzing tool and a system using the tool are explained. In Chapter 5, some examples of analyses conducted by use of the analyzing tool are shown, in which the energy saving effect and CO2 emission reduction effect are estimated for Indonesia by use of the analyzing tool. (NEDO)

  7. Analytic geometry

    CERN Document Server

    Burdette, A C

    1971-01-01

    Analytic Geometry covers several fundamental aspects of analytic geometry needed for advanced subjects, including calculus.This book is composed of 12 chapters that review the principles, concepts, and analytic proofs of geometric theorems, families of lines, the normal equation of the line, and related matters. Other chapters highlight the application of graphing, foci, directrices, eccentricity, and conic-related topics. The remaining chapters deal with the concept polar and rectangular coordinates, surfaces and curves, and planes.This book will prove useful to undergraduate trigonometric st

  8. Process analytical technology (PAT) for biopharmaceuticals

    DEFF Research Database (Denmark)

    Glassey, Jarka; Gernaey, Krist; Clemens, Christoph

    2011-01-01

    Process analytical technology (PAT), the regulatory initiative for building in quality to pharmaceutical manufacturing, has a great potential for improving biopharmaceutical production. The recommended analytical tools for building in quality, multivariate data analysis, mechanistic modeling, novel...

  9. The Graphics Tablet - A Valuable Tool for the Digital STEM Teacher

    Science.gov (United States)

    Stephens, Jeff

    2018-04-01

    I am inspired to write this article after coming across some publications in The Physics Teacher that all hit on topics of personal interest and experience. Similarly to Christensen, my goal in writing this is to encourage other physics educators to take advantage of modern technology in delivering content to students and to feel comfortable doing so. There are numerous ways in which to create screencasts and lecture videos, some of which have been addressed in other articles. I invite those interested in learning how to create these videos to contact their educational technology staff or perform some internet searches on the topic. I will focus this article on the technology that enhanced the content I was delivering to my students. I will share a bit of my journey towards creating video materials and introduce a vital piece of technology, the graphics tablet, which changed the way I communicate with my students.

  10. Is sdLDL a valuable screening tool for cardiovascular disease in ...

    African Journals Online (AJOL)

    Many patients with cardiovascular disease have their low density lipoprotein cholesterol within the normal range. This raises the question about the most important lipoprotein to use as a marker of atherogenicity. In fact, small dense low density lipoprotein has recently been suggested as a strong predictor of cardiovascular ...

  11. Otolith shape as a valuable tool to evaluate the stock structure of ...

    African Journals Online (AJOL)

    Swordfish Xiphias gladius is an oceanic-pelagic species. Its population structure in the Western Indian Ocean was studied from the shape of the sagittal otoliths of 391 individuals collected from 2009 to 2014. Normalised elliptical Fourier descriptors (EFDs) were extracted automatically using TNPC software. Principal ...

  12. Aequorin chimeras as valuable tool in the measurement of Ca2+ concentration during cadmium injury

    International Nuclear Information System (INIS)

    Biagioli, M.; Pinton, P.; Scudiero, R.; Ragghianti, M.; Bucci, S.; Rizzuto, R.

    2005-01-01

    The ability of cadmium to disrupt calcium homeostasis has long been known, but the precise cellular targets of its toxic action are still debated. A great problem in the interpretation of data has been the ability of cadmium to strongly bind traditional calcium probes. Aequorin, the well-characterized calcium-sensitive photoprotein, was used as an intracellular calcium indicator during cadmium injury in NIH 3T3 murine fibroblasts. NIH 3T3 cells were transfected with a cDNA construct containing aequorin fused to a truncated glutamate receptor, which directs the probe to the outer surface of intracellular membranes. At first, we tested whether different cadmium concentrations were able to modify the rate of light emission by aequorin, showing Cd2+/Ca2+ interference. To directly investigate the role of Cd2+ in Ca2+ homeostasis, we have started to selectively measure the free Ca2+ concentration in different cell compartments. Here, we report that cadmium reduces the transient free calcium signal after stimulation of cells with bradykinin. Further studies are in progress to clarify the role of mitochondria and endoplasmic reticulum in cadmium-induced alterations of Ca2+ homeostasis, in order to link signal transduction modifications with the onset of apoptosis induced by cadmium exposure

  13. The Mini-Trial: A Valuable Alternative Dispute Resolution Tool for the United States Navy

    National Research Council Canada - National Science Library

    Morgan, Steven

    1997-01-01

    In order to avoid unnecessary, time consuming, and costly litigation, the Department of Defense, and more specifically the United States Navy, has adopted the use of alternative dispute resolution (ADR...

  14. Imaging of peripheral arteries by 16-slice computed tomography angiography: a valuable tool

    International Nuclear Information System (INIS)

    Mishra, A.; Ehtuish, Ehtuish F.

    2007-01-01

    To evaluate the efficacy of multidetector (16-row) computed tomography (MDCT) in imaging the upper and lower limb arterial tree in trauma and peripheral vascular disease. Thirty-three patients underwent multislice computed tomography angiography (MSCTA) of the upper or lower limb on a multislice (16-slice) CT scanner between November 2004 and July 2005 in the Department of Radiology, National Organ Transplant Center, Tripoli, Libya. The findings were retrospectively compared with the surgical outcome in cases of trauma with suspected arterial injuries, or correlated with color Doppler in patients with peripheral vascular disease. Multislice computed tomography angiography allows a comprehensive diagnostic work-up in all trauma cases with suspected arterial injuries. In 23 cases of peripheral vascular disease, MSCTA adequately demonstrated the presence of any stenosis or occlusion, its degree and extent, the presence of collaterals and distal reformation if any, and the presence of plaques. Our experience of computed tomography angiography with a 16-row MDCT scanner has clearly demonstrated its efficacy as a promising, fast, accurate, safe and non-invasive imaging modality of choice in cases of trauma with suspected arterial injuries, and as a useful screening modality for diagnosis and grading in cases of peripheral vascular disease. (author)

  15. Sea sand disruption method (SSDM) as a valuable tool for isolating essential oil components from conifers.

    Science.gov (United States)

    Dawidowicz, Andrzej L; Czapczyńska, Natalia B

    2011-11-01

    Essential oils are one of nature's most precious gifts with surprisingly potent and outstanding properties. Coniferous oils, for instance, are nowadays being used extensively to treat or prevent many types of infections, modify immune responses, soothe inflammations, stabilize moods, and to help ease all forms of non-acute pain. Given the broad spectrum of usage of coniferous essential oils, a fast, safe, simple, and efficient sample-preparation method is needed in the estimation procedure of essential oil components in fresh plant material. Generally, the time- and energy-consuming steam distillation (SD) is applied for this purpose. This paper will compare SD, pressurized liquid extraction (PLE), matrix solid-phase dispersion (MSPD), and the sea sand disruption method (SSDM) as isolation techniques to obtain aroma components from Scots pine (Pinus sylvestris), spruce (Picea abies), and Douglas fir (Pseudotsuga menziesii). According to the obtained data, SSDM is the most efficient sample preparation method in determining the essential oil composition of conifers. Moreover, SSDM requires small organic solvent amounts and a short extraction time, which makes it an advantageous alternative procedure for the routine analysis of coniferous oils. The superiority of SSDM over MSPD efficiency is ascertained, as there are no chemical interactions between the plant cell components and the sand. This fact confirms the reliability and efficacy of SSDM for the analysis of volatile oil components. Copyright © 2011 Verlag Helvetica Chimica Acta AG, Zürich.

  16. Serum biomarkers reflecting specific tumor tissue remodeling processes are valuable diagnostic tools for lung cancer

    International Nuclear Information System (INIS)

    Willumsen, Nicholas; Bager, Cecilie L; Leeming, Diana J; Smith, Victoria; Christiansen, Claus; Karsdal, Morten A; Dornan, David; Bay-Jensen, Anne-Christine

    2014-01-01

    Extracellular matrix (ECM) proteins, such as collagen type I and elastin, and intermediate filament (IMF) proteins, such as vimentin, are modified and dysregulated as part of the malignant changes leading to disruption of tissue homeostasis. Noninvasive biomarkers that reflect such changes may have great potential in cancer. Levels of matrix metalloproteinase (MMP)-generated fragments of type I collagen (C1M), of elastin (ELM), and of citrullinated vimentin (VICM) were measured in serum from patients with lung cancer (n = 40), gastrointestinal cancer (n = 25), prostate cancer (n = 14), malignant melanoma (n = 7), chronic obstructive pulmonary disease (COPD) (n = 13), and idiopathic pulmonary fibrosis (IPF) (n = 10), as well as in age-matched controls (n = 33). The area under the receiver operating characteristic curve (AUROC) was calculated, and a diagnostic decision tree was generated from specific cutoff values. C1M and VICM were significantly elevated in lung cancer patients as compared with healthy controls (AUROC = 0.98, P < 0.0001) and other cancers (AUROC = 0.83, P < 0.0001). A trend was detected when comparing lung cancer with COPD+IPF. No difference could be seen for ELM. Interestingly, C1M and VICM were able to identify patients with lung cancer with a positive predictive value of 0.9 and an odds ratio of 40 (95% CI = 8.7–186, P < 0.0001). Biomarkers specifically reflecting degradation of collagen type I and citrullinated vimentin are applicable to lung cancer patients. Our data indicate that biomarkers reflecting ECM and IMF protein dysregulation are highly applicable in the lung cancer setting. We speculate that these markers may aid in diagnosing and characterizing patients with lung cancer
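    The AUROC statistic reported above has a simple rank interpretation: the probability that a randomly chosen case scores higher than a randomly chosen control. A minimal sketch of that computation (the biomarker values below are invented for illustration, not the study's data):

    ```python
    def auroc(scores_pos, scores_neg):
        """Probability that a random positive scores higher than a random
        negative (ties count 1/2) -- the Mann-Whitney formulation of AUROC."""
        wins = 0.0
        for p in scores_pos:
            for n in scores_neg:
                if p > n:
                    wins += 1.0
                elif p == n:
                    wins += 0.5
        return wins / (len(scores_pos) * len(scores_neg))

    # Illustrative (made-up) serum biomarker levels: cases vs. controls
    cases = [8.1, 9.4, 6.0, 10.2, 6.9]
    controls = [4.0, 5.2, 6.8, 3.9, 5.5]
    print(auroc(cases, controls))   # → 0.96
    ```

    An AUROC of 0.5 means no discrimination and 1.0 means perfect separation, which is why the paper's 0.98 for C1M/VICM versus healthy controls is striking.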

  17. Tying Profit to Performance: A Valuable Tool, But Use With Good Judgment

    Science.gov (United States)

    2015-06-01

    profitability on behalf of its shareholders and/or owners—that’s capitalism. Our job is to protect the interests of the taxpayers and the warfighter while...business, then it was not good. Profit is the fundamental reason that businesses exist: to make money for their owners or shareholders. Without...performance of the service, or we may only be interested in controlling cost at a set level of performance. As we emphasized in BBP 2.0, we have to start

  18. The mixed dentition pantomogram: a valuable dental development assessment tool for the dentist.

    Science.gov (United States)

    Hudson, A P G; Harris, A M P; Mohamed, N

    2009-11-01

    The mixed dentition pantomogram is routinely used in paediatric patients. This paper discusses the value of the pantomogram for early identification of problems in dental development during the mixed dentition stage. Aspects regarding dental maturity, leeway space, the sequence of eruption of the permanent teeth, anomalies and the development of the canines will be reviewed.

  19. Template mediated protein self-assembly as a valuable tool in regenerative therapy.

    Science.gov (United States)

    Kundu, B; Eltohamy, M; Yadavalli, V K; Reis, R L; Kim, H W

    2018-04-11

    The assembly of natural proteinaceous biopolymers into macro-scale architectures is of great importance in synthetic biology, soft-material science and regenerative therapy. The self-assembly of proteins tends to be limited due to anisotropic interactions among protein molecules, poor solubility and stability. Here, we introduce a unique platform to self-immobilize diverse proteins (fibrous and globular, positively and negatively charged, low and high molecular weight) using silicon surfaces with pendant -NH2 groups via a facile one-step diffusion-limited aggregation (DLA) method. All the experimental proteins (type I collagen, bovine serum albumin and cytochrome c) self-assemble into seaweed-like branched dendritic architectures via classical DLA in the absence of any electrolytes. The notable differences in branching architectures are due to dissimilarities in the protein colloidal sub-units, which are typical for each protein type, along with the heterogeneous distribution of surface -NH2 groups. Fractal analysis of the assembled structures is used to explain the underlying route of fractal deposition, which shows how proteins with different functionality can yield similar assemblies. Further, the nano-micro-structured surfaces can be used to provide functional topographical cues to study cellular responses, as demonstrated using rat bone marrow stem cells. The results indicate that immobilization of proteins via DLA does not affect functionality; instead, the deposits serve as topographical cues to guide cell morphology. This indicates a promising design strategy at the tissue-material interface and is anticipated to guide future surface modifications. A cost-effective standard templating strategy is therefore proposed for fundamental and applied particle-aggregation studies, which can be used at multiple length scales for biomaterial design and surface reformation.
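    The "classical DLA" invoked above is a well-known growth model: particles random-walk until they touch a growing cluster and stick, producing the branched dendritic morphologies the record describes. A toy on-lattice sketch of that process (a conceptual analogue only, not the paper's experimental protocol; grid size and particle count are arbitrary):

    ```python
    import random

    def simulate_dla(n_particles=40, size=41, seed=1):
        """Toy on-lattice diffusion-limited aggregation: walkers launched at
        the grid border random-walk until they touch the cluster and stick."""
        random.seed(seed)
        c = size // 2
        stuck = {(c, c)}                                  # seed site at the center
        moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
        for _ in range(n_particles):
            while True:
                x = random.randrange(size)
                y = random.choice([0, size - 1])
                if random.random() < 0.5:                 # half start on left/right edge
                    x, y = y, x
                while 0 <= x < size and 0 <= y < size:
                    if any((x + dx, y + dy) in stuck for dx, dy in moves):
                        stuck.add((x, y))                 # touched the cluster: stick
                        break
                    dx, dy = random.choice(moves)
                    x, y = x + dx, y + dy
                else:
                    continue                              # walker left the grid; relaunch
                break
        return stuck

    cluster = simulate_dla()
    ```

    Even this minimal model yields sparse, branched clusters rather than compact blobs, because walkers are far more likely to stick to protruding tips than to penetrate fjords, which is the mechanism behind the fractal architectures analysed in the paper.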

  20. Is sdLDL a valuable screening tool for cardiovascular disease in ...

    African Journals Online (AJOL)

    Radwa Momtaz Abdelsamie Zaki Khalil

    Lipoprotein Cholesterol; LDL I, large buoyant LDL; LDL II, intermediate density LDL; LDL III, smaller dense LDL; .... triglycerides ≥150 mg, high density lipoprotein (HDL) <40 mg/dl in men ... sion of phenotype B.4,12 For a given triglyceride level, women were .... that sdLDL/LDL ratio is a very strong predictor of CHD in men;.

  1. The Graphics Tablet--A Valuable Tool for the Digital STEM Teacher

    Science.gov (United States)

    Stephens, Jeff

    2018-01-01

    I am inspired to write this article after coming across some publications in "The Physics Teacher" that all hit on topics of personal interest and experience. Similarly to Christensen, my goal in writing this is to encourage other physics educators to take advantage of modern technology in delivering content to students and to feel…

  2. Umbilical cord blood lactate: a valuable tool in the assessment of fetal metabolic acidosis

    DEFF Research Database (Denmark)

    Gjerris, Anne Cathrine Roslev; Staer-Jensen, Jette; Jørgensen, Jan Stener

    2008-01-01

    The aim of the present study was (1) to evaluate the relationship between umbilical cord arterial blood lactate and pH, standard base excess (SBE), and actual base excess (ABE) at delivery and (2) to suggest a cut-off level of umbilical cord arterial blood lactate in predicting fetal asphyxia using ROC-curves, where an ABE value less than -12 was used as "gold standard" for significant intrapartum asphyxia.
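    One common way to derive a cut-off from a ROC analysis like the one described is to maximize Youden's J (sensitivity + specificity − 1) over candidate thresholds. A small sketch with invented lactate values and labels (not the study's data; the real cut-off must come from the paper itself):

    ```python
    def best_cutoff(values, labels):
        """Pick the threshold maximizing Youden's J = sensitivity + specificity - 1.
        labels: 1 = asphyxia (e.g. ABE < -12), 0 = normal; higher value = more abnormal."""
        pos = [v for v, l in zip(values, labels) if l == 1]
        neg = [v for v, l in zip(values, labels) if l == 0]
        best_t, best_j = None, -1.0
        for t in sorted(set(values)):
            sens = sum(v >= t for v in pos) / len(pos)    # true-positive rate at t
            spec = sum(v < t for v in neg) / len(neg)     # true-negative rate at t
            j = sens + spec - 1
            if j > best_j:
                best_t, best_j = t, j
        return best_t, best_j

    # toy lactate values (mmol/L) and asphyxia labels
    lact = [2.1, 3.0, 4.5, 8.2, 9.1, 7.7, 2.8, 3.3, 10.0, 6.9]
    lab  = [0,   0,   0,   1,   1,   1,   0,   0,   1,    1]
    t, j = best_cutoff(lact, lab)
    ```

    Youden's J is only one of several cut-off criteria; in a clinical setting the cost asymmetry between missed asphyxia and false alarms usually argues for weighting sensitivity more heavily.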

  3. Umbilical cord blood lactate: a valuable tool in the assessment of fetal metabolic acidosis

    DEFF Research Database (Denmark)

    Gjerris, A.C.; Staer-Jensen, J.; Jorgensen, J.S.

    2008-01-01

    OBJECTIVE: The aim of the present study was (1) to evaluate the relationship between umbilical cord arterial blood lactate and pH, standard base excess (SBE), and actual base excess (ABE) at delivery and (2) to suggest a cut-off level of umbilical cord arterial blood lactate in predicting fetal asphyxia using ROC-curves, where an ABE value less than -12 was used as "gold standard" for significant intrapartum asphyxia. STUDY DESIGN: This is a descriptive study of umbilical cord arterial blood samples from 2554 singleton deliveries. The deliveries took place at the Department of Obstetrics

  4. Measured parental height in Turner syndrome-a valuable but underused diagnostic tool.

    Science.gov (United States)

    Ouarezki, Yasmine; Cizmecioglu, Filiz Mine; Mansour, Chourouk; Jones, Jeremy Huw; Gault, Emma Jane; Mason, Avril; Donaldson, Malcolm D C

    2018-02-01

    Early diagnosis of Turner syndrome (TS) is necessary to facilitate appropriate management, including growth promotion. Not all girls with TS have overt short stature, and comparison with parental height (Ht) is needed for appropriate evaluation. We examined both the prevalence and diagnostic sensitivity of measured parental Ht in a dedicated TS clinic between 1989 and 2013. The lower end of the parental target range (LTR) was calculated as mid-parental Ht (correction factor 12.5 cm) minus 8.5 cm and converted to standard deviation scores (SDS) using UK 1990 data, then compared with patient Ht SDS at first accurate measurement aged > 1 year. Information was available in 172 girls, of whom 142 (82.6%) were short at first measurement. However, both parents had been measured in only 94 girls (54.6%). In 92 of these girls, age at measurement was 6.93 ± 3.9 years, and Ht SDS vs LTR SDS was -2.63 ± 0.94 vs -1.77 ± 0.81. What is Known: • Girls with Turner syndrome are short in relation to parental heights, with untreated final height approximately 20 cm below the female population mean. • Measured parental height is more accurate than reported height. What is New: • In a dedicated Turner clinic, there was 85% sensitivity when comparing patient height standard deviation score at first accurate measurement beyond 1 year of age with the lower end of the parental target range standard deviation. • However, measured height in both parents had been recorded in only 54.6% of the Turner girls attending the clinic. This indicates the need to improve the quality of growth assessment in tertiary care.
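    The LTR calculation in this record is simple arithmetic and can be sketched directly. One assumption is flagged: the abstract does not state exactly where the 12.5 cm girl correction enters, so the sketch follows the common convention of subtracting it from the summed parental heights before halving; the reference mean/SD arguments stand in for the UK 1990 growth data, which are not reproduced here.

    ```python
    def midparental_height_girl(father_cm, mother_cm, correction_cm=12.5):
        """Mid-parental height for a girl (assumed convention: subtract the
        12.5 cm correction from the summed parental heights, then halve)."""
        return (father_cm + mother_cm - correction_cm) / 2.0

    def lower_target_range(father_cm, mother_cm, margin_cm=8.5):
        """Lower end of the parental target range: mid-parental height - 8.5 cm."""
        return midparental_height_girl(father_cm, mother_cm) - margin_cm

    def to_sds(height_cm, ref_mean_cm, ref_sd_cm):
        """Standard deviation score against age-specific reference data
        (UK 1990 in the paper; mean/SD here are caller-supplied placeholders)."""
        return (height_cm - ref_mean_cm) / ref_sd_cm

    ltr = lower_target_range(178.0, 164.0)   # → 156.25 cm for these example parents
    ```

    A girl whose measured Ht SDS falls below the LTR SDS is short *for her family* even if she is within the population's normal range, which is exactly the case the paper argues is missed when parents are not measured.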

  5. Assessing Speech Intelligibility in Children with Hearing Loss: Toward Revitalizing a Valuable Clinical Tool

    Science.gov (United States)

    Ertmer, David J.

    2011-01-01

    Background: Newborn hearing screening, early intervention programs, and advancements in cochlear implant and hearing aid technology have greatly increased opportunities for children with hearing loss to become intelligible talkers. Optimizing speech intelligibility requires that progress be monitored closely. Although direct assessment of…

  6. Robots provide valuable tools for waste processing at Millstone Nuclear Power Station

    International Nuclear Information System (INIS)

    Miles, K.; Volpe, K.

    1997-01-01

    The Millstone nuclear power station has begun an aggressive program to use robotics, which when properly used minimizes operating costs and exposure to personnel. This article describes several new ways of using existing robotic equipment to speed up work processes and provide solutions to difficult problems. The moisture separator pit and liquid radwaste are discussed

  7. Implementation of the SFRA method as valuable tool for detection of power transformer active part deformation

    Directory of Open Access Journals (Sweden)

    Milić Saša D.

    2014-01-01

    The paper presents the SFRA (Sweep Frequency Response Analysis) method for analyzing the frequency response of transformer windings in order to identify potential defects in the geometry of the core and windings. The most frequent problems recognized by SFRA are core shift, shorted or open windings, unwanted contact between core and ground, etc. A comparative analysis of this method with conventional methods was carried out on in-situ transformers under harsh industrial conditions. The benefits of the SFRA method are great reliability and repeatability of the measurements. The method belongs to the non-invasive category. Due to its high reliability and repeatability, it is very suitable for detecting changes in the geometry of the coil and the core during prophylactic field testing, or after transporting the transformer.
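    In practice, SFRA repeatability is judged by comparing a new sweep against the unit's fingerprint trace, commonly via a correlation coefficient over the magnitude response in each frequency band (band limits and pass thresholds vary between guidelines and are not taken from this paper; the traces below are invented):

    ```python
    import math

    def correlation(a, b):
        """Pearson correlation between two magnitude traces (dB) sampled on the
        same frequency grid; values near 1.0 suggest unchanged winding geometry."""
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        va = sum((x - ma) ** 2 for x in a)
        vb = sum((y - mb) ** 2 for y in b)
        return cov / math.sqrt(va * vb)

    reference = [-10.0, -12.5, -20.0, -35.0, -28.0, -15.0]   # fingerprint sweep
    followup  = [-10.1, -12.4, -19.8, -35.5, -27.6, -15.2]   # post-transport sweep
    print(correlation(reference, followup))
    ```

    A correlation that drops noticeably in a particular band is what points the analyst toward the corresponding fault class (e.g., low-frequency deviations toward core problems, higher-frequency deviations toward winding deformation).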

  8. Electronic health records: a valuable tool for dental school strategic planning.

    Science.gov (United States)

    Filker, Phyllis J; Cook, Nicole; Kodish-Stav, Jodi

    2013-05-01

    The objective of this study was to investigate whether electronic patient records have utility in dental school strategic planning. Electronic health records (EHRs) have been used by all predoctoral students and faculty members at Nova Southeastern University's College of Dental Medicine (NSU-CDM) since 2006. The study analyzed patient demographic and caries risk assessment data from October 2006 to May 2011 extracted from the axiUm EHR database. The purpose was to determine whether there was a relationship between high oral health care needs and patient demographics, including gender, age, and median income of the zip code of residence, in order to support dental school strategic planning, including the locations of future satellite clinics. The results showed that about 51 percent of patients serviced by the Broward County-based NSU-CDM oral health care facilities have high oral health care needs and that about 60 percent of this population resides in zip codes where the average income is below the median income for the county ($41,691). The results suggest that EHR data can be used adjunctively by dental schools when proposing potential sites for satellite clinics and planning future oral health care programming.

  9. PCR methodology as a valuable tool for identification of endodontic pathogens.

    Science.gov (United States)

    Siqueira, José F; Rôças, Isabela N

    2003-07-01

    This paper reviews the principles of polymerase chain reaction (PCR) methodology, its application in identification of endodontic pathogens and the perspectives regarding the knowledge to be reached with the use of this highly sensitive, specific and accurate methodology as a microbial identification test. Studies published in the medical, dental and biological literature. Evaluation of published epidemiological studies examining the endodontic microbiota through PCR methodology. PCR technology has enabled the detection of bacterial species that are difficult or even impossible to culture as well as cultivable bacterial strains showing a phenotypically divergent or convergent behaviour. Moreover, PCR is more rapid, much more sensitive, and more accurate when compared with culture. Its use in endodontics to investigate the microbiota associated with infected root canals has expanded the knowledge on the bacteria involved in the pathogenesis of periradicular diseases. For instance, Tannerella forsythensis (formerly Bacteroides forsythus), Treponema denticola, other Treponema species, Dialister pneumosintes, and Prevotella tannerae were detected in infected root canals for the first time and in high prevalence when using PCR analysis. The diversity of endodontic microbiota has been demonstrated by studies using PCR amplification, cloning and sequencing of the PCR products. Moreover, other fastidious bacterial species, such as Porphyromonas endodontalis, Porphyromonas gingivalis and some Eubacterium spp., have been reported in endodontic infections at a higher prevalence than those reported by culture procedures.

  10. Analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-15

    This book gives explanations of analytical chemistry in ten chapters, which deal with the development of analytical chemistry; the theory of error, with definition and classification; sample treatment and gravimetry, covering the general process of gravimetry in aqueous and non-aqueous solution; precipitation titration, covering precipitation reactions and their types; complexometry, with a summary and complex compounds; oxidation-reduction equilibrium, covering electrode potential and potentiometric titration; solvent extraction; chromatography; and experiments, with basic operations for chemical experiments.

  11. Analytical chemistry

    International Nuclear Information System (INIS)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-01

    This book gives explanations of analytical chemistry in ten chapters, which deal with the development of analytical chemistry; the theory of error, with definition and classification; sample treatment and gravimetry, covering the general process of gravimetry in aqueous and non-aqueous solution; precipitation titration, covering precipitation reactions and their types; complexometry, with a summary and complex compounds; oxidation-reduction equilibrium, covering electrode potential and potentiometric titration; solvent extraction; chromatography; and experiments, with basic operations for chemical experiments.

  12. High-throughput characterization of sediment organic matter by pyrolysis-gas chromatography/mass spectrometry and multivariate curve resolution: A promising analytical tool in (paleo)limnology.

    Science.gov (United States)

    Tolu, Julie; Gerber, Lorenz; Boily, Jean-François; Bindler, Richard

    2015-06-23

    Molecular-level chemical information about organic matter (OM) in sediments helps to establish the sources of OM and the prevalent degradation/diagenetic processes, both essential for understanding the cycling of carbon (C) and of the elements associated with OM (toxic trace metals and nutrients) in lake ecosystems. Ideally, analytical methods for characterizing OM should allow high sample throughput, consume small amounts of sample and yield relevant chemical information, which are essential for multidisciplinary, high-temporal resolution and/or large spatial scale investigations. We have developed a high-throughput analytical method based on pyrolysis-gas chromatography/mass spectrometry and automated data processing to characterize sedimentary OM in sediments. Our method consumes 200 μg of freeze-dried and ground sediment sample. Pyrolysis was performed at 450°C, which was found to avoid degradation of specific biomarkers (e.g., lignin compounds, fresh carbohydrates/cellulose) compared to 650°C, which is in the range of temperatures commonly applied for environmental samples. The optimization was conducted using the top ten sediment samples of an annually resolved sediment record (containing 16-18% and 1.3-1.9% of total carbon and nitrogen, respectively). Several hundred pyrolytic compound peaks were detected of which over 200 were identified, which represent different classes of organic compounds (i.e., n-alkanes, n-alkenes, 2-ketones, carboxylic acids, carbohydrates, proteins, other N compounds, (methoxy)phenols, (poly)aromatics, chlorophyll and steroids/hopanoids). Technical reproducibility measured as relative standard deviation of the identified peaks in triplicate analyses was 5.5±4.3%, with 90% of the RSD values within 10% and 98% within 15%. Finally, a multivariate calibration model was calculated between the pyrolytic degradation compounds and the sediment depth (i.e., sediment age), which is a function of degradation processes and changes in OM
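    The technical reproducibility figure quoted above (RSD of identified peaks across triplicate analyses) is the sample standard deviation divided by the mean, expressed as a percentage. A minimal sketch with invented peak areas (not the study's measurements):

    ```python
    def rsd_percent(replicates):
        """Relative standard deviation: sample SD / mean * 100."""
        n = len(replicates)
        mean = sum(replicates) / n
        var = sum((x - mean) ** 2 for x in replicates) / (n - 1)
        return (var ** 0.5) / mean * 100.0

    # triplicate peak areas for one pyrolysis product (made-up numbers)
    print(rsd_percent([1050.0, 1000.0, 950.0]))   # → 5.0
    ```

    Computing this per identified peak and then summarizing (here 5.5 ± 4.3% over 200+ compounds) is what justifies treating the pyrolysis fingerprint as quantitative rather than merely qualitative.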

  13. Valuable metals - recovery processes, current trends, and recycling strategies

    Energy Technology Data Exchange (ETDEWEB)

    Froehlich, Peter; Lorenz, Tom; Martin, Gunther; Brett, Beate; Bertau, Martin [Institut fuer Technische Chemie, TU Bergakademie Freiberg, Leipziger Strasse 29, 09599, Freiberg (Germany)

    2017-03-01

    This Review provides an overview of valuable metals, the supply of which has been classified as critical for Europe. Starting with a description of the current state of the art, novel approaches for their recovery from primary resources are presented as well as recycling processes. The focus lies on developments since 2005. Chemistry strategies which are used in metal recovery are summarized on the basis of the individual types of deposit and mineral. In addition, the economic importance as well as utilization of the metals is outlined. (copyright 2017 Wiley-VCH Verlag GmbH and Co. KGaA, Weinheim)

  14. Extraction of toxic and valuable metals from foundry sands

    International Nuclear Information System (INIS)

    Vite T, J.

    1996-01-01

    Valuable metals such as gold, platinum, silver, cobalt, germanium, nickel and zinc, among others, were extracted from foundry sands, as well as highly toxic metals such as chromium, lead, vanadium and arsenic. The extraction efficiency was up to 100% in some cases. Two United States patents were therefore obtained: patent number 5,356,601 (October 1994) for the developed process, and patent number 5,376,000 (December 1994) for the equipment employed. The preliminary parameters for the installation of a pilot plant have also been developed. (Author)

  15. Analytic number theory

    CERN Document Server

    Iwaniec, Henryk

    2004-01-01

    Analytic Number Theory distinguishes itself by the variety of tools it uses to establish results, many of which belong to the mainstream of arithmetic. One of the main attractions of analytic number theory is the vast diversity of concepts and methods it includes. The main goal of the book is to show the scope of the theory, both in classical and modern directions, and to exhibit its wealth and prospects, its beautiful theorems and powerful techniques. The book is written with graduate students in mind, and the authors tried to balance between clarity, completeness, and generality. The exercis

  16. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for quantitation of Benazepril alone and in combination with Amlodipine.

    Science.gov (United States)

    Farouk, M; Elaziz, Omar Abd; Tawakkol, Shereen M; Hemdan, A; Shehata, Mostafa A

    2014-04-05

    Four simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the determination of Benazepril (BENZ) alone and in combination with Amlodipine (AML) in a pharmaceutical dosage form. The first method is pH-induced difference spectrophotometry, where BENZ can be measured in the presence of AML as it shows maximum absorption at 237 nm and 241 nm in 0.1 N HCl and 0.1 N NaOH, respectively, while AML shows no wavelength shift in either solvent. The second method is the new Extended Ratio Subtraction Method (EXRSM) coupled to the Ratio Subtraction Method (RSM) for determination of both drugs in a commercial dosage form. The third and fourth methods are multivariate calibration methods: Principal Component Regression (PCR) and Partial Least Squares (PLS). A detailed validation of the methods was performed following the ICH guidelines, and the standard curves were found to be linear in the range of 2–30 μg/mL for BENZ in the difference and extended ratio subtraction spectrophotometric methods, and 5–30 μg/mL for AML in the EXRSM method, with a well-accepted mean correlation coefficient for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits. Copyright © 2013 Elsevier B.V. All rights reserved.
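    Of the chemometric methods named, Principal Component Regression is straightforward to sketch: mean-center the calibration spectra, project onto the leading principal components, and regress concentrations on the scores. A toy example on simulated two-component mixtures (all numbers are synthetic; the published method was of course built and validated on measured spectra):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Simulated calibration set: 20 mixtures x 50 wavelengths, 2 analytes
    C = rng.uniform(2, 30, size=(20, 2))                 # concentrations
    S = np.abs(rng.normal(size=(2, 50)))                 # pure-component spectra
    A = C @ S + rng.normal(scale=0.01, size=(20, 50))    # absorbances + noise
    y = C[:, 0]                                          # calibrate for analyte 1

    # --- Principal Component Regression ---
    A_mean, y_mean = A.mean(axis=0), y.mean()
    Ac, yc = A - A_mean, y - y_mean
    U, s, Vt = np.linalg.svd(Ac, full_matrices=False)
    k = 2                                                # retained components
    T = U[:, :k] * s[:k]                                 # PC scores
    coef = np.linalg.lstsq(T, yc, rcond=None)[0]         # regress y on the scores
    b = Vt[:k].T @ coef                                  # regression vector in wavelength space

    pred = (A - A_mean) @ b + y_mean
    rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
    ```

    The number of retained components k is the key tuning choice: too few and real spectral variation is discarded; too many and the model fits noise, which is why validation-set error (as in the ICH-style validation above) guides the selection.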

  17. Headspace solid-phase microextraction (HS-SPME) combined with GC-MS as a process analytical technology (PAT) tool for monitoring the cultivation of C. tetani.

    Science.gov (United States)

    Ghader, Masoud; Shokoufi, Nader; Es-Haghi, Ali; Kargosha, Kazem

    2018-04-15

    Vaccine production is a biological process in which variation in time and output is inevitable; thus, the application of Process Analytical Technologies (PAT) is important in this regard. Headspace solid-phase microextraction (HS-SPME) coupled with GC-MS can be used as a PAT for process monitoring. This method is suitable for chemical profiling of volatile organic compounds (VOCs) emitted from microorganisms. Tetanus is a lethal disease caused by the Clostridium tetani (C. tetani) bacterium, and vaccination is the ultimate way to prevent this disease. In this paper, an SPME fiber was used for the investigation of VOCs emerging from C. tetani during cultivation. Different types of VOCs, such as sulfur-containing compounds, were identified, and some of them were selected as biomarkers for bioreactor monitoring during vaccine production. In the second step, a portable dynamic air sampling (PDAS) device was used as an interface for sampling VOCs with SPME fibers. The sampling procedure was optimized by a face-centered central composite design (FC-CCD). The optimized sampling time and inlet gas flow rate were 10 min and 2 mL/s, respectively. The PDAS was mounted in the exhaust gas line of the bioreactor, and 42 samples of VOCs were collected on SPME fibers over 7 days of incubation. Simultaneously, pH and optical density (OD) were measured during the cultivation process and showed good correlations with the identified VOCs (>80%). This method could be used for VOC sampling from the off-gas of a bioreactor for monitoring of the cultivation process. Copyright © 2018. Published by Elsevier B.V.

  18. SWOT-AHP as an inclusive analytical tool of the forest-wood-energy chain: the case study of the Sarntal (South Tyrol)

    Directory of Open Access Journals (Sweden)

    Nikodinoska N

    2015-12-01

    Full Text Available In recent years, the use of forest biomass for energy purposes has been steadily increasing to tackle energy security issues and to mitigate climate change by stabilizing greenhouse gas (GHG) concentrations in the atmosphere. In Italy, the new National Energy Strategy established that renewable energy must cover 20% of gross energy demand by 2020, and forest biomass could be of fundamental importance to achieving this objective. In this context of increasing extraction of wood residues from forests, two key aspects are relevant: (1) the involvement of stakeholders in the strategy for the valorization of the forest-wood-energy chain at the local level; and (2) the potential environmental impacts of increased forest biomass extraction. This paper analyses these two aspects through stakeholders' opinions in a case study in Alto Adige (Sarentino valley). Stakeholders' opinions concerning the SWOT categories (strengths, weaknesses, opportunities, threats) of the bioenergy supply chain were investigated using the SWOT-AHP (Analytical Hierarchy Process) approach. The results show that the local stakeholders emphasize some strengths (e.g., additional income over time for private forest owners) and opportunities (e.g., development of shared forest management strategies among small forest owners) of the forest-wood-energy chain, and consider the weaknesses and threats less relevant. The results concerning one of the most important potential threats, environmental impact, show that all groups of stakeholders (public administrations, associations and NGOs, research bodies and universities, and actors in the rural sector) consider the impacts of increased forest biomass extraction positive for recreational activities and negative for three other ecosystem services (carbon sequestration, hydrogeological protection, and biodiversity).
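
    The AHP half of a SWOT-AHP analysis reduces pairwise comparison judgments to priority weights. A minimal sketch using the row geometric-mean approximation follows; the comparison matrix is hypothetical, not the study's data:

```python
import math

def ahp_weights(M):
    """Approximate AHP priority weights of a pairwise comparison matrix
    using the row geometric-mean method."""
    gms = [math.prod(row) ** (1 / len(row)) for row in M]
    total = sum(gms)
    return [g / total for g in gms]

# Hypothetical 4x4 comparison of the SWOT categories (S, W, O, T) on
# Saaty's 1-9 scale; lower-triangle entries are reciprocals of the upper.
M = [
    [1,     3,     2,     5],
    [1 / 3, 1,     1 / 2, 2],
    [1 / 2, 2,     1,     3],
    [1 / 5, 1 / 2, 1 / 3, 1],
]
weights = ahp_weights(M)   # here strengths receive the largest weight
```

    The resulting weights sum to one and rank the SWOT categories, mirroring the study's finding that strengths and opportunities dominated.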

  19. Analytical mechanics

    CERN Document Server

    Lemos, Nivaldo A

    2018-01-01

    Analytical mechanics is the foundation of many areas of theoretical physics including quantum theory and statistical mechanics, and has wide-ranging applications in engineering and celestial mechanics. This introduction to the basic principles and methods of analytical mechanics covers Lagrangian and Hamiltonian dynamics, rigid bodies, small oscillations, canonical transformations and Hamilton–Jacobi theory. This fully up-to-date textbook includes detailed mathematical appendices and addresses a number of advanced topics, some of them of a geometric or topological character. These include Bertrand's theorem, proof that action is least, spontaneous symmetry breakdown, constrained Hamiltonian systems, non-integrability criteria, KAM theory, classical field theory, Lyapunov functions, geometric phases and Poisson manifolds. Providing worked examples, end-of-chapter problems, and discussion of ongoing research in the field, it is suitable for advanced undergraduate students and graduate students studying analyt...

  20. Analytical quadrics

    CERN Document Server

    Spain, Barry; Ulam, S; Stark, M

    1960-01-01

    Analytical Quadrics focuses on the analytical geometry of three dimensions. The book first discusses the theory of the plane, sphere, cone, cylinder, straight line, and central quadrics in their standard forms. The idea of the plane at infinity is introduced through the homogenous Cartesian coordinates and applied to the nature of the intersection of three planes and to the circular sections of quadrics. The text also focuses on paraboloid, including polar properties, center of a section, axes of plane section, and generators of hyperbolic paraboloid. The book also touches on homogenous coordi

  1. Parasites as valuable stock markers for fisheries in Australasia, East Asia and the Pacific Islands.

    Science.gov (United States)

    Lester, R J G; Moore, B R

    2015-01-01

    Over 30 studies in the Australasia, East Asia and Pacific Islands region have collected and analysed parasite data to determine the ranges of individual fish, many leading to conclusions about stock delineation. Parasites used as biological tags have included both those known to have long residence times in the fish and those thought to be relatively transient. In many cases the parasitological conclusions have been supported by other methods, especially analysis of the chemical constituents of otoliths and, to a lesser extent, genetic data. In analysing parasite data, authors have applied a range of statistical methodologies, including summary statistics and univariate and multivariate approaches. Recently, a growing number of researchers have found non-parametric methods, such as analysis of similarities and cluster analysis, to be valuable. Future studies into the residence times, life cycles and geographical distributions of parasites, together with more robust analytical methods, will yield much important information to clarify stock structures in the area.
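
    Multivariate comparisons of parasite communities of the kind mentioned above (e.g., preceding an analysis of similarities or a cluster analysis) typically start from a dissimilarity measure such as Bray-Curtis. A minimal sketch with hypothetical parasite counts:

```python
def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two parasite abundance vectors
    (0 = identical communities, 1 = no shared abundance)."""
    num = sum(abs(x - y) for x, y in zip(a, b))
    den = sum(a) + sum(b)
    return num / den if den else 0.0

# Hypothetical mean counts of four parasite taxa per fish at three sites.
site_a = [12, 3, 0, 5]
site_b = [10, 4, 1, 6]
site_c = [0, 15, 9, 0]
d_ab = bray_curtis(site_a, site_b)   # similar communities -> small value
d_ac = bray_curtis(site_a, site_c)   # divergent communities -> large value
```

    A matrix of such pairwise dissimilarities is the usual input to the clustering and ANOSIM procedures the abstract refers to; sites whose fish share parasite faunas (low dissimilarity) are candidates for a common stock.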

  2. Paper Microzone Plates as Analytical Tools for Studying Enzyme Stability: A Case Study on the Stabilization of Horseradish Peroxidase Using Trehalose and SU-8 Epoxy Novolac Resin.

    Science.gov (United States)

    Ganaja, Kirsten A; Chaplan, Cory A; Zhang, Jingyi; Martinez, Nathaniel W; Martinez, Andres W

    2017-05-16

    Paper microzone plates in combination with a noncontact liquid-handling robot were demonstrated as tools for studying the stability of enzymes stored on paper. The effect of trehalose and SU-8 epoxy novolac resin (SU-8) on the stability of horseradish peroxidase (HRP) was studied in both a short-term experiment, where the activity of various concentrations of HRP dried on paper was measured after 1 h, and a long-term experiment, where the activity of a single concentration of HRP dried and stored on paper was monitored for 61 days. SU-8 was found to stabilize HRP up to 35 times more than trehalose in the short-term experiment for comparable concentrations of the two reagents, and a 1% SU-8 solution was found to stabilize HRP approximately 2 times more than a 34% trehalose solution in both the short- and long-term experiments. The results suggest that SU-8 is a promising candidate for use as an enzyme-stabilizing reagent in paper-based diagnostic devices, and that the short-term experiment could be used to quickly evaluate the capacity of various reagents for stabilizing enzymes in order to identify and characterize new enzyme-stabilizing reagents.

  3. Animals as an indicator of carbon sequestration and valuable landscapes

    Directory of Open Access Journals (Sweden)

    Jan Szyszko

    2011-05-01

    Full Text Available Possibilities of assessing a landscape with the use of succession development stages, monitored with the value of the Mean Individual Biomass (MIB) of carabid beetles and the occurrence of bird species, are discussed on the basis of an example from Poland. Higher variability of the MIB value in space signifies greater biodiversity. Apart from the variability of MIB, it is suggested to adopt the occurrence of the following animals as indicators (in order of importance) of underlying valuable landscapes: black stork, lesser spotted eagle, white-tailed eagle, wolf, crane and white stork. A higher number of these species and their greater density indicate a higher value of the landscape for biodiversity and ecosystem services, especially carbon sequestration. All these indicators may be useful to assess measures for sustainable land use.

  4. Metagenomes provide valuable comparative information on soil microeukaryotes

    DEFF Research Database (Denmark)

    Jacquiod, Samuel Jehan Auguste; Stenbæk, Jonas; Santos, Susana

    2016-01-01

    … has been identified …, providing microbiologists with substantial amounts of accessible information. We took advantage of public metagenomes in order to investigate microeukaryote communities in a well characterized grassland soil. The data gathered allowed the evaluation of several factors impacting the community structure, including the DNA extraction method, the database choice and also the annotation procedure. While most studies on soil microeukaryotes are based on sequencing of PCR-amplified taxonomic markers (18S rRNA genes, ITS regions), this work represents, to our knowledge, the first report based solely … Our analyses suggest that publicly available metagenome data can provide valuable information on soil microeukaryotes for comparative purposes when handled appropriately, complementing the current view provided by ribosomal amplicon sequencing methods.

  5. CORRELATION LINKS BETWEEN SOME ECONOMICALLY VALUABLE SIGNS IN BROCCOLI

    Directory of Open Access Journals (Sweden)

    E. A. Zablotskaya

    2018-01-01

    Full Text Available Studying the correlations between traits and the informativeness of the indicators makes it possible to conduct a preliminary assessment of plants and to identify forms with high economically valuable characteristics more objectively. Their integrated assessment will identify the best source material for further selection. In the literature, reports on the correlation in broccoli between yield and its components are inconsistent. The purpose of our study was to analyze the contingency of various traits and to identify significant correlations between quantitative traits in 42 broccoli hybrids obtained using doubled haploid (DH) lines of early maturity at two planting dates (spring and summer). Studies were conducted in the Odintsovo district of the Moscow region in field experiments in 2015 and 2016. Growth and development were significantly influenced by the weather conditions during the growing period; humidity and temperature conditions differed considerably between the years of study and the planting dates, which is an important circumstance for analyzing the data obtained. Based on the results of the research, it was concluded that the value of the correlation coefficient and the strength of the correlation between the characteristics (mass, diameter, head height, plant height, vegetation period) differ and depend on the set of test specimens and the growing conditions. A significant, stable positive correlation between head diameter and head mass (r = 0.45-0.96) was revealed across all years of research and planting dates. The correlations among the other economically valuable traits were variable.

  6. Quality system implementation for nuclear analytical techniques

    International Nuclear Information System (INIS)

    2004-01-01

    The international effort (UNIDO, ILAC, BIPM, etc.) to establish a functional infrastructure for metrology and accreditation in many developing countries needs to be complemented by assistance to implement high-quality practices and high-quality output by service providers and producers in the respective countries. Knowledge of how to approach QA systems that justify formal accreditation is available in only a few countries, and the dissemination of know-how and development of skills is needed bottom-up, from the working level of laboratories and institutes. Awareness building, convincing of management, introduction of good management practices, technical expertise and good documentation will lead to the creation of a quality culture that assures sustainability and the inherent development of quality practices as a prerequisite of economic success. Quality assurance and quality control can be used as a valuable management tool and are a prerequisite for international trade and information exchange. This publication aims to assist quality managers, laboratory managers and staff involved in setting up a QA/QC system in a nuclear analytical laboratory to take appropriate action to start and complete the necessary steps towards a successful quality system and ultimate national accreditation. This guidebook contributes to a better understanding of the basic ideas behind ISO/IEC 17025, the international standard for 'General requirements for the competence of testing and calibration laboratories'. It provides basic information and detailed explanation about the establishment of the QC system in analytical and nuclear analytical laboratories. It is suitable material for training trainers and familiarizes managers with QC management and implementation. This training material aims to facilitate the implementation of internationally accepted quality principles and to promote attempts by Member States' laboratories to obtain accreditation for nuclear analytical

  7. Number series of atoms, interatomic bonds and interface bonds defining zinc-blende nanocrystals as function of size, shape and surface orientation: Analytic tools to interpret solid state spectroscopy data

    Directory of Open Access Journals (Sweden)

    Dirk König

    2016-08-01

    Full Text Available Semiconductor nanocrystals (NCs) experience stress and charge transfer by embedding materials or ligands and impurity atoms. In return, the environment of NCs experiences a NC stress response which may lead to matrix deformation and propagated strain. Up to now, there is no universal gauge to evaluate the stress impact on NCs and their response as a function of NC size dNC. I deduce geometrical number series as analytical tools to obtain the number of NC atoms NNC(dNC[i]), bonds between NC atoms Nbnd(dNC[i]) and interface bonds NIF(dNC[i]) for seven high symmetry zinc-blende (zb) NCs with low-index faceting: {001} cubes, {111} octahedra, {110} dodecahedra, {001}-{111} pyramids, {111} tetrahedra, {111}-{001} quatrodecahedra and {001}-{111} quadrodecahedra. The fundamental insights into NC structures revealed here allow for major advancements in data interpretation and understanding of zb- and diamond-lattice based nanomaterials. The analytical number series can serve as a standard procedure for stress evaluation in solid state spectroscopy due to their deterministic nature, easy use and general applicability over a wide range of spectroscopy methods as well as NC sizes, forms and materials.

  8. Number series of atoms, interatomic bonds and interface bonds defining zinc-blende nanocrystals as function of size, shape and surface orientation: Analytic tools to interpret solid state spectroscopy data

    Energy Technology Data Exchange (ETDEWEB)

    König, Dirk, E-mail: dirk.koenig@unsw.edu.au [Integrated Materials Design Centre (IMDC) and School of Photovoltaic and Renewable Energy Engineering (SPREE), University of New South Wales, Sydney (Australia)

    2016-08-15

    Semiconductor nanocrystals (NCs) experience stress and charge transfer by embedding materials or ligands and impurity atoms. In return, the environment of NCs experiences a NC stress response which may lead to matrix deformation and propagated strain. Up to now, there is no universal gauge to evaluate the stress impact on NCs and their response as a function of NC size d{sub NC}. I deduce geometrical number series as analytical tools to obtain the number of NC atoms N{sub NC}(d{sub NC}[i]), bonds between NC atoms N{sub bnd}(d{sub NC}[i]) and interface bonds N{sub IF}(d{sub NC}[i]) for seven high symmetry zinc-blende (zb) NCs with low-index faceting: {001} cubes, {111} octahedra, {110} dodecahedra, {001}-{111} pyramids, {111} tetrahedra, {111}-{001} quatrodecahedra and {001}-{111} quadrodecahedra. The fundamental insights into NC structures revealed here allow for major advancements in data interpretation and understanding of zb- and diamond-lattice based nanomaterials. The analytical number series can serve as a standard procedure for stress evaluation in solid state spectroscopy due to their deterministic nature, easy use and general applicability over a wide range of spectroscopy methods as well as NC sizes, forms and materials.
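
    The paper's closed-form number series are not reproduced in the abstract, but the atom count for the {001} cube case can be checked independently by brute-force enumeration of zinc-blende lattice sites in an n x n x n block of conventional cells (a sketch for cross-checking, not the paper's method):

```python
def zb_atoms_in_cube(n):
    """Brute-force count of zinc-blende lattice sites whose coordinates
    (in units of the lattice constant) fall inside an n x n x n block of
    conventional cells, boundary atoms included."""
    fcc = [(0, 0, 0), (0, 0.5, 0.5), (0.5, 0, 0.5), (0.5, 0.5, 0)]
    atoms = set()
    for i in range(n + 1):
        for j in range(n + 1):
            for k in range(n + 1):
                for a, b, c in fcc:
                    # two interpenetrating fcc sublattices, offset (1/4, 1/4, 1/4)
                    for d in (0, 0.25):
                        x, y, z = i + a + d, j + b + d, k + c + d
                        if x <= n and y <= n and z <= n:
                            atoms.add((x, y, z))
    return len(atoms)
```

    For a single conventional cell (n = 1) this yields the familiar 18 sites: 8 corners, 6 face centres and 4 interior atoms of the second sublattice.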

  9. Schedule Analytics

    Science.gov (United States)

    2016-04-30

    Warfare, Naval Sea Systems Command; Acquisition Cycle Time: Defining the Problem (David Tate, Institute for Defense Analyses); Schedule Analytics (Jennifer…). The research comprised the following high-level steps: 1. identify and review primary data sources… However, detailed reviews of the OMB IT Dashboard data revealed that schedule data is highly aggregated. Program start date and program end date

  10. Near-infrared Spectroscopy as a Process Analytical Technology Tool for Monitoring the Parching Process of Traditional Chinese Medicine Based on Two Kinds of Chemical Indicators.

    Science.gov (United States)

    Li, Kaiyue; Wang, Weiying; Liu, Yanping; Jiang, Su; Huang, Guo; Ye, Liming

    2017-01-01

    …5-(hydroxymethyl) furfural contents and 420 nm absorbance as reference values, respectively, which were the main indicator components during the parching process of most TCMs. The established NIR models of the three TCMs had low root mean square errors of prediction and high correlation coefficients. The NIR method has great promise for use in TCM industrial manufacturing processes for rapid online analysis and quality control. Abbreviations used: NIR: Near-infrared spectroscopy; TCM: Traditional Chinese medicine; Areca: Areca catechu L.; Hawthorn: Crataegus pinnatifida Bge.; Malt: Hordeum vulgare L.; 5-HMF: 5-(hydroxymethyl) furfural; PLS: Partial least squares; D: Dimension faction; SLS: Straight line subtraction; MSC: Multiplicative scatter correction; VN: Vector normalization; RMSECV: Root mean square error of cross-validation; RMSEP: Root mean square error of prediction; Rcal: Correlation coefficient; RPD: Residual predictive deviation; PAT: Process analytical technology; FDA: Food and Drug Administration; ICH: International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use.
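
    The quoted figures of merit (RMSEP, RPD) are simple to compute from predicted and reference values. A sketch with hypothetical 5-HMF contents, since the abstract reports no numeric values to reuse:

```python
def rmse(pred, ref):
    """Root mean square error between predicted and reference values."""
    n = len(pred)
    return (sum((p - r) ** 2 for p, r in zip(pred, ref)) / n) ** 0.5

def rpd(pred, ref):
    """Residual predictive deviation: SD of the reference values divided
    by the RMSE of prediction (higher = more useful model)."""
    n = len(ref)
    m = sum(ref) / n
    sd = (sum((r - m) ** 2 for r in ref) / (n - 1)) ** 0.5
    return sd / rmse(pred, ref)

ref = [1.2, 2.4, 3.1, 4.0, 5.3]    # hypothetical reference 5-HMF contents
pred = [1.3, 2.3, 3.2, 3.9, 5.4]   # hypothetical NIR predictions
```

    A low RMSEP relative to the spread of the reference values gives a high RPD, which is the usual criterion for judging an NIR calibration fit for process monitoring.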

  11. Analytical chemistry of actinides

    International Nuclear Information System (INIS)

    Chollet, H.; Marty, P.

    2001-01-01

    Different characterization methods specifically applied to the actinides are presented in this review, such as ICP/OES (inductively coupled plasma optical emission spectrometry), ICP/MS (inductively coupled plasma mass spectrometry), TIMS (thermal ionization mass spectrometry) and GD/OES (glow discharge optical emission spectrometry). Molecular absorption spectrometry and capillary electrophoresis are also available to complete the excellent range of analytical tools at our disposal. (authors)

  12. Energy-economy models and energy efficiency policy evaluation for the household sector. An analysis of modelling tools and analytical approaches

    Energy Technology Data Exchange (ETDEWEB)

    Mundaca, Luis; Neij, Lena

    2009-10-15

    -economy models, empirical literature shows that a larger variety of determinants need to be taken into account when analysing the process of adoption of efficient technologies. We then focus on the analysis of more than twenty case studies addressing the application of the reviewed modelling methodologies to the field of residential energy efficiency policy. Regarding policy instruments being evaluated, the majority of the cases focus on regulatory aspects, such as minimum performance standards and building codes. For the rest, evaluations focus on economically-driven policy instruments. The dominance of economic and engineering determinants for technology choice gives little room for the representation of informative policy instruments. In all cases, policy instruments are represented through technical factors and costs of measures for energy efficiency improvements. In addition, policy instruments tend to be modelled in an idealistic or oversimplified manner. The traditional but narrow single-criterion evaluation approach based on cost-effectiveness seems to dominate the limited number of evaluation studies. However, this criterion is inappropriate to comprehensively address the attributes of policy instruments and the institutional and market conditions in which they work. We then turn to identifying research areas that have the potential to further advance modelling tools. We first discuss modelling issues as such, including the importance of transparent modelling efforts; the explicit elaboration of methodologies to represent policies; the need to better translate modelling results into a set of concrete policy recommendations; and the use of complementary research methods to better comprehend the broad effects and attributes of policy instruments. Secondly, we approach techno-economic and environmental components of models. 
We discuss the integration of co-benefits as a key research element of modelling studies; the introduction of transaction costs to further improve the

  13. Analogies in Medicine: Valuable for Learning, Reasoning, Remembering and Naming

    Science.gov (United States)

    Pena, Gil Patrus; Andrade-Filho, Jose de Souza

    2010-01-01

    Analogies are important tools in human reasoning and learning, for resolving problems and providing arguments, and are extensively used in medicine. Analogy and similarity involve a structural alignment or mapping between domains. This cognitive mechanism can be used to make inferences and learn new abstractions. Through analogies, we try to…

  14. Investigation of a valuable biochemical indicator in radiotherapy. Pt. 2

    International Nuclear Information System (INIS)

    Horvath, M.; Geszti, O.; Benedek, E.; Farkas, J.; Reischl, G.

    1979-01-01

    The hemoglobin level of blood plasma is a sensitive indicator of radiation effects. Its value increased in most cases during radiotherapy of cancer patients, even at low doses. Haptoglobins behave similarly; the time-dependent changes during radiotherapy are almost identical for both parameters. (orig.) [de

  15. Technologies for Extracting Valuable Metals and Compounds from Geothermal Fluids

    Energy Technology Data Exchange (ETDEWEB)

    Harrison, Stephen [SIMBOL Materials

    2014-04-30

    Executive Summary: Simbol Materials studied various methods of extracting valuable minerals from geothermal brines in the Imperial Valley of California, focusing on the extraction of lithium, manganese, zinc and potassium. New methods were explored for managing the potential impact of silica fouling on mineral extraction equipment, and for converting silica management by-products into commercial products. Studies at the laboratory and bench scale focused on manganese, zinc and potassium extraction and the conversion of silica management by-products into valuable commercial products. The processes for extracting lithium and producing lithium carbonate and lithium hydroxide products were developed at the laboratory scale and scaled up to pilot scale. Several sorbents designed to extract lithium as lithium chloride from geothermal brine were developed at the laboratory scale and subsequently scaled up for testing in the lithium extraction pilot plant. Lithium: The results of the lithium studies generated the confidence for Simbol to scale its process to commercial operation. The key steps of the process were demonstrated during its development at pilot scale: 1. silica management; 2. lithium extraction; 3. purification; 4. concentration; 5. conversion into lithium hydroxide and lithium carbonate products. Results show that greater than 95% of the lithium can be extracted from geothermal brine as lithium chloride, and that the chemical yield in converting lithium chloride to lithium hydroxide and lithium carbonate products is greater than 90%. The product purity produced from the process is consistent with battery-grade lithium carbonate and lithium hydroxide. Manganese and zinc: Processes for the extraction of zinc and manganese from geothermal brine were developed. It was shown that they could be converted into zinc metal and electrolytic manganese dioxide after purification. These processes were evaluated for their economic potential, and at the present time Simbol

  16. Analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jae Seong

    1993-02-15

    This book comprises nineteen chapters covering an introduction to analytical chemistry, experimental error and statistics, chemical equilibrium and solubility, gravimetric analysis and the mechanism of precipitation, range and calculation of results, general principles of volumetric analysis, sedimentation methods with their types and titration curves, acid-base equilibria, acid-base titration curves, complexation reactions, an introduction to electroanalytical chemistry, electrodes and potentiometry, electrolysis and conductometry, voltammetry and polarography, spectrophotometry, atomic spectrometry, solvent extraction, chromatography, and experiments.

  17. Analytical chemistry

    International Nuclear Information System (INIS)

    Choi, Jae Seong

    1993-02-01

    This book comprises nineteen chapters covering an introduction to analytical chemistry, experimental error and statistics, chemical equilibrium and solubility, gravimetric analysis and the mechanism of precipitation, range and calculation of results, general principles of volumetric analysis, sedimentation methods with their types and titration curves, acid-base equilibria, acid-base titration curves, complexation reactions, an introduction to electroanalytical chemistry, electrodes and potentiometry, electrolysis and conductometry, voltammetry and polarography, spectrophotometry, atomic spectrometry, solvent extraction, chromatography, and experiments.

  18. Analytical chemistry

    International Nuclear Information System (INIS)

    Anon.

    1985-01-01

    The Division for Analytical Chemistry continued its efforts to develop an accurate method for the separation of trace amounts from mixtures which contain various other elements. Ion exchange chromatography is of special importance in this regard. New separation techniques were tried on certain trace amounts in South African standard rock materials and special ceramics. Methods were also tested for the separation of carrier-free radioisotopes from irradiated cyclotron discs

  19. Conversion of waste polystyrene through catalytic degradation into valuable products

    Energy Technology Data Exchange (ETDEWEB)

    Shah, Jasmin; Jan, Muhammad Rasul; Adnan [University of Peshawar, Peshawar (Pakistan)

    2014-08-15

    Waste expanded polystyrene (EPS) represents a source of valuable chemical products such as styrene and other aromatics. The catalytic degradation was carried out in a batch reactor with a mixture of polystyrene (PS) and catalyst at 450 .deg. C for 30 min in the case of Mg, and at 400 .deg. C for 2 h for both the MgO and MgCO{sub 3} catalysts. Under optimum degradation conditions, EPS was degraded into 82.20±3.80 wt%, 91.60±0.20 wt% and 81.80±0.53 wt% liquid with the Mg, MgO and MgCO{sub 3} catalysts, respectively. The liquid products obtained were separated into different fractions by fractional distillation. The liquid fractions obtained with the three catalysts were compared and characterized using GC-MS. Maximum conversion of EPS into styrene monomer (66.6 wt%) was achieved with the Mg catalyst, and an increase in the selectivity of compounds was also observed. The major fraction at 145 .deg. C showed the properties of styrene monomer. The results showed that, among the catalysts used, Mg was the most effective catalyst for selective conversion into styrene monomer as a value-added product.

  20. GC Analyses of Salvia Seeds as Valuable Essential Oil Source

    Directory of Open Access Journals (Sweden)

    Mouna Ben Taârit

    2014-01-01

    Full Text Available The essential oils of seeds of Salvia verbenaca, Salvia officinalis, and Salvia sclarea were obtained by hydrodistillation and analyzed by gas chromatography (GC) and GC-mass spectrometry. The oil yields (w/w) were 0.050, 0.047, and 0.045% in S. verbenaca, S. sclarea, and S. officinalis, respectively. Seventy-five compounds were identified. The essential oil composition of S. verbenaca seeds showed that over 57% of the detected compounds were oxygenated monoterpenes, followed by sesquiterpenes (24.04%) and labdane-type diterpenes (5.61%). The main essential oil constituents were camphor (38.94%), caryophyllene oxide (7.28%), and 13-epi-manool (5.61%), while those of the essential oil of S. officinalis were α-thujone (14.77%), camphor (13.08%), and 1,8-cineole (6.66%). In samples of S. sclarea, the essential oil consists mainly of linalool (24.25%), α-thujene (7.48%), linalyl acetate (6.90%), germacrene-D (5.88%), bicyclogermacrene (4.29%), and α-copaene (4.08%). This variability leads to a large range of naturally occurring volatile compounds with valuable industrial and pharmaceutical outlets.

  1. Sea Buckthorn Oil—A Valuable Source for Cosmeceuticals

    Directory of Open Access Journals (Sweden)

    Marijana Koskovac

    2017-10-01

    Full Text Available Sea buckthorn (Hippophae rhamnoides L., Elaeagnaceae) is a thorny shrub that has small, yellow to dark orange, soft, juicy berries. Due to their hydrophilic and lipophilic ingredients, the berries have been used as food and medicine. Sea buckthorn (SB) oil derived from the berries is a source of valuable ingredients for cosmeceuticals. The unique combination of SB oil ingredients, in both qualitative and quantitative terms, provides multiple benefits of SB oil for internal and external use. Externally, SB oil can be applied to both healthy and damaged skin (burns or skin damage of different etiologies), as it has good wound-healing properties. Due to the well-balanced content of fatty acids, carotenoids, and vitamins, SB oil may be incorporated in cosmeceuticals for dry, flaky, burned, irritated, or rapidly ageing skin. More than 100 ingredients have been identified in SB oil, some of which are rare in the plant kingdom (e.g., the ratio of palmitoleic to γ-linolenic acid). This review discusses facts related to the origin and properties of SB oil that make it suitable for cosmeceutical formulation.

  2. VALUABLE AND ORIENTATION FOUNDATIONS OF EDUCATIONAL SYSTEM OF THE COUNTRY

    Directory of Open Access Journals (Sweden)

    Vladimir I. Zagvyazinsky

    2016-01-01

    Full Text Available The aim of the investigation is to show that, in modern market conditions, it is necessary to preserve the humanistic value orientations of domestic education and not allow it to slip toward utilitarian, quickly achievable, but not long-term benefits. Theoretical significance. The author emphasizes the value of forming an ideal, the harmonious development of the personality, and of collectivist principles for disclosing the potential of each school student, university student, worker and specialist; the author also stresses the need for stimulating, rather than strictly regulated, management of education. It is argued that copying the Western model of consecutive individualization of education without preserving the collectivist principle is unacceptable in training, and especially in the educational process. In a more general, strategic perspective, this means resolving the problems of the economy and the social sphere in parallel, which is impossible without support for educational development and, above all, for education itself; this is especially important during periods of economic crisis and stagnation, in order to provide an exit from the crisis on the basis of advanced training and the rational use of personnel, who are rightly considered human capital. Practical significance. Resources and positive tendencies in the development of education, especially elite education, are shown, as well as the educational systems of some territories, including the Tyumen region, where the traditions of the pioneering enthusiasts who opened up the remote territories of the oil and gas fields remain.

  3. Rabeto plus: a valuable drug for managing functional dyspepsia.

    Science.gov (United States)

    Ghosh, Asim; Halder, Susanta; Mandal, Sanjoy; Mandal, Arpan; Basu, Mitali; Dabholkar, Pareen

    2008-11-01

    The aim of the study was to evaluate and document the efficacy and tolerability of rabeto plus (FDC of rabeprazole and itopride) in the management of functional dyspepsia. It was an open, prospective, non-comparative, multidose study. The patients with functional dyspepsia (NERD or non-erosive reflux disease) attending the OPD of a leading tertiary care teaching hospital in West Bengal (BS Medical College, Bankura) were inducted into the study. A total of 46 adult patients of either sex with functional dyspepsia and a clinical diagnosis of NERD were given 1 capsule of rabeto plus before breakfast, for up to 4 weeks. Primary efficacy variables were relief from symptoms of heartburn, nausea, vomiting, waterbrash and fullness. Secondary efficacy variables were global assessment of efficacy and toleration by patients and treating physicians. Tolerability was assessed on the basis of spontaneously reported adverse events, with their nature, intensity and outcome recorded. Out of 55 patients enrolled in the study, 46 completed the study as planned, while 9 patients were lost to follow-up (dropped). Most patients reported near-total symptom relief by the end of the study. Total symptom score showed remarkable and significant improvement from baseline to the end of the study. Importantly, none of the patients reported any side-effect. All participants tolerated the drug well. Moreover, response to the study drug was rated as excellent or good by over 93% of patients and their treating physicians. This means that 9 out of 10 patients receiving rabeto plus reported the desired symptom relief from dyspepsia. Thus it was concluded that rabeto plus is a valuable drug for the treatment of functional dyspepsia or NERD.

  4. Whey-derived valuable products obtained by microbial fermentation.

    Science.gov (United States)

    Pescuma, Micaela; de Valdez, Graciela Font; Mozzi, Fernanda

    2015-08-01

    Whey, the main by-product of the cheese industry, is considered as an important pollutant due to its high chemical and biological oxygen demand. Whey, often considered as waste, has high nutritional value and can be used to obtain value-added products, although some of them need expensive enzymatic synthesis. An economical alternative to transform whey into valuable products is through bacterial or yeast fermentations and by accumulation during algae growth. Fermentative processes can be applied either to produce individual compounds or to formulate new foods and beverages. In the first case, a considerable amount of research has been directed to obtain biofuels able to replace those derived from petrol. In addition, the possibility of replacing petrol-derived plastics by biodegradable polymers synthesized during bacterial fermentation of whey has been sought. Further, the ability of different organisms to produce metabolites commonly used in the food and pharmaceutical industries (i.e., lactic acid, lactobionic acid, polysaccharides, etc.) using whey as growth substrate has been studied. On the other hand, new low-cost functional whey-based foods and beverages leveraging the high nutritional quality of whey have been formulated, highlighting the health-promoting effects of fermented whey-derived products. This review aims to gather the multiple uses of whey as sustainable raw material for the production of individual compounds, foods, and beverages by microbial fermentation. This is the first work to give an overview on the microbial transformation of whey as raw material into a large repertoire of industrially relevant foods and products.

  5. Recycled Cell Phones - A Treasure Trove of Valuable Metals

    Science.gov (United States)

    Sullivan, Daniel E.

    2006-01-01

    This U.S. Geological Survey (USGS) Fact Sheet examines the potential value of recycling the metals found in obsolete cell phones. Cell phones seem ubiquitous in the United States and commonplace throughout most of the world. There were approximately 1 billion cell phones in use worldwide in 2002. In the United States, the number of cell phone subscribers increased from 340,000 in 1985 to 180 million in 2004. Worldwide, cell phone sales have increased from slightly more than 100 million units per year in 1997 to an estimated 779 million units per year in 2005. Cell phone sales are projected to exceed 1 billion units per year in 2009, with an estimated 2.6 billion cell phones in use by the end of that year. The U.S. Environmental Protection Agency estimated that, by 2005, as many as 130 million cell phones would be retired annually in the United States. The nonprofit organization INFORM, Inc., anticipated that, by 2005, a total of 500 million obsolete cell phones would have accumulated in consumers' desk drawers, store rooms, or other storage, awaiting disposal. Typically, cell phones are used for only 1 1/2 years before being replaced. Less than 1 percent of the millions of cell phones retired and discarded annually are recycled. When large numbers of cell phones become obsolete, large quantities of valuable metals end up either in storage or in landfills. The amount of metals potentially recoverable would make a significant addition to total metals recovered from recycling in the United States and would supplement virgin metals derived from mining.

  6. Ground subsidence information as a valuable layer in GIS analysis

    Science.gov (United States)

    Murdzek, Radosław; Malik, Hubert; Leśniak, Andrzej

    2018-04-01

    Among the technologies used to improve the functioning of local governments, geographic information systems (GIS) are widely used. GIS tools make it possible to integrate spatial data resources, analyse and process them, and use them to make strategic decisions. Nowadays GIS analysis is widely used in spatial planning and environmental protection. These applications draw on many kinds of spatial information, but rarely on information about environmental hazards. This paper incorporates information about ground subsidence that occurred in the USCB (Upper Silesian Coal Basin) mining area into GIS analysis. Monitoring of this phenomenon can be carried out using the radar differential interferometry (DInSAR) method.

  7. Ground subsidence information as a valuable layer in GIS analysis

    Directory of Open Access Journals (Sweden)

    Murdzek Radosław

    2018-01-01

    Full Text Available Among the technologies used to improve the functioning of local governments, geographic information systems (GIS) are widely used. GIS tools make it possible to integrate spatial data resources, analyse and process them, and use them to make strategic decisions. Nowadays GIS analysis is widely used in spatial planning and environmental protection. These applications draw on many kinds of spatial information, but rarely on information about environmental hazards. This paper incorporates information about ground subsidence that occurred in the USCB (Upper Silesian Coal Basin) mining area into GIS analysis. Monitoring of this phenomenon can be carried out using the radar differential interferometry (DInSAR) method.
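
    The analysis these two records describe, folding a hazard layer such as DInSAR-derived subsidence into a multi-criteria GIS overlay, reduces to a cell-wise weighted sum of normalized rasters. A minimal sketch, assuming plain Python lists stand in for GIS rasters; the layer names, values and weights below are invented for illustration, not taken from the paper:

```python
def weighted_overlay(layers, weights):
    """Cell-wise weighted sum of equally shaped rasters (lists of rows).

    A toy version of the multi-criteria overlay a GIS performs. Layer
    values are assumed pre-normalized to 0..1; a subsidence hazard layer
    must be inverted first (1.0 = stable ground) so high is always good.
    """
    rows, cols = len(layers[0]), len(layers[0][0])
    out = [[0.0] * cols for _ in range(rows)]
    for layer, w in zip(layers, weights):
        for r in range(rows):
            for c in range(cols):
                out[r][c] += w * layer[r][c]
    return out

# Hypothetical 2x2 rasters: land-use suitability and ground stability
# (the latter imagined as derived from DInSAR subsidence rates).
landuse   = [[0.8, 0.4], [0.6, 1.0]]
stability = [[1.0, 0.2], [0.9, 0.5]]
score = weighted_overlay([landuse, stability], [0.6, 0.4])
```

    Cells with strong subsidence (low stability) are penalized in the combined score even where land-use suitability is high, which is exactly the value of adding the hazard layer.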

  8. Predictive analytics can support the ACO model.

    Science.gov (United States)

    Bradley, Paul

    2012-04-01

    Predictive analytics can be used to rapidly spot hard-to-identify opportunities to better manage care--a key tool in accountable care. When considering analytics models, healthcare providers should: Make value-based care a priority and act on information from analytics models. Create a road map that includes achievable steps, rather than major endeavors. Set long-term expectations and recognize that the effectiveness of an analytics program takes time, unlike revenue cycle initiatives that may show a quick return.

  9. FUMAC-84. A hybrid PCI analytical tool

    International Nuclear Information System (INIS)

    Matheson, J.E.; Walton, L.A.

    1984-01-01

    ''FUMAC-84'', a new computer code currently under development at Babcock and Wilcox, will be used to analyze PCMI in light water reactor fuel rods. This is a hybrid code in the sense that the pellet behaviour is predicted from deterministic models which incorporate the large data base being generated by the international fuel performance programs (OVERRAMP, SUPER-RAMP, NFIR, etc.), while the cladding is modelled using finite elements. The fuel cracking and relocation model developed for FUMAC is semi-empirical and includes data up to 35 GWd/mtU and linear heat rates ranging from 100 to 700 W/cm. With this model the onset of cladding ridging has been accurately predicted for steady-state operation. Transient behaviour of the pellet is still under investigation and the model is being enhanced to include these effects. The cladding model integrates the mechanical damage over a power history by solving the finite element assumed-displacement problem in a quasi-static manner. Early work on FUMAC-84 has been directed at the development and benchmarking of the interim code. The purpose of the interim code is to provide a vehicle to prove out the deterministic pellet models which have been developed. To date the cracking model and the relocation model have been benchmarked. The thermal model for the pellet was developed by fitting data from several Halden experiments. The ability to accurately predict cladding ridging behaviour has been used to test how well the pellet swelling, densification and compliance models work in conjunction with fuel cladding material models. Reasonable results have been achieved for the steady-state cases, while difficulty has been encountered in trying to reproduce transient results. Current work includes an effort to improve the ability of the models to handle transients well. (author)

  10. Business intelligence guidebook from data integration to analytics

    CERN Document Server

    Sherman, Rick

    2015-01-01

    Between the high-level concepts of business intelligence and the nitty-gritty instructions for using vendors’ tools lies the essential, yet poorly-understood layer of architecture, design and process. Without this knowledge, Big Data is belittled – projects flounder, are late and go over budget. Business Intelligence Guidebook: From Data Integration to Analytics shines a bright light on an often neglected topic, arming you with the knowledge you need to design rock-solid business intelligence and data integration processes. Practicing consultant and adjunct BI professor Rick Sherman takes the guesswork out of creating systems that are cost-effective, reusable and essential for transforming raw data into valuable information for business decision-makers. After reading this book, you will be able to design the overall architecture for functioning business intelligence systems with the supporting data warehousing and data-integration applications. You will have the information you need to get a project laun...

  11. Analytical mechanics

    CERN Document Server

    Helrich, Carl S

    2017-01-01

    This advanced undergraduate textbook begins with the Lagrangian formulation of Analytical Mechanics and then passes directly to the Hamiltonian formulation and the canonical equations, with constraints incorporated through Lagrange multipliers. Hamilton's Principle and the canonical equations remain the basis of the remainder of the text. Topics considered for applications include small oscillations, motion in electric and magnetic fields, and rigid body dynamics. The Hamilton-Jacobi approach is developed with special attention to the canonical transformation in order to provide a smooth and logical transition into the study of complex and chaotic systems. Finally the text has a careful treatment of relativistic mechanics and the requirement of Lorentz invariance. The text is enriched with an outline of the history of mechanics, which particularly outlines the importance of the work of Euler, Lagrange, Hamilton and Jacobi. Numerous exercises with solutions support the exceptionally clear and concise treatment...

  12. Competing on analytics.

    Science.gov (United States)

    Davenport, Thomas H

    2006-01-01

    We all know the power of the killer app. It's not just a support tool; it's a strategic weapon. Companies questing for killer apps generally focus all their firepower on the one area that promises to create the greatest competitive advantage. But a new breed of organization has upped the stakes: Amazon, Harrah's, Capital One, and the Boston Red Sox have all dominated their fields by deploying industrial-strength analytics across a wide variety of activities. At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the few remaining points of differentiation--and analytics competitors wring every last drop of value from those processes. Employees hired for their expertise with numbers or trained to recognize their importance are armed with the best evidence and the best quantitative tools. As a result, they make the best decisions. In companies that compete on analytics, senior executives make it clear--from the top down--that analytics is central to strategy. Such organizations launch multiple initiatives involving complex data and statistical analysis, and quantitative activity is managed at the enterprise (not departmental) level. In this article, professor Thomas H. Davenport lays out the characteristics and practices of these statistical masters and describes some of the very substantial changes other companies must undergo in order to compete on quantitative turf. As one would expect, the transformation requires a significant investment in technology, the accumulation of massive stores of data, and the formulation of company-wide strategies for managing the data. But, at least as important, it also requires executives' vocal, unswerving commitment and willingness to change the way employees think, work, and are treated.

  13. Catalytic conversion of CO2 into valuable products

    International Nuclear Information System (INIS)

    Pham-Huu, C.; Ledoux, M.J.

    2008-01-01

    inertness the recovery of the active phase is extremely easy, i.e. by acidic or basic washing, which reduces the investment cost of the process for final spent-catalyst disposal and allows full re-use of the support. The high thermal conductivity of the SiC support could also reduce the temperature loss during the reaction, taking into account the high endothermicity of the reaction. The aim of the presentation is to report the synthesis and use of a SiC-based catalyst for CO2 reforming, which allows the conversion of CO2 into more valuable products for further fuel processing via the Fischer-Tropsch synthesis

  14. Microbial Leaching of Some Valuable Elements From Egyptian Phosphate Rock

    International Nuclear Information System (INIS)

    Kamal, H.M.; Hassanein, R.A.; Mahdy, H.M.A.; Mahmoud, K.F.; Abouzeid, M.A.

    2012-01-01

    Four phosphate rock samples representing different phosphate mineralization modes in Egypt were selected from the Abu Tartar, Nile Valley and Red Sea areas. Factors affecting the solubilization of the phosphate rock, and of some of the valuable elements it contains, by Aspergillus niger, Penicillium sp. and Pseudomonas fluorescens were studied, with special attention to the complete solubilization of the phosphate rock samples, especially the low-grade one. Regarding the effect of the nitrogen source on the leaching efficiency of Aspergillus niger, comparison of two nitrogen sources showed that ammonium chloride is more favorable than sodium nitrate for the bioleaching of phosphate rocks. Aspergillus niger was applied under the following conditions: 50 g/l of sucrose as carbon source, 0.1 N ammonium chloride as nitrogen source, a 10-day incubation period, a solid:liquid ratio of 0.5% for P2O5 and 5% for U and REE, and a grain size of -270 mesh. The optimum leaching of P2O5, U and REE from the phosphate rock samples reached 23.27%, 17.4% and 11.4%, respectively, while at -60 mesh it reached 16.58%, 28.9% and 30.2%, respectively. The optimum conditions for the maximal leaching efficiencies of P2O5, U and REE with Penicillium sp. were: 100 g/l of sucrose as carbon source for P2O5 and U and 10 g/l for REE; incubation periods of 7, 15 and 10 days for P2O5, U and REE, respectively; and a solid:liquid ratio of 0.5% for P2O5 and 5% for U and REE. Finally, the phosphate rock samples were ground to -270 mesh for P2O5 and to (-60 to -140) mesh for U and REE.
    The leaching efficiency of P2O5, U and REE at -270 mesh was 33.66%, 24.3% and 15.9%, respectively, while at -60 mesh it was 33.76%, 26.7% and 17.8%, and at -140 mesh 31.32%, 27.9% and 17.6%, respectively. The optimum conditions for the P2O5 leaching efficiency when applying Pseudomonas fluorescens were

  15. Analytic number theory an introductory course

    CERN Document Server

    Bateman, Paul T

    2004-01-01

    This valuable book focuses on a collection of powerful methods of analysis that yield deep number-theoretical estimates. Particular attention is given to counting functions of prime numbers and multiplicative arithmetic functions. Both real variable ("elementary") and complex variable ("analytic") methods are employed.

  16. Big Data Analytics for Industrial Process Control

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schioler, Henrik; Kulahci, Murat

    2017-01-01

    Today, in modern factories, each step in manufacturing produces a bulk of valuable as well as highly precise information. This provides a great opportunity for understanding the hidden statistical dependencies in the process. Systematic analysis and utilization of advanced analytical methods can...

  17. Correlated Raman micro-spectroscopy and scanning electron microscopy analyses of flame retardants in environmental samples: a micro-analytical tool for probing chemical composition, origin and spatial distribution.

    Science.gov (United States)

    Ghosal, Sutapa; Wagner, Jeff

    2013-07-07

    We present correlated application of two micro-analytical techniques: scanning electron microscopy/energy dispersive X-ray spectroscopy (SEM/EDS) and Raman micro-spectroscopy (RMS) for the non-invasive characterization and molecular identification of flame retardants (FRs) in environmental dusts and consumer products. The SEM/EDS-RMS technique offers correlated, morphological, molecular, spatial distribution and semi-quantitative elemental concentration information at the individual particle level with micrometer spatial resolution and minimal sample preparation. The presented methodology uses SEM/EDS analyses for rapid detection of particles containing FR specific elements as potential indicators of FR presence in a sample followed by correlated RMS analyses of the same particles for characterization of the FR sub-regions and surrounding matrices. The spatially resolved characterization enabled by this approach provides insights into the distributional heterogeneity as well as potential transfer and exposure mechanisms for FRs in the environment that is typically not available through traditional FR analysis. We have used this methodology to reveal a heterogeneous distribution of highly concentrated deca-BDE particles in environmental dust, sometimes in association with identifiable consumer materials. The observed coexistence of deca-BDE with consumer material in dust is strongly indicative of its release into the environment via weathering/abrasion of consumer products. Ingestion of such enriched FR particles in dust represents a potential for instantaneous exposure to high FR concentrations. Therefore, correlated SEM/RMS analysis offers a novel investigative tool for addressing an area of important environmental concern.

  18. Application of process analytical technology for monitoring freeze-drying of an amorphous protein formulation: use of complementary tools for real-time product temperature measurements and endpoint detection.

    Science.gov (United States)

    Schneid, Stefan C; Johnson, Robert E; Lewis, Lavinia M; Stärtzel, Peter; Gieseler, Henning

    2015-05-01

    Process analytical technology (PAT) and quality by design have gained importance in all areas of pharmaceutical development and manufacturing. One important method for monitoring of critical product attributes and process optimization in laboratory scale freeze-drying is manometric temperature measurement (MTM). A drawback of this innovative technology is that problems are encountered when processing high-concentrated amorphous materials, particularly protein formulations. In this study, a model solution of bovine serum albumin and sucrose was lyophilized at both conservative and aggressive primary drying conditions. Different temperature sensors were employed to monitor product temperatures. The residual moisture content at primary drying endpoints as indicated by temperature sensors and batch PAT methods was quantified from extracted sample vials. The data from temperature probes were then used to recalculate critical product parameters, and the results were compared with MTM data. The drying endpoints indicated by the temperature sensors were not suitable for endpoint indication, in contrast to the batch methods endpoints. The accuracy of MTM Pice data was found to be influenced by water reabsorption. Recalculation of Rp and Pice values based on data from temperature sensors and weighed vials was possible. Overall, extensive information about critical product parameters could be obtained using data from complementary PAT tools. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.

  19. Analytical chemistry experiment

    International Nuclear Information System (INIS)

    Park, Seung Jo; Paeng, Seong Gwan; Jang, Cheol Hyeon

    1992-08-01

    This book deals with analytical chemistry experiments in eight chapters. It explains general precautions for experiments; the storage and classification of chemicals; the handling of glass apparatus; general laboratory operations such as heating, cooling, filtering, distillation, extraction, evaporation and drying; glassworking, including how to cut and bend glass tubing; volumetric analysis, covering neutralization and precipitation titrations; gravimetric analysis, covering solubility products, filtering and washing; and microbiological experiments, with the necessary tools for sterilization, disinfection and incubation; plus appendixes.

  20. Technical session: the Atomika TXRF tool series

    International Nuclear Information System (INIS)

    Dobler, M. . URL: www.atomika.com

    2000-01-01

    ATOMIKA Instruments GmbH holds worldwide competence as a renowned producer of high-performance metrology tools and analytic devices. ATOMIKA's TXRF products are widely accepted for elemental contamination monitoring on semiconductor materials as well as in chemical analysis. More than 100 companies and institutes base their analytical work on TXRF tools made by ATOMIKA Instruments. ATOMIKA's TXRF 8300W/8200W wafer contamination monitors are the result of an evolution based on a background of 20 years of competence. Built for the semiconductor industry, the TXRF 8300W/8200W detect metal contaminants on 300 mm or 200 mm silicon wafer surfaces with the highest possible sensitivity. Operating under ambient conditions, with a sealed x-ray tube, and having their own minienvironment (FOUP or SMIF, respectively), the TXRF 8300W/8200W are optimally suited for in-line use. Fab automation (GEM/SECS) is supported by predefined measurement recipes and fully automatic routines. High throughput and uptime, an ergonomic design according to the SEMI standard, plus an unrivaled small footprint of 1.1 m2 make the TXRF 8300W/8200W the most efficient and economic solutions for industrial wafer monitoring. As the specific tool for multielement trace and thin-layer analysis, the ATOMIKA TXRF 8030C provides simultaneous and fast determination of all elements within the range from sodium to uranium. Sophisticated measurement instrumentation provides detection limits down to the ppt range. Performance is further facilitated by features such as automatic switching of primary radiation, predefined measurement recipes, and software-driven optimization of the entire measurement process. These features make the TXRF 8030C a valuable analytic tool for a wide range of applications: contamination in water, dust or sediments; quantitative screening in the chemical industry; toxic elements in tissues and biological fluids; radioactive elements; process chemicals in the semiconductor industry

  1. Search Analytics for Your Site

    CERN Document Server

    Rosenfeld, Louis

    2011-01-01

    Any organization that has a searchable web site or intranet is sitting on top of hugely valuable and usually under-exploited data: logs that capture what users are searching for, how often each query was searched, and how many results each query retrieved. Search queries are gold: they are real data that show us exactly what users are searching for in their own words. This book shows you how to use search analytics to carry on a conversation with your customers: listen to and understand their needs, and improve your content, navigation and search performance to meet those needs.
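
    The raw material this book describes, logs of what users searched, how often, and how many results each query returned, can be mined with a few lines of code. A minimal sketch, assuming a hypothetical log format of (query, result-count) pairs; real search logs add timestamps, sessions and click data:

```python
from collections import Counter

def summarize_search_log(entries):
    """Aggregate site-search log entries into per-query metrics.

    Returns a list of dicts, most-searched queries first, each with the
    search count and the fraction of searches that returned no results
    (zero-result queries are prime candidates for content fixes).
    """
    counts = Counter()
    zero_results = Counter()
    for query, num_results in entries:
        q = query.strip().lower()  # normalize casing/whitespace
        counts[q] += 1
        if num_results == 0:
            zero_results[q] += 1
    return [
        {"query": q, "searches": n, "zero_result_rate": zero_results[q] / n}
        for q, n in counts.most_common()
    ]

# Invented sample log entries for illustration.
log = [("pricing", 12), ("Pricing", 7), ("refund policy", 0),
       ("sso setup", 0), ("sso setup", 3)]
report = summarize_search_log(log)
```

    Sorting by search count surfaces what users want most in their own words, while a high zero-result rate flags gaps between that vocabulary and the site's content.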

  2. A REVIEW ON PREDICTIVE ANALYTICS IN DATA MINING

    OpenAIRE

    Arumugam.S

    2016-01-01

    The main process of data mining is to collect, extract and store valuable information, and nowadays many enterprises do this actively. Predictive analytics is the branch of advanced analytics mainly used to make predictions about unknown future events. Predictive analytics uses various techniques from machine learning, statistics, data mining, modeling, and artificial intelligence to analyze current data and make predictions about futu...

  3. Medicaid Analytic eXtract (MAX) Chartbooks

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Analytic eXtract Chartbooks are research tools and reference guides on Medicaid enrollees and their Medicaid experience in 2002 and 2004. Developed for...

  4. Google Analytics – Index of Resources

    Science.gov (United States)

    Find how-to and best practice resources and training for accessing and understanding EPA's Google Analytics (GA) tools, including how to create reports that will help you improve and maintain the web areas you manage.

  5. Exploring the Potential of Predictive Analytics and Big Data in Emergency Care.

    Science.gov (United States)

    Janke, Alexander T; Overbeek, Daniel L; Kocher, Keith E; Levy, Phillip D

    2016-02-01

    Clinical research often focuses on resource-intensive causal inference, whereas the potential of predictive analytics with constantly increasing big data sources remains largely unexplored. Basic prediction, divorced from causal inference, is much easier with big data. Emergency care may benefit from this simpler application of big data. Historically, predictive analytics have played an important role in emergency care as simple heuristics for risk stratification. These tools generally follow a standard approach: parsimonious criteria, easy computability, and independent validation with distinct populations. Simplicity in a prediction tool is valuable, but technological advances make it no longer a necessity. Emergency care could benefit from clinical predictions built using data science tools with abundant potential input variables available in electronic medical records. Patients' risks could be stratified more precisely with large pools of data and lower resource requirements for comparing each clinical encounter to those that came before it, benefiting clinical decisionmaking and health systems operations. The largest value of predictive analytics comes early in the clinical encounter, in which diagnostic and prognostic uncertainty are high and resource-committing decisions need to be made. We propose an agenda for widening the application of predictive analytics in emergency care. Throughout, we express cautious optimism because there are myriad challenges related to database infrastructure, practitioner uptake, and patient acceptance. The quality of routinely compiled clinical data will remain an important limitation. Complementing big data sources with prospective data may be necessary if predictive analytics are to achieve their full potential to improve care quality in the emergency department. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
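
    The "standard approach" the authors describe for classic emergency-care prediction tools (parsimonious criteria, easy computability at the bedside) can be illustrated with a toy additive score. The variables, thresholds and bands below are invented for illustration and are not a validated clinical rule:

```python
def risk_score(age, sbp, hr):
    """A hypothetical parsimonious risk heuristic: three binary criteria,
    one point each. Thresholds are illustrative, not clinically derived."""
    score = 0
    if age >= 65:
        score += 1
    if sbp < 100:   # systolic blood pressure, mmHg
        score += 1
    if hr > 110:    # heart rate, beats per minute
        score += 1
    return score

def risk_band(score):
    """Map the score to a coarse stratum for disposition decisions."""
    return "high" if score >= 2 else "low"
```

    The article's point is that big-data models can replace such hand-picked criteria with many electronic-record variables, trading this transparency and hand computability for finer-grained stratification.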

  6. Demonstrating Success: Web Analytics and Continuous Improvement

    Science.gov (United States)

    Loftus, Wayne

    2012-01-01

    As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…

  7. Microgenetic Learning Analytics Methods: Workshop Report

    Science.gov (United States)

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  8. Valuable Virality

    NARCIS (Netherlands)

    Akpinar, E.; Berger, Jonah

    2017-01-01

    Given recent interest in social media, many brands now create content that they hope consumers will view and share with peers. While some campaigns indeed go “viral,” their value to the brand is limited if they do not boost brand evaluation or increase purchase. Consequently, a key question is how

  9. Valuable Connections

    DEFF Research Database (Denmark)

    Kjærsgaard, Mette Gislev; Smith, Rachel Charlotte

    2014-01-01

    and blurred boundaries between physical, digital and hybrid contexts, as well as design, production and use, we might need to rethink the role of ethnography within design and business development. Perhaps the aim is less about ”getting closer” to user needs and real-life contexts, through familiarization......, mediation, advocacy and facilitation, as in conventional approaches to ethnography in user centred design, and more about creating a critical theoretically informed distance from which to perceive and reflect upon complex interconnections between people, technology, business and design, as well as our roles...

  10. Valuable Connections

    DEFF Research Database (Denmark)

    Kjærsgaard, Mette Gislev; Smith, Rachel Charlotte

    2014-01-01

    , as well as design, production and use, we might need to rethink the role of ethnography within user centred design and business development. Here the challenge is less about ”getting closer” to user needs and real-life contexts, through familiarization, mediation, and facilitation, and more about creating...... a critical theoretically informed distance from which to perceive and reflect upon complex interconnections between people, technology, business and design, as well as our roles as researchers and designers within these....

  11. Line lessons: Enbridge's Northern Line provides valuable information

    Energy Technology Data Exchange (ETDEWEB)

    Ross, E.

    2000-02-01

Experiences gained from the 14-year-old Norman Wells crude oil pipeline in the Northwest Territories may provide operators with valuable insights into natural gas pipeline developments in northern Canada. The Norman Wells line is the first and only long-distance pipeline in North America buried in permafrost, and it has proven to be a veritable laboratory on pipeline behaviour in extremely cold climates where the permafrost is discontinuous. The line was built by Enbridge with a 'limit state' design, i.e. it was built to move within the permafrost within certain limits, the amount of movement depending upon the area in which the line was built. This technology, which is still cutting edge, allows the pipeline to react to the freeze-thaw cycle without being damaged by heaving and resettling. The knowledge gained from the Norman Wells line has come in very useful in the more recent AltaGas Services project transporting natural gas from a nearby well into the town of Inuvik. Enbridge also contributed to the development of various pipeline inspection tools, such as the 'Geopig', which travels within the pipeline and can pinpoint the location of problems to within a matter of inches, and the 'Rolligon', an amphibious vehicle with five-foot-diameter rubber tires that exerts only two pounds per square inch, leaving barely a track as it travels along the right-of-way in seasons other than winter.

  12. Application of high-resolution melting analysis for authenticity testing of valuable Dendrobium commercial products.

    Science.gov (United States)

    Dong, Xiaoman; Jiang, Chao; Yuan, Yuan; Peng, Daiyin; Luo, Yuqin; Zhao, Yuyang; Huang, Luqi

    2018-01-01

The accurate identification of botanical origin in commercial products is important to ensure food authenticity and safety for consumers. The Dendrobium species have long been commercialised as functional food supplements and herbal medicines in Asia. Three valuable Dendrobium species, namely Dendrobium officinale, D. huoshanense and D. moniliforme, are often mutually adulterated in trade products in pursuit of higher profit. In this paper, a rapid and reliable semi-quantitative method for identifying the botanical origin of Dendrobium products in terminal markets was developed using high-resolution melting (HRM) analysis with specific primer pairs to target the trnL-F region. The HRM analysis method detected amounts of D. moniliforme adulterants as low as 1% in D. huoshanense or D. officinale products. The results have demonstrated that HRM analysis is a fast and effective tool for the differentiation of these Dendrobium species both for their authenticity as well as for the semi-quantitative determination of the purity of their processed products. © 2017 Society of Chemical Industry.

  13. Report on FY 1998 project for international energy utilization rationalization, etc. (Project of the analytical tool survey for making energy consumption effective in Asia); 1998 nendo chosa hokokusho. Kokusai energy shiyo gorika nado taisaku jigyo (Asia energy shohi koritsuka bunseki tool chosa jigyo)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

In the Asian region, an analytical tool with added computational functions was developed, using inter-industry (input-output) relations tables as its database, for the purpose of grasping the combined effects of economic growth and efficient energy consumption through present-state and simulation analyses. In the present-state analysis, changes between the two time points 1985 and 1990 were analyzed using the newly developed energy consumption and CO2 emission tables. Energy consumption is increasing with economic growth; however, energy intensity is decreasing in all countries surveyed except Korea, the Philippines, Singapore and Malaysia. The trend is toward more efficient energy consumption and declining CO2 emissions. As an example of simulating the effect of introducing energy-saving technology, three Indonesian industries were examined: iron/steel, paper/pulp, and cement. The simulation showed decreases in energy consumption of 48% for paper/pulp, 16% for cement, and 16% for iron/steel, with corresponding reductions in CO2 emissions. (NEDO)
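The input-output approach used in the report can be sketched in a few lines: given an inter-industry coefficient matrix, the Leontief inverse yields total (direct plus indirect) output for a given final demand, and hence the energy embodied in that demand. The 3-sector matrix and all coefficients below are invented for illustration and are not taken from the report.

```python
import numpy as np

# Hypothetical 3-sector economy (e.g. iron/steel, paper/pulp, cement);
# all numbers are illustrative, not the report's actual data.
A = np.array([            # technical (input-output) coefficient matrix
    [0.10, 0.05, 0.02],
    [0.03, 0.15, 0.01],
    [0.04, 0.02, 0.08],
])
energy_per_output = np.array([12.0, 8.0, 15.0])  # direct energy use per unit output
final_demand = np.array([100.0, 50.0, 80.0])

# Leontief inverse gives total (direct + indirect) output requirements.
L = np.linalg.inv(np.eye(3) - A)
total_output = L @ final_demand

# Energy intensity per unit of final demand, and total embodied energy.
energy_intensity = energy_per_output @ L
total_energy = energy_intensity @ final_demand

print(total_output.round(1))
print(round(float(total_energy), 1))
```

Simulating an energy-saving technology then amounts to lowering the relevant entries of `energy_per_output` (or of `A`) and recomputing the totals.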

  14. GBEP pilot Ghana. Very valuable and successful - a follow-up is suggested. Conclusions and recommendations

    Energy Technology Data Exchange (ETDEWEB)

    Hanekamp, E.; Vissers, P.; De Lint, S. [Partners for Innovation, Amsterdam (Netherlands)

    2013-02-15

The Global Bio-Energy Partnership (GBEP) has developed a set of 24 sustainability indicators applicable to all forms of bio-energy and aimed at voluntary use by national governments. The GBEP indicators enable governments to assess the bio-energy sector and to develop new policies related to sustainable bio-energy production and use. These indicators have been piloted in Ghana. Modern bio-energy is a big opportunity for the region, which is why NL Agency adopted and supported the pilot, together with the Global Bio-Energy Partnership (GBEP). The pilot project was also supported by the ECOWAS Regional Centre for Renewable Energy and Energy Efficiency (ECREEE) and was coordinated by the Council for Scientific and Industrial Research (CSIR). The Ghana Energy Commission took responsibility for involving policymakers. Partners for Innovation was commissioned by NL Agency to provide technical assistance for the pilot. The main aims of the project were: (a) enhancing the capacity of the host country Ghana (and ECOWAS) to use the GBEP indicators as a tool for assessing the sustainability of its bio-energy sector and/or developing sustainable bio-energy policies; (b) learning lessons on how to apply the indicators and how to enhance their practicality as a tool for policymakers, and feeding these back to the GBEP community. Three Ghanaian research institutes (CSIR-FORIG, CSIR-IIR and UG-ISSER) studied 11 of the 24 GBEP indicators in the pilot. The pilot has been a success: the 24 sustainability indicators appear to be very valuable for Ghana. As such, the indicators provide a practical tool, for other governments as well, to assess the sustainability of biomass sectors and policies. The report also offers important insights on data availability and quality, and on the applicability of the GBEP indicators in Ghana.
The final report provides concrete recommendations on: (1) How Ghana can proceed with the GBEP sustainability indicators; and (2) The lessons learned for

  15. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has recently been applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space are still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  16. NC CATCH: Advancing Public Health Analytics.

    Science.gov (United States)

    Studnicki, James; Fisher, John W; Eichelberger, Christopher; Bridger, Colleen; Angelon-Gaetz, Kim; Nelson, Debi

    2010-01-01

    The North Carolina Comprehensive Assessment for Tracking Community Health (NC CATCH) is a Web-based analytical system deployed to local public health units and their community partners. The system has the following characteristics: flexible, powerful online analytic processing (OLAP) interface; multiple sources of multidimensional, event-level data fully conformed to common definitions in a data warehouse structure; enabled utilization of available decision support software tools; analytic capabilities distributed and optimized locally with centralized technical infrastructure; two levels of access differentiated by the user (anonymous versus registered) and by the analytical flexibility (Community Profile versus Design Phase); and, an emphasis on user training and feedback. The ability of local public health units to engage in outcomes-based performance measurement will be influenced by continuing access to event-level data, developments in evidence-based practice for improving population health, and the application of information technology-based analytic tools and methods.

  17. Big Data Analytics for Industrial Process Control

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schioler, Henrik; Kulahci, Murat

    2017-01-01

Today, in modern factories, each step in manufacturing produces a wealth of valuable and highly precise information. This provides a great opportunity for understanding the hidden statistical dependencies in the process. Systematic analysis and utilization of advanced analytical methods can lead towards more informed decisions. In this article we discuss some of the challenges related to big data analysis in manufacturing and relevant solutions to some of these challenges.

  18. Analytical benchmarks for nuclear engineering applications. Case studies in neutron transport theory

    International Nuclear Information System (INIS)

    2008-01-01

    The developers of computer codes involving neutron transport theory for nuclear engineering applications seldom apply analytical benchmarking strategies to ensure the quality of their programs. A major reason for this is the lack of analytical benchmarks and their documentation in the literature. The few such benchmarks that do exist are difficult to locate, as they are scattered throughout the neutron transport and radiative transfer literature. The motivation for this benchmark compendium, therefore, is to gather several analytical benchmarks appropriate for nuclear engineering applications under one cover. We consider the following three subject areas: neutron slowing down and thermalization without spatial dependence, one-dimensional neutron transport in infinite and finite media, and multidimensional neutron transport in a half-space and an infinite medium. Each benchmark is briefly described, followed by a detailed derivation of the analytical solution representation. Finally, a demonstration of the evaluation of the solution representation includes qualified numerical benchmark results. All accompanying computer codes are suitable for the PC computational environment and can serve as educational tools for courses in nuclear engineering. While this benchmark compilation does not contain all possible benchmarks, by any means, it does include some of the most prominent ones and should serve as a valuable reference. (author)

  19. Let's Talk... Analytics

    Science.gov (United States)

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  20. Analytics for Education

    Science.gov (United States)

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  1. Analytical chemistry instrumentation

    International Nuclear Information System (INIS)

    Laing, W.R.

    1986-01-01

In nine sections, 48 chapters cover: 1) analytical chemistry and the environment; 2) environmental radiochemistry; 3) automated instrumentation; 4) advances in analytical mass spectrometry; 5) Fourier transform spectroscopy; 6) analytical chemistry of plutonium; 7) nuclear analytical chemistry; 8) chemometrics; and 9) nuclear fuel technology

  2. Locally analytic vectors in representations of locally

    CERN Document Server

    Emerton, Matthew J

    2017-01-01

    The goal of this memoir is to provide the foundations for the locally analytic representation theory that is required in three of the author's other papers on this topic. In the course of writing those papers the author found it useful to adopt a particular point of view on locally analytic representation theory: namely, regarding a locally analytic representation as being the inductive limit of its subspaces of analytic vectors (of various "radii of analyticity"). The author uses the analysis of these subspaces as one of the basic tools in his study of such representations. Thus in this memoir he presents a development of locally analytic representation theory built around this point of view. The author has made a deliberate effort to keep the exposition reasonably self-contained and hopes that this will be of some benefit to the reader.

  3. Nonlinear optics an analytical approach

    CERN Document Server

    Mandel, Paul

    2010-01-01

Based on the author's extensive teaching experience and lecture notes, this textbook provides a substantially analytical rather than descriptive presentation of nonlinear optics. Divided into five parts, with most chapters corresponding to a two-hour lecture, the book begins with a unique account of the historical development from Kirchhoff's law for black-body radiation to Planck's quantum hypothesis and Einstein's discovery of spontaneous emission, providing all the explicit proofs. The subsequent sections deal with matter quantization, ultrashort pulse propagation in 2-level media, cavity nonlinear optics, and chi(2) and chi(3) media. The book is intended for graduate and PhD students in nonlinear optics or photonics, and also serves as a valuable reference for researchers in these fields.

  4. Assessment of MultiLocus Sequence Analysis As a Valuable Tool for the Classification of the Genus Salinivibrio

    Directory of Open Access Journals (Sweden)

    Clara López-Hermoso

    2017-06-01

The genus Salinivibrio comprises obligately halophilic bacteria commonly isolated from hypersaline habitats and salted food products. They grow optimally between 7.5 and 10% salts and are facultative anaerobes. Currently, this genus comprises four species, one of them, S. costicola, with three subspecies. In this study we isolated and characterized an additional 70 strains from solar salterns at different locations. Comparative 16S rRNA gene sequence analysis identified these strains as belonging to the genus Salinivibrio but could not differentiate the strains into species-like groups. To achieve finer phylogenetic resolution, we carried out a MultiLocus Sequence Analysis (MLSA) of the new isolates and the type strains of the Salinivibrio species based on the individual as well as concatenated sequences of four housekeeping genes: gyrB, recA, rpoA, and rpoD. The strains formed four clearly differentiated species-like clusters, called phylogroups. All of the known type and subspecies strains were associated with one of these clusters except S. sharmensis. One phylogroup had no previously described species coupled to it. Further DNA-DNA hybridization (DDH) experiments with selected representative strains from these phylogroups permitted us to validate the MLSA study, correlating the species level defined by DDH (70%) with a 97% cut-off for the concatenated MLSA gene sequences. Based on these criteria, the novel strains forming phylogroup 1 could constitute a new species, while the strains constituting the other three phylogroups are members of previously recognized Salinivibrio species. S. costicola subsp. vallismortis co-occurs with S. proteolyticus in phylogroup 4, separately from other S. costicola strains, indicating its need for reclassification. On the other hand, genome fingerprinting analysis showed that the environmental strains do not form clonal populations and did not cluster according to their site of cultivation.
For future studies regarding the classification and identification of new Salinivibrio strains we recommend the following strategy: (i) initial partial sequencing of the 16S rRNA gene for genus-level identification; (ii) sequencing and concatenation of the four aforementioned housekeeping genes for species-level discrimination; and (iii) DDH experiments, required only when the concatenated MLSA similarity values between a new isolate and other Salinivibrio strains are above the 97% cut-off.
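The species-level step of the recommended strategy can be illustrated with a toy sketch: concatenate the four housekeeping gene sequences per strain and compare pairwise identity against the 97% cut-off. The sequences and helper functions below are hypothetical; a real analysis would operate on properly aligned MLSA sequences.

```python
# Illustrative sketch of the 97% concatenated-MLSA decision rule.
# Sequences are made up; real data would be aligned gene sequences.

def concatenate(genes):
    """Join the gyrB, recA, rpoA and rpoD sequences into one MLSA string."""
    return "".join(genes[g] for g in ("gyrB", "recA", "rpoA", "rpoD"))

def percent_identity(a, b):
    """Percentage of identical positions between two equal-length aligned sequences."""
    if len(a) != len(b):
        raise ValueError("sequences must be aligned to equal length")
    matches = sum(x == y for x, y in zip(a, b))
    return 100.0 * matches / len(a)

def same_species(a, b, cutoff=97.0):
    """Apply the 97% concatenated-MLSA cut-off described in the study."""
    return percent_identity(a, b) >= cutoff

# Two toy strains differing at a single position of rpoD.
strain1 = {"gyrB": "ATGGCTA", "recA": "TTGACC", "rpoA": "GGCATA", "rpoD": "CCGTTA"}
strain2 = {"gyrB": "ATGGCTA", "recA": "TTGACC", "rpoA": "GGCATA", "rpoD": "CCGTTT"}

s1, s2 = concatenate(strain1), concatenate(strain2)
print(percent_identity(s1, s2))  # 24 of 25 positions match -> 96.0
print(same_species(s1, s2))      # 96.0 < 97.0 -> False
```

Under this rule the two toy strains would fall just below the cut-off, which is exactly the situation in which the study recommends confirmatory DDH experiments.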

  5. Helicobacter-negative gastritis: polymerase chain reaction for Helicobacter DNA is a valuable tool to elucidate the diagnosis.

    Science.gov (United States)

    Kiss, S; Zsikla, V; Frank, A; Willi, N; Cathomas, G

    2016-04-01

Helicobacter-negative gastritis has been increasingly reported. Molecular techniques such as the polymerase chain reaction (PCR) may detect bacterial DNA in histologically negative gastritis. The aim was to evaluate Helicobacter PCR in gastric biopsies for the routine diagnosis of Helicobacter-negative gastritis. Over a 5-year period, routine biopsies with chronic gastritis reminiscent of Helicobacter infection, but negative by histology, were tested using a H. pylori-specific PCR. Subsequently, PCR-negative samples were re-evaluated using PCR for other Helicobacter species. Of the 9184 gastric biopsies, 339 (3.7%) with histologically negative gastritis and adequate material were forwarded to PCR analysis for H. pylori, and 146 (43.1%) revealed a positive result. In 193 H. pylori DNA-negative biopsies, re-analysis using PCR primers for other Helicobacter species revealed a further 23 (11.9%) positive biopsies, including 4 (2.1%) biopsies with H. heilmannii sensu lato. PCR-positive biopsies showed a higher overall inflammatory score and more lymphoid follicles/aggregates and neutrophils (P gastritis. © 2016 John Wiley & Sons Ltd.

  6. [Dobutamine stress magnetic resonance imaging (DS-MRI), a valuable tool for the diagnosis of ischemic heart disease].

    Science.gov (United States)

    van Dijkman, P R M; Kuijpers, Th J A; Blom, B M; van Herpen, G

    2002-07-13

    Assessment of the clinical applicability of DS-MRI for the detection of myocardial ischemia and myocardial viability. Prospective. In the period from 1 November 1999 to 31 October 2000, patients with suspected coronary artery disease who could not be studied by means of conventional bicycle ergometry underwent breath-hold DS-MRI (1 Tesla) 4 days after cessation of anti-ischemic medication. Three left ventricular short-axis planes were examined for the occurrence of disorders in wall movement during infusion of increasing doses of dobutamine (10, 20, 30 and 40 micrograms/kg/min). Temporary recovery of wall thickening in a previously diminished or non-contracting segment under 5 micrograms/kg/min of dobutamine was considered proof of viability. Development of hypo-, a- or dyskinesia at higher doses of dobutamine was taken to indicate ischemia. If the DS-MRI test was positive for ischemia, coronary angiography was performed. If indicated, this was followed by revascularisation. If DS-MRI did not reveal ischemia, the patient was seen at the outpatient department. Of the 100 patients (62 men and 38 women with an average age of 62 years, SD = 12) subjected to DS-MRI, 95 yielded results that were suitable for diagnosis. Of the 42 patients with DS-MRI scans that were considered positive for ischemia and in whom coronary angiography was subsequently performed, 41 had such coronary abnormalities that revascularisation was indicated. One patient was false-positive. All 53 patients with non-ischemic DS-MRI scans were followed-up for 11-23 months (mean 17 months). One patient died suddenly 2 weeks after the MRI-test. The other 52 patients did not experience any coronary events nor sudden cardiac death. The predictive value of a positive DS-MRI scan for ischemia was 98% and the predictive value of a negative DS-MRI scan was also 98%. 
DS-MRI is a safe diagnostic method for the detection or exclusion of myocardial ischemia and viability in patients with suspected coronary artery disease.

  7. What can next generation sequencing do for you? Next generation sequencing as a valuable tool in plant research

    OpenAIRE

    Bräutigam, Andrea; Gowik, Udo

    2010-01-01

    Next generation sequencing (NGS) technologies have opened fascinating opportunities for the analysis of plants with and without a sequenced genome on a genomic scale. During the last few years, NGS methods have become widely available and cost effective. They can be applied to a wide variety of biological questions, from the sequencing of complete eukaryotic genomes and transcriptomes, to the genome-scale analysis of DNA-protein interactions. In this review, we focus on the use of NGS for pla...

  8. UV-laser treatment of nanodiamond seeds-a valuable tool for modification of nanocrystalline diamond films properties

    Czech Academy of Sciences Publication Activity Database

    Vlček, J.; Fitl, P.; Vrňata, M.; Fekete, Ladislav; Taylor, Andrew; Fendrych, František

    2013-01-01

    Roč. 46, č. 3 (2013), s. 1-7 ISSN 0022-3727 R&D Projects: GA ČR(CZ) GAP108/11/1298; GA MŠk(CZ) LD11076 EU Projects: European Commission(XE) 238201 - MATCON Institutional support: RVO:68378271 Keywords : Raman-spectroscopy * ablation * carbon Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 2.521, year: 2013

  9. WEB-QUESTS IN THE ENGLISH LANGUAGE STUDYING AND TEACHING AS A VALUABLE RESOURCE AND EFFECTIVE TOOL

    Directory of Open Access Journals (Sweden)

    K. M. Pererva

    2015-05-01

Purpose. This paper studies innovative methods of learning and teaching English with the help of Internet resources, and the motivation of students to seek the necessary information in their homework. Methodology. The main principle of the Web-Quest as a form of English language teaching is to motivate students. For example, by participating in a Web-Quest, students who were unsure of their knowledge become more confident. With clear goals and objectives, and drawing on their computer skills, motivated young people act more confidently as competent users of English. Findings. Following the Web-Quest technology, students were asked to create one or more projects directly related to the successful completion of the work. The project is a significant result of the students' work, and it is the subject of evaluation. Evaluation is an essential component of a Web-Quest, or of any other project, and from this point of view the criteria should be clear and accessible to students from the very beginning. These instructions can and should be adapted in order to differentiate between, and provide for, oral presentations and written work. Originality. Web-Quests are essentially mini-projects in which a high percentage of the material is obtained from the Internet. They can be created by teachers or students, depending on the type of training work. The author details how Internet-based quests can be combined with other creative types of student work, which may include literature reviews, essay writing, discussion of assigned readings, and others. Practical value. The paper confirms that roles and tasks reflecting the real world invite cooperation and stimulate and train thinking at a higher level. That is why the use of Web-Quests can improve the language skills developed in the educational process (reading for information extraction, detailed reading, negotiation, oral and written communication, and others).

  10. FORMALIZATION OF THE ACCOUNTING VALUABLE MEMES METHOD FOR THE PORTFOLIO OF ORGANIZATION DEVELOPMENT AND INFORMATION COMPUTER TOOLS FOR ITS IMPLEMENTATION

    Directory of Open Access Journals (Sweden)

    Serhii D. Bushuiev

    2017-12-01

The current state of project management demonstrates a steady trend toward an increasing role for flexible, "soft" management practices. A method is proposed for preparing decisions on the formation of a value-oriented portfolio based on a comparison of the levels of internal organizational values. The method formalizes the methodological foundations of value-oriented portfolio management in the development of organizations in the form of approaches, basic terms and technological methods using ICT, which makes it possible to use them as an integral knowledge system for creating an automated system for managing the portfolios of organizations. The result of the study is a deepening of the theoretical provisions for managing the development of organizations through the implementation of a value-oriented portfolio of projects, which made it possible to formalize the method of accounting for value memes in the development portfolios of organizations and to disclose its logic, essence, objective basis and rules.

  11. Alvarado score: A valuable clinical tool for diagnosis of acute appendicitis – a retrospective study

    Directory of Open Access Journals (Sweden)

    Swagata Brahmachari and Ashwini B. Jajee

    2013-08-01

Appendicitis is a common surgical emergency and its diagnosis remains a great challenge. Accurate diagnosis and timely intervention reduce morbidity and mortality. The present study was conducted to evaluate the Alvarado scoring system for the diagnosis of acute appendicitis in an Indian setting. The study was carried out on 200 patients admitted to the surgery ward between January 2009 and December 2010 with right lower quadrant abdominal pain. The Alvarado score was calculated and all patients were divided into three groups. The mean age at presentation was 29.12 years and the male-to-female ratio was 1.27:1. The higher the Alvarado score, the higher the sensitivity: patients with a score of 7 or above showed a sensitivity of 66%. We conclude that the Alvarado score is unique in that it incorporates the signs, symptoms and laboratory findings of suspected patients, and it can safely be utilized for the diagnosis of acute appendicitis.

  12. UV-laser treatment of nanodiamond seeds - a valuable tool for modification of nanocrystalline diamond films properties

    International Nuclear Information System (INIS)

    Vlček, J; Fitl, P; Vrňata, M; Fekete, L; Taylor, A; Fendrych, F

    2013-01-01

This work aimed to study the UV-laser treatment of the precursor (i.e. nanodiamond (ND) seeds on silicon substrates) and its influence on the properties of the grown nanocrystalline diamond (NCD) films. A pulsed Nd:YAG laser operating at the fourth harmonic frequency (laser fluence E_L = 250 mJ cm⁻², pulse duration 5 ns) was used as the source, equipped with an optical system for focusing the laser beam onto the sample, allowing exposure of a local spot and horizontal patterning. The variable parameters were the number of pulses (from 5 to 400) and the working atmosphere (He, Ar and O₂). Ablation and/or graphitization of the seeded nanodiamond particles was observed. Microwave plasma-enhanced chemical vapour deposition was then employed to grow NCD films on exposed and non-exposed areas of the silicon substrates. The size, shape and density distribution of the laser-treated nanodiamond seeds were observed by atomic force microscopy (AFM) and their chemical composition by X-ray photoelectron spectroscopy (XPS). The resulting NCD films (uniform thickness of 400 nm) were characterized by Raman spectroscopy, to analyse the occurrence of the graphitic phase, and by AFM, to observe morphology and surface roughness. The highest RMS roughness (∼85 nm) was achieved when treating the precursor in a He atmosphere. Horizontal microstructures of diamond films were fabricated.

  13. [The 3D-printed dental splint: a valuable tool in the surgical treatment of malocclusion after polytrauma].

    Science.gov (United States)

    van de Velde, W L; Schepers, R H; van Minnen, B

    2016-01-01

A 22-year-old male was referred to the Department of Oral and Maxillofacial Surgery of a university clinic because of an anterior malocclusion, 2 months after he had sustained multiple traumatic injuries abroad. The malocclusion was the sequel of an unrecognised, untreated and already consolidated paramedian mandibular fracture on the right and a fracture of the contralateral mandibular angle on the left. Preoperatively, a cobalt-chrome 3D-printed dental splint was prepared. Surgical correction of the malocclusion was carried out by segmental osteotomies of the mandible at the original fracture sites: a vertical paramedian osteotomy on the right side and a unilateral sagittal split osteotomy at the left mandibular angle. The mandibular segment was mobilised into the correct occlusion with the aid of the 3D-printed dental splint, which was fixed to the teeth with dental composite. The custom-made 3D-printed dental splint is considered a promising procedural innovation in oral and maxillofacial surgery.

  14. Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis

    Science.gov (United States)

    Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay

    2018-01-01

    Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…

  15. A case report on inVALUABLE: insect value chain in a circular bioeconomy

    DEFF Research Database (Denmark)

    Heckmann, L.-H.; Andersen, J.L.; Eilenberg, J.

    2018-01-01

The vision of inVALUABLE is to create a sustainable, resource-efficient industry for animal production based on insects. inVALUABLE focuses on the R&D needed to scale up insect production in Denmark and on assessing the application potential of mealworms in particular. The inVALUABLE consortium partners span the entire value chain and include entrepreneurs and experts in biology, biotechnology, automation, processing, and food technology and safety. This paper provides an overview of the goal, activities and some preliminary results obtained during the first year of the project.

  16. License to evaluate: Preparing learning analytics dashboards for educational practice

    NARCIS (Netherlands)

    Jivet, Ioana; Scheffel, Maren; Specht, Marcus; Drachsler, Hendrik

    2018-01-01

    Learning analytics can bridge the gap between learning sciences and data analytics, leveraging the expertise of both fields in exploring the vast amount of data generated in online learning environments. A typical learning analytics intervention is the learning dashboard, a visualisation tool built

  17. A Progressive Approach to Teaching Analytics in the Marketing Curriculum

    Science.gov (United States)

    Liu, Yiyuan; Levin, Michael A.

    2018-01-01

    With the emerging use of analytics tools and methodologies in marketing, marketing educators have provided students training and experiences beyond the soft skills associated with understanding consumer behavior. Previous studies have only discussed how to apply analytics in course designs, tools, and related practices. However, there is a lack of…

  18. Analyticity without Differentiability

    Science.gov (United States)

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…

  19. Understanding Business Analytics

    Science.gov (United States)

    2015-01-05

    Analytics have been used in organizations for a variety of reasons for quite some time, ranging from the simple (generating and understanding business analytics…) … How well these two components are orchestrated will determine the level of success an organization has in…

  20. Tool for Collaborative Autonomy, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Over the last 25 years, UAS have proven to be very valuable tools for performing a wide range of operations such as environmental disaster relief, search and rescue...

  1. Storytelling: a leadership and educational tool.

    Science.gov (United States)

    Kowalski, Karren

    2015-06-01

    A powerful tool that leaders and educators can use to engage the listeners-both staff and learners-is storytelling. Stories demonstrate important points, valuable lessons, and the behaviors that are preferred by the leader. Copyright 2015, SLACK Incorporated.

  2. Introduction, comparison, and validation of Meta‐Essentials: A free and simple tool for meta‐analysis

    Science.gov (United States)

    van Rhee, Henk; Hak, Tony

    2017-01-01

    We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932
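The pooling method named in this record is standard and can be sketched directly. The following is an illustrative implementation (our own, not the Meta-Essentials code): a random-effects meta-analysis using the DerSimonian-Laird tau-squared estimator with the Knapp-Hartung adjustment for the confidence interval; the caller supplies the t-quantile for k-1 degrees of freedom.

```python
# Sketch of a random-effects meta-analysis with the DerSimonian-Laird
# estimator and Knapp-Hartung confidence interval (illustrative, not the
# Meta-Essentials implementation).
import math

def dl_knapp_hartung(effects, variances, t_crit):
    """Pool effect sizes; t_crit is the t-quantile for k-1 degrees of freedom."""
    k = len(effects)
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    mu_fe = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - mu_fe) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                     # DerSimonian-Laird
    ws = [1.0 / (v + tau2) for v in variances]             # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(ws, effects)) / sum(ws)
    # Knapp-Hartung: weighted residual variance, interval based on t(k-1)
    s2 = sum(wi * (yi - mu) ** 2 for wi, yi in zip(ws, effects)) / ((k - 1) * sum(ws))
    se = math.sqrt(s2)
    return mu, (mu - t_crit * se, mu + t_crit * se), tau2
```

For three studies, t_crit would be the 97.5% quantile of t with 2 degrees of freedom (about 4.303); the resulting interval is wider than the normal-theory one, which is the point of the adjustment.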

  3. Kidney Paired Donation and the "Valuable Consideration" Problem: The Experiences of Australia, Canada, and the United States.

    Science.gov (United States)

    Toews, Maeghan; Giancaspro, Mark; Richards, Bernadette; Ferrari, Paolo

    2017-09-01

    As organ donation rates remain unable to meet the needs of individuals waiting for transplants, it is necessary to identify reasons for this shortage and develop solutions to address it. The introduction of kidney paired donation (KPD) programs represents one such innovation that has become a valuable tool in donation systems around the world. Although KPD has been successful in increasing kidney donation and transplantation, there are lingering questions about its legality. Donation through KPD is done in exchange for-and with the expectation of-a reciprocal kidney donation and transplantation. It is this reciprocity that has caused concern about whether KPD complies with existing law. Organ donation systems around the world are almost universally structured to legally prohibit the commercial exchange of organs. Australia, Canada, and the United States have accomplished this goal by prohibiting the exchange of an organ for "valuable consideration," which is a legal term that has not historically been limited to monetary exchange. Whether or not KPD programs violate this legislative prohibition will depend on the specific legislative provision being considered, and the legal system and case law of the particular jurisdiction in question. This article compares the experiences of Australia, Canada, and the United States in determining the legality of KPD and highlights the need for legal clarity and flexibility as donation and transplantation systems continue to evolve.

  4. Deriving Earth Science Data Analytics Requirements

    Science.gov (United States)

    Kempler, Steven J.

    2015-01-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tool/technique requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  5. Analyticity and the Global Information Field

    Directory of Open Access Journals (Sweden)

    Evgeni A. Solov'ev

    2015-03-01

    Full Text Available The relation between analyticity in mathematics and the concept of a global information field in physics is reviewed. Mathematics is complete in the complex plane only. In the complex plane, a very powerful tool appears—analyticity. According to this property, if an analytic function is known on the countable set of points having an accumulation point, then it is known everywhere. This mysterious property has profound consequences in quantum physics. Analyticity allows one to obtain asymptotic (approximate results in terms of some singular points in the complex plane which accumulate all necessary data on a given process. As an example, slow atomic collisions are presented, where the cross-sections of inelastic transitions are determined by branch-points of the adiabatic energy surface at a complex internuclear distance. Common aspects of the non-local nature of analyticity and a recently introduced interpretation of classical electrodynamics and quantum physics as theories of a global information field are discussed.
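The "mysterious property" invoked in this record is the identity theorem for analytic functions, which can be stated compactly:

```latex
% Identity theorem for analytic functions: agreement on a set with an
% accumulation point inside the domain forces agreement everywhere on it.
\[
f, g \text{ analytic on a domain } D \subseteq \mathbb{C}, \quad
f(z_n) = g(z_n) \text{ for a sequence } z_n \to z^{*} \in D,\ z_n \neq z^{*}
\;\Longrightarrow\; f \equiv g \text{ on } D.
\]
```

In particular, an analytic function known on any countable set with an accumulation point in the domain is determined everywhere on that domain, which is the non-local behaviour the abstract connects to a global information field.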

  6. Nuclear analytical methods: Past, present and future

    International Nuclear Information System (INIS)

    Becker, D.A.

    1996-01-01

    The development of nuclear analytical methods as an analytical tool began in 1936 with the publication of the first paper on neutron activation analysis (NAA). This year, 1996, marks the 60th anniversary of that event. This paper attempts to look back at the nuclear analytical methods of the past, to look around and to see where the technology is right now, and finally, to look ahead to try and see where nuclear methods as an analytical technique (or as a group of analytical techniques) will be going in the future. The general areas which the author focuses on are: neutron activation analysis; prompt gamma neutron activation analysis (PGNAA); photon activation analysis (PAA); charged-particle activation analysis (CPAA)

  7. Library improvement through data analytics

    CERN Document Server

    Farmer, Lesley S J

    2017-01-01

    This book shows how to act on and make sense of data in libraries. Using a range of techniques, tools and methodologies it explains how data can be used to help inform decision making at every level. Sound data analytics is the foundation for making an evidence-based case for libraries, in addition to guiding myriad organizational decisions, from optimizing operations for efficiency to responding to community needs. Designed to be useful for beginners as well as those with a background in data, this book introduces the basics of a six point framework that can be applied to a variety of library settings for effective system based, data-driven management. Library Improvement Through Data Analytics includes: - the basics of statistical concepts - recommended data sources for various library functions and processes, and guidance for using census, university, or - - government data in analysis - techniques for cleaning data - matching data to appropriate data analysis methods - how to make descriptive statistics m...

  8. The Spectrum of Learning Analytics

    Directory of Open Access Journals (Sweden)

    Gerd Kortemeyer

    2017-06-01

    Full Text Available "Learning Analytics" became a buzzword during the hype surrounding the advent of "big data" MOOCs; however, the concept has been around for over two decades. When the first online courses became available it was used as a tool to increase student success in particular courses, frequently combined with the hope of conducting educational research. In recent years, the same term started to be used on the institutional level to increase retention and decrease time-to-degree. These two applications, within particular courses on the one hand and at the institutional level on the other, are at the two extremes of the spectrum of Learning Analytics, and they frequently appear to be worlds apart. The survey describes affordances, theories and approaches in these two categories.

  9. Advanced web metrics with Google Analytics

    CERN Document Server

    Clifton, Brian

    2012-01-01

    Get the latest information about using the #1 web analytics tool from this fully updated guide Google Analytics is the free tool used by millions of web site owners to assess the effectiveness of their efforts. Its revised interface and new features will offer even more ways to increase the value of your web site, and this book will teach you how to use each one to best advantage. Featuring new content based on reader and client requests, the book helps you implement new methods and concepts, track social and mobile visitors, use the new multichannel funnel reporting features, understand which…

  10. Novel extractants with high selectivity for valuable metals in seawater. Calixarene derivatives

    International Nuclear Information System (INIS)

    Kakoi, Takahiko; Goto, Masahiro

    1997-01-01

    Seawater contains various valuable metals such as uranium and lithium. Therefore, attempts are being made to develop highly selective extractants which recognize target metal ions in reclaimed seawater. In this review, we have focused our study on the application of novel cyclic compound calixarene based extractants. A novel host compound calixarene, which is a cyclic compound connecting several phenol rings, can be formed with several different ring sizes and various kinds of functional groups for targeting metal ions in seawater. Therefore, calixarene derivatives are capable of selectively extracting valuable metals such as uranium, alkaline metals, heavy metals, rare earth metals and noble metals by varying the structural ring size and functional groups. The novel host compound calixarene has given promising results which position it as a potential extractant for the separation of valuable metal ions in seawater. (author)

  11. Effect of Acid Dissolution Conditions on Recovery of Valuable Metals from Used Plasma Display Panel Scrap

    Directory of Open Access Journals (Sweden)

    Kim Chan-Mi

    2017-06-01

    Full Text Available The objective of this particular study was to recover valuable metals from waste plasma display panels using high energy ball milling with subsequent acid dissolution. Dissolution of milled PDP powder was studied in HCl, HNO3, and H2SO4 acidic solutions. The effects of dissolution acid, temperature, time, and PDP scrap powder to acid ratio on the leaching process were investigated and the most favorable conditions were found: (1) valuable metals (In, Ag, Mg) were recovered from PDP powder in a mixture of concentrated hydrochloric acid (HCl:H2O = 50:50); (2) the optimal dissolution temperature and time for the valuable metals were found to be 60°C and 30 min, respectively; (3) the ideal PDP scrap powder to acid solution ratio was found to be 1:10. The proposed method was applied to the recovery of magnesium, silver, and indium with satisfactory results.

  12. Jet substructure with analytical methods

    Energy Technology Data Exchange (ETDEWEB)

    Dasgupta, Mrinal [University of Manchester, Consortium for Fundamental Physics, School of Physics and Astronomy, Manchester (United Kingdom); Fregoso, Alessandro; Powling, Alexander [University of Manchester, School of Physics and Astronomy, Manchester (United Kingdom); Marzani, Simone [Durham University, Institute for Particle Physics Phenomenology, Durham (United Kingdom)

    2013-11-15

    We consider the mass distribution of QCD jets after the application of jet-substructure methods, specifically the mass-drop tagger, pruning, trimming and their variants. In contrast to most current studies employing Monte Carlo methods, we carry out analytical calculations at the next-to-leading order level, which are sufficient to extract the dominant logarithmic behaviour for each technique, and compare our findings to exact fixed-order results. Our results should ultimately lead to a better understanding of these jet-substructure methods which in turn will influence the development of future substructure tools for LHC phenomenology. (orig.)

  13. A graph algebra for scalable visual analytics.

    Science.gov (United States)

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing requires redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
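The two atomic operators named in this record, selection and aggregation, can be sketched over a simple attributed graph. The data structures and function names below are illustrative assumptions, not the authors' framework or API.

```python
# Hypothetical sketch of the two atomic graph-algebra operators described
# above: selection (induced subgraph by a node predicate) and aggregation
# (merging nodes that share a key into super-nodes).
def select(nodes, edges, pred):
    """Keep nodes satisfying pred and the edges between them (induced subgraph)."""
    kept = {n for n, attrs in nodes.items() if pred(attrs)}
    return ({n: nodes[n] for n in kept},
            [(u, v) for (u, v) in edges if u in kept and v in kept])

def aggregate(nodes, edges, key):
    """Merge nodes sharing key(attrs) into super-nodes; collapse parallel edges."""
    group = {n: key(attrs) for n, attrs in nodes.items()}
    super_nodes = {g: {"size": sum(1 for v in group.values() if v == g)}
                   for g in set(group.values())}
    super_edges = {tuple(sorted((group[u], group[v])))
                   for (u, v) in edges if group[u] != group[v]}
    return super_nodes, sorted(super_edges)
```

Composing these operators (select, then aggregate, then select again) is what gives the algebra its scalability: each step shrinks the graph before the next visual exploration pass.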

  14. Analytic nuclear scattering theories

    International Nuclear Information System (INIS)

    Di Marzio, F.; University of Melbourne, Parkville, VIC

    1999-01-01

    A wide range of nuclear reactions are examined in an analytical version of the usual distorted wave Born approximation. This new approach provides either semi-analytic or fully analytic descriptions of the nuclear scattering processes. The resulting computational simplifications, when used within the limits of validity, allow very detailed tests of both nuclear interaction models and large basis models of nuclear structure to be performed.

  15. Analytical Electron Microscope

    Data.gov (United States)

    Federal Laboratory Consortium — The Titan 80-300 is a transmission electron microscope (TEM) equipped with spectroscopic detectors to allow chemical, elemental, and other analytical measurements to...

  16. Using Psychodynamic Interaction as a Valuable Source of Information in Social Research

    DEFF Research Database (Denmark)

    Schmidt, Camilla

    2012-01-01

    This article will address the issue of using understandings of psychodynamic interrelations as a means to grasp how social and cultural dynamics are processed individually and collectively in narratives. I apply the two theoretically distinct concepts of inter- and intrasubjectivity to gain insight...... are valuable sources of information in understanding the process of becoming a social educator....

  17. Recovery of valuable nitrogen compounds from agricultural liquid wastes: potential possibilities, bottlenecks and future technological challenges.

    NARCIS (Netherlands)

    Rulkens, W.H.; Klapwijk, A.; Willers, H.C.

    1998-01-01

    Agricultural liquid livestock wastes are an important potential source of valuable nitrogen-containing compounds such as ammonia and proteins. Large volumetric quantities of these wastes are produced in areas with a high livestock production density. Much technological research has been carried out

  18. A field guide to valuable underwater aquatic plants of the Great Lakes

    Science.gov (United States)

    Schloesser, Donald W.

    1986-01-01

    Underwater plants are a valuable part of the Great Lakes ecosystem, providing food and shelter for aquatic animals. Aquatic plants also help stabilize sediments, thereby reducing shoreline erosion. Annual fall die-offs of underwater plants provide food and shelter for overwintering small aquatic animals such as insects, snails, and freshwater shrimp.

  19. Nuclear analytical techniques in Cuban Sugar Industry

    International Nuclear Information System (INIS)

    Diaz Riso, O.; Griffith Martinez, J.

    1996-01-01

    This paper is a review of the applications of nuclear analytical techniques in the Cuban sugar industry. The most complete elemental composition of final molasses (34 elements) and natural zeolites (38 elements), the latter employed as an auxiliary agent in sugar technological processes, has been determined by means of Instrumental Neutron Activation Analysis (INAA) and X-Ray Fluorescence Analysis (XRFA). The trace-element relationships in the sugar cane soil-plant system and the elemental composition of different types of Cuban sugar (raw, blanco directo and refined) were also studied. As a result, valuable information is given on the possibilities of using these products in animal and human foodstuffs as well as in other applications.

  20. Constraint-Referenced Analytics of Algebra Learning

    Science.gov (United States)

    Sutherland, Scot M.; White, Tobin F.

    2016-01-01

    The development of the constraint-referenced analytics tool for monitoring algebra learning activities presented here came from the desire to firstly, take a more quantitative look at student responses in collaborative algebra activities, and secondly, to situate those activities in a more traditional introductory algebra setting focusing on…

  1. Analytical Techniques in the Pharmaceutical Sciences

    DEFF Research Database (Denmark)

    Leurs, Ulrike; Mistarz, Ulrik Hvid; Rand, Kasper Dyrberg

    2016-01-01

    Mass spectrometry (MS) offers the capability to identify, characterize and quantify a target molecule in a complex sample matrix and has developed into a premier analytical tool in drug development science. Through specific MS-based workflows including customized sample preparation, coupling...

  2. The Analytical Hierarchy Process

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn

    2007-01-01

    The technical note gathers the theory behind the Analytical Hierarchy Process (AHP) and presents its advantages and disadvantages in practical use.
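The core AHP calculation is compact enough to sketch. The following is a minimal illustration, assuming the common geometric-mean approximation of the principal eigenvector and Saaty's consistency index; the matrix values are made-up examples, not taken from the note.

```python
# Minimal AHP sketch: derive priority weights from a reciprocal pairwise-
# comparison matrix and compute Saaty's consistency index (illustrative).
import math

def ahp_priorities(m):
    """m: reciprocal pairwise-comparison matrix as a list of rows."""
    n = len(m)
    gm = [math.prod(row) ** (1.0 / n) for row in m]    # row geometric means
    total = sum(gm)
    w = [g / total for g in gm]                         # normalized priorities
    # Approximate the principal eigenvalue lambda_max for the consistency index
    lam = sum(sum(m[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)                            # Saaty's CI
    return w, ci

# Example: criterion A judged 3x as important as B and 5x as C; B is 2x C.
w, ci = ahp_priorities([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]])
```

A CI near zero indicates consistent judgments; in practice CI is divided by a random index for the matrix size to give the consistency ratio, with 0.1 as the usual acceptance threshold.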

  3. Signals: Applying Academic Analytics

    Science.gov (United States)

    Arnold, Kimberly E.

    2010-01-01

    Academic analytics helps address the public's desire for institutional accountability with regard to student success, given the widespread concern over the cost of higher education and the difficult economic and budgetary conditions prevailing worldwide. Purdue University's Signals project applies the principles of analytics widely used in…

  4. Analytic Moufang-transformations

    International Nuclear Information System (INIS)

    Paal, Eh.N.

    1988-01-01

    The paper is intended as an introduction to the concept of an analytic birepresentation of an analytic Moufang loop. To describe the deviation of (S,T) from associativity, the associators of (S,T) are defined and certain constraints for them, called the minimality conditions of (S,T), are established

  5. Quine's "Strictly Vegetarian" Analyticity

    NARCIS (Netherlands)

    Decock, L.B.

    2017-01-01

    I analyze Quine’s later writings on analyticity from a linguistic point of view. In Word and Object Quine made room for a “strictly vegetarian” notion of analyticity. In later years, he developed this notion into two more precise notions, which I have coined “stimulus analyticity” and “behaviorist analyticity”.

  6. Learning analytics dashboard applications

    NARCIS (Netherlands)

    Verbert, K.; Duval, E.; Klerkx, J.; Govaerts, S.; Santos, J.L.

    2013-01-01

    This article introduces learning analytics dashboards that visualize learning traces for learners and teachers. We present a conceptual framework that helps to analyze learning analytics applications for these kinds of users. We then present our own work in this area and compare with 15 related…

  7. Learning Analytics Considered Harmful

    Science.gov (United States)

    Dringus, Laurie P.

    2012-01-01

    This essay is written to present a prospective stance on how learning analytics, as a core evaluative approach, must help instructors uncover the important trends and evidence of quality learner data in the online course. A critique is presented of strategic and tactical issues of learning analytics. The approach to the critique is taken through…

  8. Analytical mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    1990-01-01

    This 43rd Annual Summer Symposium on Analytical Chemistry was held July 24--27, 1990 at Oak Ridge, TN and contained sessions on the following topics: Fundamentals of Analytical Mass Spectrometry (MS), MS in the National Laboratories, Lasers and Fourier Transform Methods, Future of MS, New Ionization and LC/MS Methods, and an extra session. (WET)

  9. Analytical mass spectrometry. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    1990-12-31

    This 43rd Annual Summer Symposium on Analytical Chemistry was held July 24--27, 1990 at Oak Ridge, TN and contained sessions on the following topics: Fundamentals of Analytical Mass Spectrometry (MS), MS in the National Laboratories, Lasers and Fourier Transform Methods, Future of MS, New Ionization and LC/MS Methods, and an extra session. (WET)

  10. Quo vadis, analytical chemistry?

    Science.gov (United States)

    Valcárcel, Miguel

    2016-01-01

    This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists because the future of our scientific discipline is in their hands. A description of not completely accurate overall conceptions of our discipline, both past and present, to be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, and the more accurate general and specific paradigms that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.

  11. Analytical and between-subject variation of thrombin generation measured by calibrated automated thrombography on plasma samples.

    Science.gov (United States)

    Kristensen, Anne F; Kristensen, Søren R; Falkmer, Ursula; Münster, Anna-Marie B; Pedersen, Shona

    2018-05-01

    The Calibrated Automated Thrombography (CAT) is an in vitro thrombin generation (TG) assay that holds promise as a valuable tool within clinical diagnostics. However, the technique has a considerable analytical variation, and we therefore investigated the analytical and between-subject variation of CAT systematically. Moreover, we assessed the application of an internal standard for normalization to diminish variation. 20 healthy volunteers donated one blood sample which was subsequently centrifuged, aliquoted and stored at -80 °C prior to analysis. The analytical variation was determined on eight runs, where plasma from the same seven volunteers was processed in triplicates, and for the between-subject variation, TG analysis was performed on plasma from all 20 volunteers. The trigger reagents used for the TG assays included both PPP reagent containing 5 pM tissue factor (TF) and PPPlow with 1 pM TF. Plasma, drawn from a single donor, was applied to all plates as an internal standard for each TG analysis, which subsequently was used for normalization. The total analytical variation for TG analysis performed with PPPlow reagent is 3-14% and 9-13% for PPP reagent. This variation can be minimally reduced by using an internal standard, but mainly for ETP (endogenous thrombin potential). The between-subject variation is higher when using PPPlow than PPP, and this variation is considerably higher than the analytical variation. TG has a rather high inherent analytical variation, but one considerably lower than the between-subject variation when using PPPlow as reagent.
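The two quantities this record turns on can be sketched directly. The following is our own illustrative computation (not the paper's code) of an analytical coefficient of variation from replicate runs, and of the internal-standard normalization described above: scaling a sample's ETP by how far the standard plasma on the same plate deviates from its reference value.

```python
# Illustrative sketch: analytical CV from replicates, and normalization of an
# ETP value to an internal-standard plasma run on the same plate.
import statistics

def cv_percent(replicates):
    """Coefficient of variation (%) of replicate measurements."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

def normalize_to_standard(sample_etp, standard_etp_this_plate, standard_etp_reference):
    """Scale a sample's ETP by the internal standard's deviation on this plate."""
    return sample_etp * (standard_etp_reference / standard_etp_this_plate)
```

For example, if the internal standard reads 1600 nM·min on a plate against a reference of 1500 nM·min, every sample ETP from that plate is scaled down by the same factor, removing the plate's systematic shift.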

  12. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    Energy Technology Data Exchange (ETDEWEB)

    Ragan, Eric D [ORNL; Goodall, John R [ORNL

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  13. Concept of an integrated waste economy represented on the example of recycling of valuable materials

    Energy Technology Data Exchange (ETDEWEB)

    Wender, H

    1980-08-01

    The historical development of waste elimination is discussed, followed by the waste problem in an environmental context, the possibilities of recycling within the framework of a waste industry, and the solution of the waste problem from a waste-economy viewpoint. This includes the definition of 'waste' and the grouping by types of waste, their amounts and growth rates, and the composition and valuable materials in community wastes, with a review of waste technologies from waste-economy viewpoints. This is followed by a discussion of the sales possibilities for valuable components from mechanical sorting facilities (used paper, old glass, hard substances, metals, plastics), a comparative evaluation method, and the national-economy aspects of the waste industry: the savings effect in raw materials for different branches, effects on raw-material reserves, the problem of import dependence, waste rates and living standard, and the importance of environmental instruments, which are discussed in detail.

  14. Mango (Mangifera indica L.) by-products and their valuable components: a review.

    Science.gov (United States)

    Jahurul, M H A; Zaidul, I S M; Ghafoor, Kashif; Al-Juhaimi, Fahad Y; Nyam, Kar-Lin; Norulaini, N A N; Sahena, F; Mohd Omar, A K

    2015-09-15

    The large amount of waste produced by the food industries causes serious environmental problems and also results in economic losses if not utilized effectively. Different research reports have revealed that food industry by-products can be good sources of potentially valuable bioactive compounds. As such, the mango juice industry uses only the edible portions of the mangoes, and a considerable amount of peels and seeds are discarded as industrial waste. These mango by-products come from the tropical or subtropical fruit processing industries. Mango by-products, especially seeds and peels, are considered to be cheap sources of valuable food and nutraceutical ingredients. The main uses of natural food ingredients derived from mango by-products are presented and discussed, and the mainstream sectors of application for these by-products, such as in the food, pharmaceutical, nutraceutical and cosmetic industries, are highlighted. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. The cost of being valuable: predictors of extinction risk in marine invertebrates exploited as luxury seafood

    OpenAIRE

    Purcell, Steven W.; Polidoro, Beth A.; Hamel, Jean-François; Gamboa, Ruth U.; Mercier, Annie

    2014-01-01

    Extinction risk has been linked to biological and anthropogenic variables. Prediction of extinction risk in valuable fauna may not follow mainstream drivers when species are exploited for international markets. We use results from an International Union for Conservation of Nature Red List assessment of extinction risk in all 377 known species of sea cucumber within the order Aspidochirotida, many of which are exploited worldwide as luxury seafood for Asian markets. Extinction risk was primarily…

  16. Production of Fatty Acid-Derived Valuable Chemicals in Synthetic Microbes

    International Nuclear Information System (INIS)

    Yu, Ai-Qun; Pratomo Juwono, Nina Kurniasih; Leong, Susanna Su Jan; Chang, Matthew Wook

    2014-01-01

    Fatty acid derivatives, such as hydroxy fatty acids, fatty alcohols, fatty acid methyl/ethyl esters, and fatty alka(e)nes, have a wide range of industrial applications including plastics, lubricants, and fuels. Currently, these chemicals are obtained mainly through chemical synthesis, which is complex and costly, and their availability from natural biological sources is extremely limited. Metabolic engineering of microorganisms has provided a platform for effective production of these valuable biochemicals. Notably, synthetic biology-based metabolic engineering strategies have been extensively applied to refactor microorganisms for improved biochemical production. Here, we reviewed: (i) the current status of metabolic engineering of microbes that produce fatty acid-derived valuable chemicals, and (ii) the recent progress of synthetic biology approaches that assist metabolic engineering, such as mRNA secondary structure engineering, sensor-regulator system, regulatable expression system, ultrasensitive input/output control system, and computer science-based design of complex gene circuits. Furthermore, key challenges and strategies were discussed. Finally, we concluded that synthetic biology provides useful metabolic engineering strategies for economically viable production of fatty acid-derived valuable chemicals in engineered microbes.

  17. Production of Fatty Acid-Derived Valuable Chemicals in Synthetic Microbes

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Ai-Qun; Pratomo Juwono, Nina Kurniasih [Department of Biochemistry, Yong Loo Lin School of Medicine, National University of Singapore, Singapore (Singapore); Synthetic Biology Research Program, National University of Singapore, Singapore (Singapore); Leong, Susanna Su Jan [Department of Biochemistry, Yong Loo Lin School of Medicine, National University of Singapore, Singapore (Singapore); Synthetic Biology Research Program, National University of Singapore, Singapore (Singapore); Singapore Institute of Technology, Singapore (Singapore); Chang, Matthew Wook, E-mail: bchcmw@nus.edu.sg [Department of Biochemistry, Yong Loo Lin School of Medicine, National University of Singapore, Singapore (Singapore); Synthetic Biology Research Program, National University of Singapore, Singapore (Singapore)

    2014-12-23

    Fatty acid derivatives, such as hydroxy fatty acids, fatty alcohols, fatty acid methyl/ethyl esters, and fatty alka(e)nes, have a wide range of industrial applications including plastics, lubricants, and fuels. Currently, these chemicals are obtained mainly through chemical synthesis, which is complex and costly, and their availability from natural biological sources is extremely limited. Metabolic engineering of microorganisms has provided a platform for effective production of these valuable biochemicals. Notably, synthetic biology-based metabolic engineering strategies have been extensively applied to refactor microorganisms for improved biochemical production. Here, we reviewed: (i) the current status of metabolic engineering of microbes that produce fatty acid-derived valuable chemicals, and (ii) the recent progress of synthetic biology approaches that assist metabolic engineering, such as mRNA secondary structure engineering, sensor-regulator system, regulatable expression system, ultrasensitive input/output control system, and computer science-based design of complex gene circuits. Furthermore, key challenges and strategies were discussed. Finally, we concluded that synthetic biology provides useful metabolic engineering strategies for economically viable production of fatty acid-derived valuable chemicals in engineered microbes.

  18. World's Most Valuable Brand Resonation With Categories of Different Customer Needs

    Directory of Open Access Journals (Sweden)

    Kaspars VIKSNE

    2017-09-01

    Full Text Available One of the key performance indicators of brand success is its value. Brand value is an outcome of a brand's performance in the market and depends largely on the brand's ability to satisfy certain customer needs. For the greatest success in the world market, a brand should resonate with its ability to satisfy some of the customers' most universal needs. In this paper, the authors seek to find out which needs the world's most successful brands resonate with. The goal of the paper is therefore to determine which customer needs the world's most valuable brands primarily satisfy. In the first part of the paper, the authors briefly evaluate Maslow's theory of needs. In the second part, they identify the main challenges of brand valuation and briefly describe today's most valuable brands. In the third part, they analyze whether resonating with a certain human need makes a brand more valuable. In the last part, the authors summarize the main findings and give recommendations for better marketing practices to other brands whose owners have high market ambitions. To attain the paper's goal, the authors use the following research methods: comparative analysis for comparing brands in different brand rankings; content analysis for determining which need satisfaction brand advertisements resonate with; and data analysis for quantifying the results gathered from the content analysis

  19. ANALYTICAL ANARCHISM: THE PROBLEM OF DEFINITION AND DEMARCATION

    OpenAIRE

    Konstantinov M.S.

    2012-01-01

    This paper considers, for the first time in our country's scholarship, a new current of anarchist thought: analytical anarchism. Critical analysis of the key propositions of the basic versions of this current, the anarcho-capitalist and the egalitarian, is used as a methodological tool. The study proposes a classification of discernible trends within analytical anarchism on the basis of value criteria, and identifies conceptual and methodological problems in defining analytical anarchism and its ...

  20. Radionuclides in analytical chemistry

    International Nuclear Information System (INIS)

    Tousset, J.

    1984-01-01

    Applications of radionuclides in analytical chemistry are reviewed in this article: tracers, radioactive sources and activation analysis. Examples are given in all these fields and it is concluded that these methods should be used more widely [fr

  1. Mobility Data Analytics Center.

    Science.gov (United States)

    2016-01-01

    Mobility Data Analytics Center aims at building a centralized data engine to efficiently manipulate : large-scale data for smart decision making. Integrating and learning the massive data are the key to : the data engine. The ultimate goal of underst...

  2. Analytical strategies for phosphoproteomics

    DEFF Research Database (Denmark)

    Thingholm, Tine E; Jensen, Ole N; Larsen, Martin R

    2009-01-01

    sensitive and specific strategies. Today, most phosphoproteomic studies are conducted by mass spectrometric strategies in combination with phospho-specific enrichment methods. This review presents an overview of different analytical strategies for the characterization of phosphoproteins. Emphasis...

  3. Modeling of the Global Water Cycle - Analytical Models

    Science.gov (United States)

    Yongqiang Liu; Roni Avissar

    2005-01-01

    Both numerical and analytical models of coupled atmosphere and its underlying ground components (land, ocean, ice) are useful tools for modeling the global and regional water cycle. Unlike complex three-dimensional climate models, which need very large computing resources and involve a large number of complicated interactions often difficult to interpret, analytical...

  4. Intermediate algebra & analytic geometry

    CERN Document Server

    Gondin, William R

    1967-01-01

    Intermediate Algebra & Analytic Geometry Made Simple focuses on the principles, processes, calculations, and methodologies involved in intermediate algebra and analytic geometry. The publication first offers information on linear equations in two unknowns and variables, functions, and graphs. Discussions focus on graphic interpretations, explicit and implicit functions, first quadrant graphs, variables and functions, determinate and indeterminate systems, independent and dependent equations, and defective and redundant systems. The text then examines quadratic equations in one variable, system

  5. A GIS-based Spatial Decision Support System for environmentally valuable areas in the context of sustainable development of Poland

    Science.gov (United States)

    Kubacka, Marta

    2013-04-01

    The issue of spatial development, and thus of proper environmental management and protection of naturally valuable areas, is today considered a major hazard to the stability of the world's ecological system. The increasing demand for areas with substantial environmental and landscape assets, incorrect spatial development, improper implementation of law, as well as low citizen awareness bring about a significant risk of irrevocable loss of naturally valuable areas. The elaboration of a Decision Support System in the form of a collection of spatial data will facilitate solving complex problems concerning spatial development. The elaboration of a model utilizing a number of IT tools will boost the effectiveness of spatial decisions taken by decision-makers. Proper spatial data management is today becoming a key element in knowledge-based management, namely sustainable development. Decision Support Systems are defined as model-based sets of procedures for processing data and judgments to assist a manager in decision-making. The main purpose of the project was to elaborate a spatial decision support system for the Sieraków Landscape Park. A landscape park in Poland is an area protected for its environmental, historic and cultural values as well as landscape assets, for the purpose of maintaining and popularizing these values under the conditions of sustainable development. It also defines the forms of protected-area management and introduces bans on activity in these areas by obliging the director of the complex of landscape parks to prepare and implement environmental protection plans. As opposed to national parks and reserves, landscape parks are not areas free from economic activity; thus agricultural lands, forest lands and other real properties located within the boundaries of landscape parks are subject to economic utilization. The research area was analyzed with respect to the implementation of investment

  6. SRL online Analytical Development

    International Nuclear Information System (INIS)

    Jenkins, C.W.

    1991-01-01

    The Savannah River Site is operated by the Westinghouse Savannah River Co. for the Department of Energy to produce special nuclear materials for defense. R&D support for site programs is provided by the Savannah River Laboratory, which I represent. The site is known primarily for its nuclear reactors, but actually three fourths of the efforts at the site are devoted to fuel/target fabrication, fuel/target reprocessing, and waste management. All of these operations rely heavily on chemical processes. The site is therefore a large chemical plant. There are thus many potential applications for process analytical chemistry at SRS. The Savannah River Laboratory (SRL) has an Analytical Development Section of roughly 65 personnel that performs analyses for R&D efforts at the lab, acts as backup to the site Analytical Laboratories Department, and develops analytical methods and instruments. I manage a subgroup of the Analytical Development Section called the Process Control & Analyzer Development Group. The prime mission of this group is to develop online/at-line analytical systems for site applications

  7. Case study: IBM Watson Analytics cloud platform as Analytics-as-a-Service system for heart failure early detection

    OpenAIRE

    Guidi, Gabriele; Miniati, Roberto; Mazzola, Matteo; Iadanza, Ernesto

    2016-01-01

    In the recent years the progress in technology and the increasing availability of fast connections have produced a migration of functionalities in Information Technologies services, from static servers to distributed technologies. This article describes the main tools available on the market to perform Analytics as a Service (AaaS) using a cloud platform. It is also described a use case of IBM Watson Analytics, a cloud system for data analytics, applied to the following research scope: detect...

  8. From Theory Use to Theory Building in Learning Analytics: A Commentary on "Learning Analytics to Support Teachers during Synchronous CSCL"

    Science.gov (United States)

    Chen, Bodong

    2015-01-01

    In this commentary on Van Leeuwen (2015, this issue), I explore the relation between theory and practice in learning analytics. Specifically, I caution against adhering to one specific theoretical doctrine while ignoring others, suggest deeper applications of cognitive load theory to understanding teaching with analytics tools, and comment on…

  9. Big Data Analytics in Chemical Engineering.

    Science.gov (United States)

    Chiang, Leo; Lu, Bo; Castillo, Ivan

    2017-06-07

    Big data analytics is the journey to turn data into insights for more informed business and operational decisions. As the chemical engineering community is collecting more data (volume) from different sources (variety), this journey becomes more challenging in terms of using the right data and the right tools (analytics) to make the right decisions in real time (velocity). This article highlights recent big data advancements in five industries, including chemicals, energy, semiconductors, pharmaceuticals, and food, and then discusses technical, platform, and culture challenges. To reach the next milestone in multiplying successes to the enterprise level, government, academia, and industry need to collaboratively focus on workforce development and innovation.

  10. Germ cell transplantation using sexually competent fish: an approach for rapid propagation of endangered and valuable germlines.

    Directory of Open Access Journals (Sweden)

    Sullip K Majhi

    Full Text Available The transplantation of germ cells into adult recipient gonads is a tool with wide applications in animal breeding and conservation of valuable and/or endangered species; it also provides a means for basic studies involving germ cell (GC proliferation and differentiation. Here we describe the establishment of a working model for xenogeneic germ cell transplantation (GCT in sexually competent fish. Spermatogonial cells isolated from juveniles of one species, the pejerrey Odontesthes bonariensis (Atherinopsidae, were surgically transplanted into the gonads of sexually mature Patagonian pejerrey O. hatcheri, which have been partially depleted of endogenous GCs by a combination of Busulfan (40 mg/kg and high water temperature (25 degrees C treatments. The observation of the donor cells' behavior showed that transplanted spermatogonial cells were able to recolonize the recipients' gonads and resume spermatogenesis within 6 months from the GCT. The presence of donor-derived gametes was confirmed by PCR in 20% of the surrogate O. hatcheri fathers at 6 months and crosses with O. bonariensis mothers produced hybrids and pure O. bonariensis, with donor-derived germline transmission rates of 1.2-13.3%. These findings indicate that transplantation of spermatogonial cells into sexually competent fish can shorten considerably the production time of donor-derived gametes and offspring and could play a vital role in germline conservation and propagation of valued and/or endangered fish species.

  11. Croatian Analytical Terminology

    Directory of Open Access Journals (Sweden)

    Kastelan-Macan; M.

    2008-04-01

    Full Text Available Results of analytical research are necessary in all human activities. They are indispensable for making decisions in environmental chemistry, agriculture, forestry, veterinary medicine, the pharmaceutical industry, and biochemistry. Without analytical measurements the quality of materials and products cannot be assessed, so analytical chemistry is an essential part of technical sciences and disciplines. The language of Croatian science, and of analytical chemistry within it, was one of the goals of our predecessors. Due to the political situation, they did not succeed entirely, but for scientists in independent Croatia this is a duty, because language is one of the most important features of Croatian identity. The awareness of the need to introduce Croatian terminology developed systematically in the second half of the 19th century, along with the founding of scientific societies and the wish of scientists to write their scientific works in Croatian, so that the results of their research might be applied in the economy. Many authors of textbooks from the 19th and the first half of the 20th century contributed to Croatian analytical terminology (F. Rački, B. Šulek, P. Žulić, G. Pexidr, J. Domac, G. Janeček, F. Bubanović, V. Njegovan and others). M. Deželić published the first systematic chemical terminology in 1940, adjusted to the IUPAC recommendations. In the second half of the 20th century, textbooks in classical analytical chemistry were written by V. Marjanović-Krajovan, M. Gyiketta-Ogrizek, S. Žilić and others. I. Filipović wrote the General and Inorganic Chemistry textbook and the Laboratory Handbook (in collaboration with P. Sabioncello) and contributed greatly to establishing the terminology of instrumental analytical methods. The sources of Croatian nomenclature in modern analytical chemistry today are translated textbooks by Skoog, West and Holler, as well as by Günzler and Gremlich, and original textbooks by S. Turina, Z.

  12. The Consortium for the Valuation of Applications Benefits Linked with Earth Science (VALUABLES)

    Science.gov (United States)

    Kuwayama, Y.; Mabee, B.; Wulf Tregar, S.

    2017-12-01

    National and international organizations are placing greater emphasis on the societal and economic benefits that can be derived from applications of Earth observations, yet improvements are needed to connect to the decision processes that produce actions with direct societal benefits. There is a need to substantiate the benefits of Earth science applications in socially and economically meaningful terms in order to demonstrate return on investment and to prioritize investments across data products, modeling capabilities, and information systems. However, methods and techniques for quantifying the value proposition of Earth observations are currently not fully established. Furthermore, it has been challenging to communicate the value of these investments to audiences beyond the Earth science community. The Consortium for the Valuation of Applications Benefits Linked with Earth Science (VALUABLES), a cooperative agreement between Resources for the Future (RFF) and the National Aeronautics and Space Administration (NASA), has the goal of advancing methods for the valuation and communication of the applied benefits linked with Earth observations. The VALUABLES Consortium will focus on three pillars: (a) a research pillar that will apply existing and innovative methods to quantify the socioeconomic benefits of information from Earth observations; (b) a capacity building pillar to catalyze interdisciplinary linkages between Earth scientists and social scientists; and (c) a communications pillar that will convey the value of Earth observations to stakeholders in government, universities, the NGO community, and the interested public. In this presentation, we will describe ongoing and future activities of the VALUABLES Consortium, provide a brief overview of frameworks to quantify the socioeconomic value of Earth observations, and describe how Earth scientists and social scientists can get involved in the Consortium's activities.

  13. Analytic manifolds in uniform algebras

    International Nuclear Information System (INIS)

    Tonev, T.V.

    1988-12-01

    Here we extend Bear-Hile's result concerning the version of famous Bishop's theorem for one-dimensional analytic structures in two directions: for n-dimensional complex analytic manifolds, n>1, and for generalized analytic manifolds. 14 refs

  14. Activating Processes in the Brand Communication of Valuable Brands on the example of Coca-Cola.

    OpenAIRE

    Pöhler, Marie-Luise

    2017-01-01

    Everyone in the world, from the streets of Paris to the villages in Africa, knows the logo with the white letters that are written on a bright red background. Coca-Cola was introduced in 1886. In that year, only nine glasses of the soda drink were sold per day. So how did the little company from Atlanta become the world’s most valuable and popular soft drink? One of the company’s secrets is its emotional and memorable advertising strategies. Therefore, this thesis explains and analyzes ho...

  15. Determination of commercially valuable characteristics of plant varieties for energetic use during the state examination

    Directory of Open Access Journals (Sweden)

    В. В. Баликіна

    2014-12-01

    Full Text Available The analysis of commercially valuable indices of plant varieties for energetic use was carried out, and the necessity of determining energetic indices during the state scientific-and-technical examination is substantiated. In order to explain the requirements for registration of new varieties of energy crops concerning the definition of indices of ability for distribution, a collection of species and hybrid forms of willow was used. Factors that prove the economic and environmental advantages of cultivating energy willow for biofuel are specified.

  16. The intrapreneur: A distinct and valuable role to be institutionalized and strategically managed

    DEFF Research Database (Denmark)

    Ashourizadeh, Shayegheh; Schøtt, Thomas

    Intrapreneurs more frequently than routine employees are self-efficacious, opportunity-perceiving, risk-willing and role-modeling starters, have meaningful and autonomous jobs, and are satisfied with their jobs and salary, but also experience more stress in work; in these job-characteristics intrapreneurs are distinct from routine employees and somewhat similar to entrepreneurs. Thereby intrapreneurs are a human resource that, by developing new activities for their employer and also by creating new jobs, is very valuable. The rate of intrapreneurship among employees is higher in Denmark than in almost all other ..., especially in Denmark, to adopt strategies for institutionalization and management of this human resource.

  17. Process for the extraction of valuable products from coals, pitches, mineral oils, and the like

    Energy Technology Data Exchange (ETDEWEB)

    1936-06-05

    A process is described for treating coke, lignite, peat, etc., and mineral oils with the help of hydrogen or other reducing gases under pressure to recover valuable hydrocarbons, characterized in that the carbonaceous substances and the reducing gas come together already heated, totally or at least in part, from 350°C up to the temperature necessary for the reaction. The substances to be treated are extracted in the form of paste or liquid from the reaction chamber and then returned to it, being reacted outside the reaction zone in the presence of the reducing gases at the temperature necessary for the reaction.

  18. Inventory of species and cultivars potentially valuable for forest/biomass production

    Energy Technology Data Exchange (ETDEWEB)

    Lavoie, G

    1981-01-01

    To prepare a guide for experiments in mini-rotation or short-rotation forest production, potentially valuable species and cultivars have been inventoried. In this text, 288 species are listed under 31 genera, 27 deciduous and 4 coniferous. This partial inventory was made for the Northern Hemisphere and different climates, ranging from the tropical zone to the cold temperate zone. To be included, a species had to satisfy the following conditions: ease of establishment and rapid juvenile growth. The list of species and cultivars is given in alphabetical order. 55 references.

  19. Machine learning and predictive data analytics enabling metrology and process control in IC fabrication

    Science.gov (United States)

    Rana, Narender; Zhang, Yunlin; Wall, Donald; Dirahoui, Bachir; Bailey, Todd C.

    2015-03-01

    Integrated circuit (IC) technology is going through multiple changes in terms of patterning techniques (multiple patterning, EUV and DSA), device architectures (FinFET, nanowire, graphene) and patterning scale (few nanometers). These changes require tight controls on processes and measurements to achieve the required device performance, and challenge metrology and process control in terms of capability and quality. Multivariate data with complex nonlinear trends and correlations generally cannot be described well by mathematical or parametric models but can be relatively easily learned by computing machines and used to predict or extrapolate. This paper introduces the predictive metrology approach, which has been applied to three different applications. Machine learning and predictive analytics have been leveraged to accurately predict dimensions of EUV resist patterns down to 18 nm half pitch by leveraging resist shrinkage patterns; these patterns could not be directly and accurately measured due to metrology tool limitations. Machine learning has also been applied to predict electrical performance early in the process pipeline for deep trench capacitance and metal line resistance. As the wafer goes through various processes, its associated cost multiplies, and it may take days to weeks to get the electrical performance readout. Predicting the electrical performance early on can be very valuable in enabling timely actionable decisions such as rework, scrap, or feeding forward or back the predicted information, or information derived from the prediction, to improve or monitor processes. This paper provides a general overview of machine learning and advanced analytics applications in advanced semiconductor development and manufacturing.
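    The virtual-metrology idea in this abstract, predicting a late and expensive electrical readout from early in-line measurements so that rework/scrap/feed-forward decisions can be made sooner, can be sketched as follows. The 1/width resistance model, the data, and all numbers are hypothetical illustrations, not taken from the paper:

    ```python
    # Illustrative sketch only: the paper's models, data, and tooling are not
    # public, so everything here (the 1/width resistance model, all numbers)
    # is a hypothetical stand-in for "predict electrical performance from
    # early in-line measurements".
    import random

    random.seed(0)

    # Synthetic training lots: resistance ~ k / width + noise (narrower metal
    # lines have higher resistance). Width in nm, resistance in arbitrary units.
    K_TRUE = 2000.0
    widths = [random.uniform(18.0, 40.0) for _ in range(200)]
    resistances = [K_TRUE / w + random.gauss(0.0, 0.5) for w in widths]

    # Fit resistance = a * (1/width) + b by ordinary least squares.
    xs = [1.0 / w for w in widths]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(resistances) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, resistances)) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx

    def predict_resistance(width_nm):
        """Predicted electrical readout from an in-line CD measurement."""
        return a / width_nm + b

    # A lot measured at 20 nm width can be dispositioned (rework, scrap,
    # feed-forward) weeks before the actual electrical test is available.
    print(round(predict_resistance(20.0), 1))  # close to 2000/20 = 100
    ```

    In production settings the single regressor would be replaced by the multivariate, nonlinear learners the abstract alludes to, but the disposition logic built on the prediction stays the same.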

  20. Analytical solutions of nonlocal Poisson dielectric models with multiple point charges inside a dielectric sphere

    Science.gov (United States)

    Xie, Dexuan; Volkmer, Hans W.; Ying, Jinyong

    2016-04-01

    The nonlocal dielectric approach has led to new models and solvers for predicting the electrostatics of proteins (and other biomolecules), but how to validate and compare them remains a challenge. To promote such a study, in this paper, two typical nonlocal dielectric models are revisited. Their analytical solutions are then found as simple series for a dielectric sphere containing any number of point charges. As a special case, the analytical solution of the corresponding Poisson dielectric model is also derived in simple series, which significantly improves the well-known Kirkwood double series expansion. Furthermore, a convolution of one nonlocal dielectric solution with a commonly used nonlocal kernel function is obtained, along with the reaction parts of these local and nonlocal solutions. To turn these new series solutions into a valuable research tool, they are programmed as a free Fortran software package, which can input point charge data directly from a Protein Data Bank file. Consequently, different validation tests can be quickly done on different proteins. Finally, a test example for a protein with 488 atomic charges is reported to demonstrate the differences between the local and nonlocal models as well as the importance of using the reaction parts to develop local and nonlocal dielectric solvers.
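    The local-model baseline that these series solutions improve upon, the Kirkwood expansion for a point charge inside a dielectric sphere, can be sketched in a few lines. The formula below is the standard textbook result in Gaussian units; it is not code from the paper's Fortran package, and the parameter values are arbitrary:

    ```python
    # Sketch of the classical (local) Kirkwood series that the paper's
    # solutions generalize; standard textbook formula in Gaussian units.
    def legendre(n, x):
        """P_n(x) via the Bonnet three-term recurrence."""
        p0, p1 = 1.0, x
        if n == 0:
            return p0
        for k in range(1, n):
            p0, p1 = p1, ((2 * k + 1) * x * p1 - k * p0) / (k + 1)
        return p1

    def reaction_potential(q, s, r, cos_theta, a, eps_in, eps_out, nmax=60):
        """Reaction part of the potential inside a dielectric sphere of radius
        a (interior eps_in, exterior eps_out) due to a point charge q at
        distance s from the center, evaluated at distance r and angle theta
        from the charge axis."""
        total = 0.0
        for n in range(nmax + 1):
            coeff = (n + 1) * (eps_in - eps_out) / (eps_in * n + eps_out * (n + 1))
            total += coeff * (r * s) ** n / a ** (2 * n + 1) * legendre(n, cos_theta)
        return q / eps_in * total

    # For a central charge (s = 0) only the n = 0 term survives, and the
    # series collapses to the Born-like value q * (1/eps_out - 1/eps_in) / a.
    q, a, eps_in, eps_out = 1.0, 2.0, 2.0, 80.0
    phi = reaction_potential(q, s=0.0, r=1.0, cos_theta=1.0, a=a,
                             eps_in=eps_in, eps_out=eps_out)
    print(phi, q * (1.0 / eps_out - 1.0 / eps_in) / a)  # the two values agree
    ```

    For many off-center charges the paper's improved series would replace this double sum; the central-charge check above is a convenient sanity test for any such solver.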

  1. Analytical methods in rotor dynamics

    CERN Document Server

    Dimarogonas, Andrew D; Chondros, Thomas G

    2013-01-01

    The design and construction of rotating machinery operating at supercritical speeds was, in the 1920s, an event of revolutionary importance for the then new branch of dynamics known as rotor dynamics. In the 1960s, another revolution occurred: In less than a decade, imposed by operational and economic needs, an increase in the power of turbomachinery by one order of magnitude took place. Dynamic analysis of complex rotor forms became a necessity, while the importance of approximate methods for dynamic analysis was stressed. Finally, the emergence of fracture mechanics, as a new branch of applied mechanics, provided analytical tools to investigate crack influence on the dynamic behavior of rotors. The scope of this book is based on all these developments. No topics related to the well-known classical problems are included, rather the book deals exclusively with modern high-power turbomachinery.

  2. Sustainability Tools Inventory Initial Gap Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  3. Information theory in analytical chemistry

    National Research Council Canada - National Science Library

    Eckschlager, Karel; Danzer, Klaus

    1994-01-01

    Contents: The aim of analytical chemistry - Basic concepts of information theory - Identification of components - Qualitative analysis - Quantitative analysis - Multicomponent analysis - Optimum analytical...

  4. Valuable books from the library of Paul Gore (Identification and/or inventory of sources

    Directory of Open Access Journals (Sweden)

    Maria Danilov

    2013-12-01

    Full Text Available Paul Gore (1875-1927) was an outstanding figure of the socio-political, scientific and cultural life of Bessarabia at the beginning of the 20th century, also known among his contemporaries as a keen collector of old and rare books. Undoubtedly, the most valuable part of Paul Gore's library consisted of books on the history of Bessarabia. Documents from the National Archives of the Republic of Moldova in Chişinău confirm that he inherited a large part of the books from his father, Gheorghe Gore (1839-1909). A study of the Paul Gore Fund at the National Archives of Romania in Bucharest yielded much documentary evidence of the fate of this Bessarabian noble's library, which later became the property of the King Ferdinand Fund. However, the fate of its most valuable part - the books on the history of Bessarabia, 651 units out of a total of 6456 volumes - is still unknown.

  5. An alternative approach to recovering valuable metals from zinc phosphating sludge.

    Science.gov (United States)

    Kuo, Yi-Ming

    2012-01-30

    This study used a vitrification process (with good potential for commercialization) to recover valuable metals from Zn phosphating sludge. The involved vitrification process achieves two major goals: it transformed hazardous Zn phosphating sludge into inert slag and it concentrated Fe (83.5%) and Zn (92.8%) into ingot and fine particulate-phase material, respectively. The Fe content in the ingot was 278,000 mg/kg, making the ingot a potential raw material for iron making. The fine particulate-phase material (collected from flue gas) contained abundant Zn (544,000 mg/kg) in the form of ZnO. The content (67.7%) of ZnO was high, so it can be directly sold to refineries. The recovered coarse particulate-phase material, with insufficient amount of ZnO, can be recycled as a feeding material for Zn re-concentration. Therefore, the vitrification process can not only treat hazardous materials but also effectively recover valuable metals. Copyright © 2011 Elsevier B.V. All rights reserved.

  6. Deep cleaning of a metallurgical zinc leaching residue and recovery of valuable metals

    Science.gov (United States)

    Xing, Peng; Ma, Bao-zhong; Zeng, Peng; Wang, Cheng-yan; Wang, Ling; Zhang, Yong-lu; Chen, Yong-qiang; Wang, Shuo; Wang, Qiu-yin

    2017-11-01

    Huge quantities of zinc leaching residues (ZLRs) generated from zinc production are dumped continuously around the world and pose a potential environmental threat because of their considerable amounts of entrained heavy metals (mainly lead). Most ZLRs have not been properly treated and the valuable metals in them have not yet been effectively recovered. Herein, the deep cleaning of a ZLR and recovery of valuable metals via a hydrometallurgical route were investigated. The cleaning process consists of two essential stages: acid leaching followed by calcium chloride leaching. The optimum conditions for extracting zinc, copper, and indium by acid leaching were a sulfuric acid concentration of 200 g·L-1, a liquid/solid ratio of 4:1 (mL/g), a leaching time of 2 h, and a temperature of 90°C. For lead and silver extractions, the optimum conditions were a calcium chloride concentration of 400 g·L-1, a pH value of 1.0, a leaching time of 1 h, and a temperature of 30°C. After calcium chloride leaching, silver and lead were extracted out and the lead was finally recovered as electrolytic lead by electrowinning. The anglesite phase, which poses the greatest potential environmental hazard, was removed from the ZLR after deep cleaning, thus reducing the cost of environmental management of ZLRs. The treatment of chlorine and spent electrolyte generated in the process was discussed.

  7. Analytic plane wave solutions for the quaternionic potential step

    International Nuclear Information System (INIS)

    De Leo, Stefano; Ducati, Gisele C.; Madureira, Tiago M.

    2006-01-01

    By using the mathematical tools recently developed in quaternionic differential operator theory, we solve the Schroedinger equation in the presence of a quaternionic step potential. The analytic solution for the stationary states makes it possible to show explicitly the qualitative and quantitative differences between this quaternionic quantum dynamical system and its complex counterpart. A brief discussion of reflection and transmission times, performed using the stationary phase method, and of their implications for experimental evidence of deviations from standard quantum mechanics is also presented. The analytic solution given in this paper represents a fundamental mathematical tool for finding an analytic approximation to the quaternionic barrier problem (until now solved only by numerical methods).
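    For orientation, the complex counterpart against which the quaternionic solution is compared is the textbook step potential. For a particle of energy E > V₀ incident from the left on a step of height V₀ at x = 0, the standard (complex-QM) stationary solution and amplitudes are (a standard result, not taken from the paper):

```latex
\psi(x) =
\begin{cases}
e^{i k_1 x} + r\, e^{-i k_1 x}, & x < 0,\\
t\, e^{i k_2 x}, & x > 0,
\end{cases}
\qquad
k_1 = \frac{\sqrt{2 m E}}{\hbar}, \quad
k_2 = \frac{\sqrt{2 m (E - V_0)}}{\hbar},
```

and matching the wave function and its derivative at x = 0 gives

```latex
r = \frac{k_1 - k_2}{k_1 + k_2}, \qquad t = \frac{2 k_1}{k_1 + k_2}.
```

The quaternionic step modifies these amplitudes, which is what the paper's analytic solution quantifies.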

  8. Competing on talent analytics.

    Science.gov (United States)

    Davenport, Thomas H; Harris, Jeanne; Shapiro, Jeremy

    2010-10-01

    Do investments in your employees actually affect workforce performance? Who are your top performers? How can you empower and motivate other employees to excel? Leading-edge companies such as Google, Best Buy, Procter & Gamble, and Sysco use sophisticated data-collection technology and analysis to answer these questions, leveraging a range of analytics to improve the way they attract and retain talent, connect their employee data to business performance, differentiate themselves from competitors, and more. The authors present the six key ways in which companies track, analyze, and use data about their people, ranging from a simple baseline of metrics to monitor the organization's overall health to custom modeling for predicting future head count under various "what if" scenarios. They go on to show that companies competing on talent analytics manage data and technology at an enterprise level, support what analytical leaders do, choose realistic targets for analysis, and hire analysts with strong interpersonal skills as well as broad expertise.
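    The "what if" head-count modeling mentioned in the abstract can be as simple as projecting staffing under assumed attrition and hiring rates. A minimal illustrative sketch (all figures are hypothetical, not from the article):

```python
# Hypothetical "what if" head-count projection: given a starting head count,
# a monthly attrition rate, and a monthly hiring plan, project staffing levels.
def project_headcount(start: int, monthly_attrition: float,
                      monthly_hires: int, months: int) -> list[int]:
    """Return the projected head count at the end of each month."""
    levels = []
    headcount = start
    for _ in range(months):
        # Apply attrition, then add the month's new hires.
        headcount = round(headcount * (1 - monthly_attrition)) + monthly_hires
        levels.append(headcount)
    return levels

# Scenario comparison: 2% vs. 4% monthly attrition with the same hiring plan.
base = project_headcount(1000, 0.02, 15, 12)
high = project_headcount(1000, 0.04, 15, 12)
print(base[-1], high[-1])  # the gap shows the sensitivity to attrition
```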

  9. Advanced business analytics

    CERN Document Server

    Lev, Benjamin

    2015-01-01

    The book describes advanced business analytics and shows how to apply them to many different professional areas of engineering and management. Each chapter is contributed by a different author and covers a different area of business analytics. The book connects analytic principles with business practice and provides an interface between the main disciplines of engineering/technology and the organizational, administrative, and planning abilities of management. It also draws on other disciplines such as economics, finance, marketing, behavioral economics, and risk analysis. The book is of special interest to engineers, economists, and researchers developing new advances in engineering management, as well as to practitioners working on the subject.

  10. Consistency of FMEA used in the validation of analytical procedures

    DEFF Research Database (Denmark)

    Oldenhof, M.T.; van Leeuwen, J.F.; Nauta, Maarten

    2011-01-01

    In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection … is always carried out under the supervision of an experienced FMEA facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating …

  11. OOTW Force Design Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bell, R.E.; Hartley, D.S.III; Packard, S.L.

    1999-05-01

    This report documents refined requirements for tools to aid the process of force design in Operations Other Than War (OOTWs). It recommends actions for the creation of one tool and work on other tools relating to mission planning. It also identifies the governmental agencies and commands with interests in each tool, from whom should come the user advisory groups overseeing the respective tool development activities. The understanding of OOTWs and their analytical support requirements has matured to the point where action can be taken in three areas: force design, collaborative analysis, and impact analysis. While the nature of the action and the length of time before complete results can be expected depend on the area, in each case the action should begin immediately. Force design for OOTWs is not a technically difficult process. Like force design for combat operations, it is a process of matching the capabilities of forces against the specified and implied tasks of the operation, considering the constraints of logistics, transport, and force availabilities. However, there is a critical difference that restricts the usefulness of combat force design tools for OOTWs: the combat tools are built to infer non-combat capability requirements from combat capability requirements and cannot reverse the direction of the inference, as is required for OOTWs. Recently, OOTWs have played a larger role in force assessment, system effectiveness and tradeoff analysis, and concept and doctrine development and analysis. In the first Quadrennial Defense Review (QDR), each of the Services created its own OOTW force design tool. Unfortunately, the tools address different parts of the problem and do not coordinate the use of competing capabilities. These tools satisfied the immediate requirements of the QDR but do not provide a long-term cost-effective solution.

  12. An analytic thomism?

    Directory of Open Access Journals (Sweden)

    Daniel Alejandro Pérez Chamorro.

    2012-12-01

    For 50 years, philosophers of the Anglo-Saxon analytic tradition (E. Anscombe, P. Geach, A. Kenny, P. Foot) have tried to follow the school of Thomas Aquinas, which they use as a source for moving beyond Cartesian epistemology and for developing virtue ethics. Recently, J. Haldane has inaugurated a program of "analytical Thomism", whose main result to date has been his theory of mind/world identity. Nevertheless, none of Thomas's admirers has yet found a way to assimilate his metaphysics of being.

  13. Social network data analytics

    CERN Document Server

    Aggarwal, Charu C

    2011-01-01

    Social network analysis applications have experienced tremendous advances within the last few years due in part to increasing trends towards users interacting with each other on the internet. Social networks are organized as graphs, and the data on social networks takes on the form of massive streams, which are mined for a variety of purposes. Social Network Data Analytics covers an important niche in the social network analytics field. This edited volume, contributed by prominent researchers in this field, presents a wide selection of topics on social network data mining such as Structural Pr

  14. News for analytical chemists

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Karlberg, Bo

    2009-01-01

    … welfare. In conjunction with the meeting of the steering committee in Tallinn, Estonia, in April, Mihkel Kaljurand and Mihkel Koel of Tallinn University of Technology organised a successful symposium attended by 51 participants. The symposium illustrated the scientific work of the steering committee, directed to various topics of analytical chemistry. Although affected by the global financial crisis, the Euroanalysis Conference will be held on 6 to 10 September in Innsbruck, Austria. For next year, the programme for the analytical section of the 3rd European Chemistry Congress is in preparation …

  15. Foundations of predictive analytics

    CERN Document Server

    Wu, James

    2012-01-01

    Drawing on the authors' two decades of experience in applied modeling and data mining, Foundations of Predictive Analytics presents the fundamental background required for analyzing data and building models for many practical applications, such as consumer behavior modeling, risk and marketing analytics, and other areas. It also discusses a variety of practical topics that are frequently missing from similar texts. The book begins with the statistical and linear algebra/matrix foundation of modeling methods, from distributions to cumulant and copula functions to Cornish-Fisher expansion and o…
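    The Cornish-Fisher expansion mentioned in the abstract adjusts a normal quantile for skewness and excess kurtosis to approximate the quantile of a non-normal distribution, a staple of risk analytics. A minimal sketch of the standard fourth-moment form (illustrative, not the book's code):

```python
# Cornish-Fisher quantile approximation from a distribution's first four
# moments (mean, std, skewness, excess kurtosis). Standard textbook form.
from statistics import NormalDist

def cornish_fisher_quantile(p: float, mean: float, std: float,
                            skew: float, excess_kurt: float) -> float:
    """Approximate the p-quantile of a distribution with the given moments."""
    z = NormalDist().inv_cdf(p)  # standard normal quantile
    w = (z
         + (z**2 - 1) * skew / 6
         + (z**3 - 3*z) * excess_kurt / 24
         - (2*z**3 - 5*z) * skew**2 / 36)
    return mean + std * w

# With zero skew and zero excess kurtosis it reduces to the normal quantile.
print(cornish_fisher_quantile(0.99, 0.0, 1.0, 0.0, 0.0))  # ~2.326
```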

  16. Supercritical fluid analytical methods

    International Nuclear Information System (INIS)

    Smith, R.D.; Kalinoski, H.T.; Wright, B.W.; Udseth, H.R.

    1988-01-01

    Supercritical fluids are providing the basis for new and improved methods across a range of analytical technologies. New methods are being developed to allow the detection and measurement of compounds that are incompatible with conventional analytical methodologies. Characterization of process and effluent streams for synfuel plants requires instruments capable of detecting and measuring high-molecular-weight compounds, polar compounds, or other materials that are generally difficult to analyze. The purpose of this program is to develop and apply new supercritical fluid techniques for extraction, separation, and analysis. These new technologies will be applied to previously intractable synfuel process materials and to complex mixtures resulting from their interaction with environmental and biological systems.

  17. Analytics Platform for ATLAS Computing Services

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration; Bryant, Lincoln

    2016-01-01

    Big Data technologies have proven to be very useful for the storage, processing, and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide-area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and this analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data, so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search, and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning tools like Spark, Jupyter, R, S...

  18. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    Science.gov (United States)

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications, and numerous biological functions are related to it. However, analytical challenges remain in glycoprotein analysis. To overcome them, many analytical techniques have been developed in recent years. Enrichment methods are used to improve detection sensitivity, while HPLC and mass spectrometry methods have been developed to facilitate the separation of glycopeptides/proteins and to enhance detection, respectively. Fragmentation techniques in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each are highlighted. © 2017 Elsevier Inc. All rights reserved.

  19. Customer Intelligence Analytics on Social Networks

    Directory of Open Access Journals (Sweden)

    Brano MARKIĆ

    2016-08-01

    Discovering the needs, habits, and behavior of consumers is the primary task of marketing analytics. Marketing and analytical skills must be integrated with IT skills. Such knowledge integration allows access to data (structured and unstructured), their analysis, and the discovery of information about the opinions, attitudes, needs, and behavior of customers. The paper sets out the hypothesis that software tools can collect data (messages) from social networks, analyze the content of those messages, and reveal customers' attitudes toward a product, service, or tourist destination, with the ultimate goal of improving customer relations. Experimental results are based on analyzing the content of the social network Facebook using packages and functions of the R language. R showed satisfactory application and development power in the analysis of textual data on social networks for marketing analytics.
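    The content analysis the paper performs in R follows a common pattern: tokenize collected posts and count sentiment-bearing words against a lexicon. A minimal sketch of that pattern in Python (the posts and lexicons below are made-up placeholders, not data from the study):

```python
# Toy content analysis of social-media posts: tokenize and count
# sentiment-bearing words against small illustrative lexicons.
import re
from collections import Counter

POSITIVE = {"great", "love", "excellent", "beautiful"}
NEGATIVE = {"bad", "crowded", "expensive", "disappointing"}

posts = [
    "Love this destination, the beaches are beautiful!",
    "Too crowded and expensive in summer.",
    "Great service, excellent food.",
]

# Lowercase and split each post into word tokens.
tokens = [w for p in posts for w in re.findall(r"[a-z']+", p.lower())]
counts = Counter(tokens)

pos = sum(counts[w] for w in POSITIVE)
neg = sum(counts[w] for w in NEGATIVE)
print(f"positive mentions: {pos}, negative mentions: {neg}")
```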

  20. Data Analytics in CRM Processes: A Literature Review

    Directory of Open Access Journals (Sweden)

    Gončarovs Pāvels

    2017-12-01

    Nowadays, the data scarcity problem has been supplanted by the data deluge problem. Marketers and Customer Relationship Management (CRM) specialists have access to rich data on consumer behaviour. The current challenge is the effective utilisation of these data in CRM processes and the selection of appropriate data analytics techniques. Data analytics techniques help find hidden patterns in data. The present paper explores the characteristics of data analytics as an integrated tool in CRM for sales managers. The paper aims to analyse some of the different analytics methods and tools that can be used for the continuous improvement of CRM processes. A systematic literature review has been conducted to achieve this goal. The results of the review highlight the most frequently considered CRM processes in the context of data analytics.
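    One pattern-finding technique that recurs in the CRM analytics literature is RFM (recency, frequency, monetary) scoring for customer segmentation. A minimal sketch (the transaction records below are hypothetical placeholders, not data from the review):

```python
# RFM scoring: for each customer, compute days since last purchase (recency),
# number of purchases (frequency), and total spend (monetary).
from datetime import date

# (customer_id, purchase_date, amount) -- hypothetical records
transactions = [
    ("A", date(2024, 6, 1), 120.0),
    ("A", date(2024, 6, 20), 80.0),
    ("B", date(2024, 1, 5), 300.0),
    ("C", date(2024, 6, 25), 40.0),
    ("C", date(2024, 5, 2), 60.0),
    ("C", date(2024, 4, 11), 55.0),
]
today = date(2024, 7, 1)

rfm: dict[str, dict[str, float]] = {}
for cid, day, amount in transactions:
    rec = rfm.setdefault(cid, {"recency": float("inf"), "frequency": 0, "monetary": 0.0})
    rec["recency"] = min(rec["recency"], (today - day).days)
    rec["frequency"] += 1
    rec["monetary"] += amount

for cid, rec in sorted(rfm.items()):
    print(cid, rec)
```

Thresholding or ranking these three scores then yields segments such as "recent high spenders" versus "lapsed customers".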