WorldWideScience

Sample records for promises analytical usefulness

  1. Big data analytics in healthcare: promise and potential.

    Science.gov (United States)

    Raghupathi, Wullianallur; Raghupathi, Viju

    2014-01-01

To describe the promise and potential of big data analytics in healthcare. The paper describes the nascent field of big data analytics in healthcare, discusses the benefits, outlines an architectural framework and methodology, describes examples reported in the literature, briefly discusses the challenges, and offers conclusions. The paper provides a broad overview of big data analytics for healthcare researchers and practitioners. Big data analytics in healthcare is evolving into a promising field for providing insight from very large data sets and improving outcomes while reducing costs. Its potential is great; however, there remain challenges to overcome.

  2. Clinical laboratory analytics: Challenges and promise for an emerging discipline

    Directory of Open Access Journals (Sweden)

    Brian H Shirts

    2015-01-01

Full Text Available The clinical laboratory is a major source of health care data. Increasingly these data are being integrated with other data to inform health system-wide actions meant to improve diagnostic test utilization, service efficiency, and "meaningful use." The Academy of Clinical Laboratory Physicians and Scientists hosted a satellite meeting on clinical laboratory analytics in conjunction with their annual meeting on May 29, 2014 in San Francisco. There were 80 registrants for the clinical laboratory analytics meeting. The meeting featured short presentations on current trends in clinical laboratory analytics and several panel discussions on data science in laboratory medicine, laboratory data and its role in the larger healthcare system, integrating laboratory analytics, and data sharing for collaborative analytics. One main goal of the meeting was to provide an open forum for leaders who work with the "big data" that clinical laboratories produce. This article summarizes the proceedings of the meeting and the content discussed.

  3. Big data analytics to improve cardiovascular care: promise and challenges.

    Science.gov (United States)

    Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M

    2016-06-01

    The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, the methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system.

  4. Behavioural health analytics using mobile phones

    Directory of Open Access Journals (Sweden)

    P. Wlodarczak

    2015-07-01

Full Text Available Big Data analytics in healthcare has become a very active area of research, since it promises to reduce costs and improve the quality of care. Behavioural analytics analyses a patient's behavioural patterns with the goal of detecting early whether a patient is becoming symptomatic and triggering treatment even before a disease outbreak happens. Behavioural analytics allows more precise and personalised treatment and can even monitor whole populations for events such as epidemic outbreaks. Given the prevalence of mobile phones, they have been used to monitor the health of patients by analysing their behavioural and movement patterns. Cell phones are always-on devices and are usually close to their users; as such, they can be used as social sensors to create "automated diaries" of their users. Specialised apps passively collect and analyse user data to detect whether a patient shows deviant behaviour indicating that he or she has become symptomatic. These apps first learn a patient's normal daily patterns and alert a health care centre if deviant behaviour is detected. The health care centre can then call the patient and check on his or her well-being. These apps use machine learning techniques for reality mining and predictive analysis. This paper describes some of these techniques that have been adopted recently in eHealth apps.
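The abstract does not name the specific algorithms used; a minimal sketch of the core idea it describes (learning a baseline daily pattern from phone sensor data and flagging days that deviate from it) might look like the following. All data and thresholds here are hypothetical.

```python
import numpy as np

def fit_baseline(daily_patterns):
    """Learn a per-hour baseline (mean and spread) from a history of
    daily activity vectors, e.g. hourly step counts over several weeks."""
    patterns = np.asarray(daily_patterns, dtype=float)
    return patterns.mean(axis=0), patterns.std(axis=0) + 1e-9

def is_deviant(day, mean, std, z_threshold=3.0, hours_required=6):
    """Flag a day as deviant if enough hours fall far outside the baseline."""
    z = np.abs((np.asarray(day, dtype=float) - mean) / std)
    return int((z > z_threshold).sum()) >= hours_required

# Hypothetical data: 28 days of hourly activity, then two test days.
rng = np.random.default_rng(0)
history = rng.normal(loc=100, scale=10, size=(28, 24))
mean, std = fit_baseline(history)

normal_day = rng.normal(loc=100, scale=10, size=24)
sedentary_day = np.full(24, 5.0)  # near-zero activity all day

print(is_deviant(normal_day, mean, std))     # expected: False
print(is_deviant(sedentary_day, mean, std))  # expected: True
```

In a real eHealth app the "deviant" signal would trigger the alert to the health care centre described above; production systems would use richer features (location entropy, call patterns) and a learned model rather than a fixed z-score rule.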

  5. An Investigation to Manufacturing Analytical Services Composition using the Analytical Target Cascading Method.

    Science.gov (United States)

    Tien, Kai-Wen; Kulvatunyou, Boonserm; Jung, Kiwook; Prabhu, Vittaldas

    2017-01-01

As cloud computing is increasingly adopted, the trend is to offer software functions as modular services and compose them into larger, more meaningful ones. This trend is attractive for analytical problems in the manufacturing system design and performance improvement domain because 1) finding a global optimum for the system is a complex problem; and 2) sub-problems are typically compartmentalized by the organizational structure. However, solving sub-problems with independent services can result in a sub-optimal solution at the system level. This paper investigates a technique called Analytical Target Cascading (ATC) for coordinating the optimization of loosely coupled sub-problems, each of which may be modularly formulated by a different department and solved by a modular analytical service. The results demonstrate that ATC is a promising method in that it offers system-level optimal solutions that can scale up by exploiting distributed and modular execution, while allowing easier management of the problem formulation.
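The abstract does not give the ATC formulation itself. As an illustration of the coordination idea, here is a toy two-subsystem problem (all objectives and values hypothetical): each "department" minimizes its local objective plus a quadratic penalty toward a shared target, and the system level updates the target to reconcile the responses, with the penalty weight gradually increased.

```python
# Analytical Target Cascading sketch: two sub-problems share one coupled
# variable. Sub-problem 1 trades off its local objective (x-3)^2 against
# matching the system target t; sub-problem 2 does the same for (y-5)^2.
# The system level updates t to minimize the mismatch penalty. Closed-form
# updates keep the sketch dependency-free; the consistent optimum is x=y=4.

def atc_solve(weights=(1, 4, 16, 64), inner_iters=200):
    t = 0.0
    for w in weights:                     # gradually tighten the penalty
        for _ in range(inner_iters):
            x = (3 + w * t) / (1 + w)     # sub-problem 1 response
            y = (5 + w * t) / (1 + w)     # sub-problem 2 response
            t = (x + y) / 2               # system-level target update
    return t, x, y

t, x, y = atc_solve()
print(round(t, 2), round(x, 2), round(y, 2))  # all close to 4.0
```

The same loop structure carries over when each sub-problem is an opaque analytical service: only targets and responses cross organizational boundaries, which is what makes ATC attractive for the service-composition setting the paper studies.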

  6. The Promise and Peril of Predictive Analytics in Higher Education: A Landscape Analysis

    Science.gov (United States)

    Ekowo, Manuela; Palmer, Iris

    2016-01-01

    Predictive analytics in higher education is a hot-button topic among educators and administrators as institutions strive to better serve students by becoming more data-informed. In this paper, the authors describe how predictive analytics are used in higher education to identify students who need extra support, steer students in courses they will…

  7. Many-core graph analytics using accelerated sparse linear algebra routines

    Science.gov (United States)

    Kozacik, Stephen; Paolini, Aaron L.; Fox, Paul; Kelmelis, Eric

    2016-05-01

Graph analytics is a key component in identifying emerging trends and threats in many real-world applications. Large-scale graph analytics frameworks provide a convenient and highly scalable platform for developing algorithms to analyze large datasets. Although conceptually scalable, these techniques exhibit poor performance on modern computational hardware. Another model of graph computation has emerged that promises improved performance and scalability by using abstract linear algebra operations as the basis for graph analysis, as laid out by the GraphBLAS standard. By using sparse linear algebra as the basis, existing highly efficient algorithms can be adapted to perform computations on the graph. This approach, however, is often less intuitive to graph analytics experts, who are accustomed to vertex-centric APIs such as Giraph, GraphX, and Tinkerpop. We are developing an implementation of the high-level operations supported by these APIs in terms of linear algebra operations. This implementation is backed by many-core implementations of the fundamental GraphBLAS operations required, and offers the advantages of both the intuitive programming model of a vertex-centric API and the performance of a sparse linear algebra implementation. This technology can reduce the number of nodes required, as well as the run time, for a graph analysis problem, enabling customers to perform more complex analysis with less hardware at lower cost. All of this can be accomplished without requiring customers to make any changes to their analytics code, thanks to compatibility with existing graph APIs.
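The GraphBLAS idea the abstract describes, graph traversal expressed as sparse linear algebra, can be sketched as a breadth-first search driven by repeated sparse matrix-vector products (here with SciPy standing in for a many-core GraphBLAS backend; the graph is a made-up example):

```python
import numpy as np
from scipy.sparse import csr_matrix

def bfs_levels(adj: csr_matrix, source: int) -> np.ndarray:
    """BFS via sparse matrix-vector products: each multiply by the
    adjacency matrix advances the frontier one hop (GraphBLAS style)."""
    n = adj.shape[0]
    levels = np.full(n, -1)
    frontier = np.zeros(n, dtype=bool)
    frontier[source] = True
    level = 0
    while frontier.any():
        levels[frontier] = level
        # One hop: successors of the frontier, masked by unvisited vertices.
        reached = adj.T.dot(frontier.astype(np.int8)) > 0
        frontier = reached & (levels == -1)
        level += 1
    return levels

# Small directed graph: edges 0->1, 1->2, 0->3.
rows, cols = [0, 1, 0], [1, 2, 3]
adj = csr_matrix((np.ones(3, dtype=np.int8), (rows, cols)), shape=(4, 4))
print(bfs_levels(adj, 0))  # [0 1 2 1]
```

The mask step (`levels == -1`) is the same masking primitive the GraphBLAS standard provides natively; a vertex-centric API layered on top, as the abstract proposes, would generate exactly this kind of multiply-and-mask loop under the hood.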

  8. Laser-based analytical monitoring in nuclear-fuel processing plants

    International Nuclear Information System (INIS)

    Hohimer, J.P.

    1978-09-01

The use of laser-based analytical methods in nuclear-fuel processing plants is considered. The species and locations for accountability, process control, and effluent control measurements in the Coprocessing, Thorex, and reference Purex fuel processing operations are identified and the conventional analytical methods used for these measurements are summarized. The laser analytical methods based upon Raman, absorption, fluorescence, and nonlinear spectroscopy are reviewed and evaluated for their use in fuel processing plants. After a comparison of the capabilities of the laser-based and conventional analytical methods, the promising areas of application of the laser-based methods in fuel processing plants are identified.

  9. Visual analytics in medical education: impacting analytical reasoning and decision making for quality improvement.

    Science.gov (United States)

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2015-01-01

The medical curriculum is the main tool representing the entire undergraduate medical education. Due to its complexity and multilayered structure, it is of limited use to teachers in medical education for quality improvement purposes. In this study we evaluated three visualizations of curriculum data from a pilot course, using teachers from an undergraduate medical program and applying visual analytics methods. We found that visual analytics can positively impact analytical reasoning and decision making in medical education by realizing variables capable of enhancing human perception and cognition of complex curriculum data. The positive results, derived from our small-scale evaluation of a medical curriculum, signify the need to extend this method to an entire medical curriculum. As our approach sustains low levels of complexity, it opens a promising new direction in medical education informatics research.

  10. Big Data and Predictive Analytics in Health Care.

    Science.gov (United States)

    Dhar, Vasant

    2014-09-01

    Predictive analytics show great promise in health care but face some serious hurdles for widespread adoption. I discuss the state of the art of predictive health-care analytics using the clinical arena as an example and discuss how the outputs of predictive systems could be made actionable through differentiated processes that encourage prevention. Such systems have the potential to minimize health risk at the population and individual levels through more personalized health-care delivery.

  11. Features Students Really Expect from Learning Analytics

    Science.gov (United States)

    Schumacher, Clara; Ifenthaler, Dirk

    2016-01-01

    In higher education settings more and more learning is facilitated through online learning environments. To support and understand students' learning processes better, learning analytics offers a promising approach. The purpose of this study was to investigate students' expectations toward features of learning analytics systems. In a first…

  12. Using linked data in Learning Analytics

    NARCIS (Netherlands)

d'Aquin, Mathieu; Dietze, Stefan; Herder, Eelco; Drachsler, Hendrik; Taibi, Davide

    2014-01-01

    d’Aquin, M., Dietze, S., Herder, E., Drachsler, H., & Taibi, D. (2014). Using linked data in learning analytics. eLearning Papers. Nr. 36/2. ISSN: 1887-1542. http://www.openeducationeuropa.eu/en/article/Using-linked-data-in-Learning-Analytics?paper=134810

  13. The use of cryogenic helium for classical turbulence: Promises and hurdles

    International Nuclear Information System (INIS)

    Niemela, J.J.; Sreenivasan, K.R.

    2006-12-01

    Fluid turbulence is a paradigm for non-linear systems with many degrees of freedom and important in numerous applications. Because the analytical understanding of the equations of motion is poor, experiments and, lately, direct numerical simulations of the equations of motion, have been fundamental to making progress. In this vein, a concerted experimental effort has been made to take advantage of the unique properties of liquid and gaseous helium at low temperatures near or below the critical point. We discuss the promise and impact of results from recent helium experiments and identify the current technical barriers which can perhaps be removed by low temperature researchers. We focus mainly on classical flows that utilize helium above the lambda line, but touch on those aspects below that exhibit quasi-classical behavior. (author)

  14. 3D Printed Paper-Based Microfluidic Analytical Devices

    Directory of Open Access Journals (Sweden)

    Yong He

    2016-06-01

Full Text Available As a pump-free and lightweight analytical tool, paper-based microfluidic analytical devices (μPADs) are attracting more and more interest. If the flow speed of a μPAD can be programmed, analytical sequences can be designed, making such devices still more useful. This report presents a novel μPAD, driven by the capillary force of cellulose powder and printed with a desktop three-dimensional (3D) printer, which has some promising features, such as easy fabrication and programmable flow speed. First, a suitably sized substrate with open microchannels on its surface is printed. Next, the surface of the substrate is covered with a thin layer of polydimethylsiloxane (PDMS) to seal the micro gaps caused by 3D printing. Then, the microchannels are filled with a mixture of cellulose powder and deionized water in an appropriate proportion. After drying in an oven at 60 °C for 30 min, the device is ready for use. Because channels of different depths can easily be printed, the capillary flow speed of the cellulose powder in the microchannels can be programmed. A series of microfluidic analytical experiments, including quantitative analysis of nitrite ion and fabrication of a T-sensor, were used to demonstrate its capability. As the desktop 3D printer (D3DP) is very cheap and accessible, this device can be rapidly printed in the field at low cost and has promising potential in point-of-care (POC) systems or as a lightweight platform for analytical chemistry.

  15. Using Linked Data in Learning Analytics

    NARCIS (Netherlands)

    d'Aquin, Mathieu; Dietze, Stefan; Drachsler, Hendrik; Herder, Eelco

    2013-01-01

    d'Aquin, M., Dietze, S., Drachsler, H., & Herder, E. (2013, April). Using Linked Data in Learning Analytics. Tutorial given at LAK 2013, the Third Conference on Learning Analytics and Knowledge, Leuven, Belgium.

  16. Designing for Student-Facing Learning Analytics

    Science.gov (United States)

    Kitto, Kirsty; Lupton, Mandy; Davis, Kate; Waters, Zak

    2017-01-01

    Despite a narrative that sees learning analytics (LA) as a field that aims to enhance student learning, few student-facing solutions have emerged. This can make it difficult for educators to imagine how data can be used in the classroom, and in turn diminishes the promise of LA as an enabler for encouraging important skills such as sense-making,…

  17. Decisions through data: analytics in healthcare.

    Science.gov (United States)

    Wills, Mary J

    2014-01-01

    The amount of data in healthcare is increasing at an astonishing rate. However, in general, the industry has not deployed the level of data management and analysis necessary to make use of those data. As a result, healthcare executives face the risk of being overwhelmed by a flood of unusable data. In this essay I argue that, in order to extract actionable information, leaders must take advantage of the promise of data analytics. Small data, predictive modeling expansion, and real-time analytics are three forms of data analytics. On the basis of my analysis for this study, I recommend all three for adoption. Recognizing the uniqueness of each organization's situation, I also suggest that practices, hospitals, and healthcare systems examine small data and conduct real-time analytics and that large-scale organizations managing populations of patients adopt predictive modeling. I found that all three solutions assist in the collection, management, and analysis of raw data to improve the quality of care and decrease costs.

  18. Group Analytic Psychotherapy in Brazil.

    Science.gov (United States)

    Penna, Carla; Castanho, Pablo

    2015-10-01

    Group analytic practice in Brazil began quite early. Highly influenced by the Argentinean Pichon-Rivière, it enjoyed a major development from the 1950s to the early 1980s. Beginning in the 1970s, different factors undermined its development and eventually led to its steep decline. From the mid 1980s on, the number of people looking for either group analytic psychotherapy or group analytic training decreased considerably. Group analytic psychotherapy societies struggled to survive and most of them had to close their doors in the 1990s and the following decade. Psychiatric reform and the new public health system have stimulated a new demand for groups in Brazil. Developments in the public and not-for-profit sectors, combined with theoretical and practical research in universities, present promising new perspectives for group analytic psychotherapy in Brazil nowadays.

  19. Autonomic urban traffic optimization using data analytics

    OpenAIRE

    Garriga Porqueras, Albert

    2017-01-01

    This work focuses on a smart mobility use case where real-time data analytics on traffic measures is used to improve mobility in the event of a perturbation causing congestion in a local urban area. The data monitored is analysed in order to identify patterns that are used to properly reconfigure traffic lights. The monitoring and data analytics infrastructure is based on a hierarchical distributed architecture that allows placing data analytics processes such as machine learning close to the...

  20. Guided Text Search Using Adaptive Visual Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; Symons, Christopher T [ORNL; Senter, James K [ORNL; DeNap, Frank A [ORNL

    2012-10-01

This research demonstrates the promise of augmenting interactive visualizations with semi-supervised machine learning techniques to improve the discovery of significant associations and insights in the search and analysis of textual information. More specifically, we have developed a system called Gryffin that hosts a unique collection of techniques facilitating individualized investigative search over an ever-changing set of analytical questions against an indexed collection of open-source documents related to critical national infrastructure. The Gryffin client hosts dynamic displays of the search results via focus+context record listings, temporal timelines, term-frequency views, and multiple coordinated views. Furthermore, as the analyst interacts with the display, the interactions are recorded and used to label the search records. These labeled records then drive semi-supervised machine learning algorithms that re-rank the unlabeled search records so that potentially relevant records move to the top of the listing. Gryffin is described in the context of the daily tasks encountered at the US Department of Homeland Security's Fusion Center, with whom we are collaborating in its development. The resulting system is capable of addressing the analysts' information overload that can be directly attributed to the deluge of information that must be addressed in the search and investigative analysis of textual information.
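Gryffin's internals are not public; the loop the abstract describes (interaction-derived labels feeding a semi-supervised learner that re-ranks the unlabeled records) can be sketched with scikit-learn's LabelSpreading. The feature vectors and labels below are entirely hypothetical stand-ins for document embeddings and analyst clicks.

```python
import numpy as np
from sklearn.semi_supervised import LabelSpreading

rng = np.random.default_rng(1)

# Hypothetical document feature vectors (e.g. projected TF-IDF): two
# clusters, one of which the analyst's interactions mark as relevant.
relevant = rng.normal(loc=(2, 2), scale=0.4, size=(20, 2))
other = rng.normal(loc=(-2, -2), scale=0.4, size=(20, 2))
X = np.vstack([relevant, other])

# -1 marks unlabeled records; the analyst has labeled only three of each.
y = np.full(40, -1)
y[:3] = 1      # clicked / flagged relevant
y[20:23] = 0   # skipped / marked irrelevant

model = LabelSpreading(kernel="knn", n_neighbors=5).fit(X, y)
scores = model.label_distributions_[:, 1]   # P(relevant) per record

# Re-rank: unlabeled records most likely relevant float to the top.
ranking = np.argsort(-scores)
print(all(r < 20 for r in ranking[:20]))  # relevant cluster ranks first
```

Each new analyst interaction would add labels to `y` and trigger a refit, which is the incremental re-ranking behaviour the abstract attributes to Gryffin.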

  1. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics value, volume, velocity, variety, veracity, and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data, such as various omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data, and electronic health records. We underline the challenging issues of big data privacy and security. With regard to the big data characteristics, some directions for choosing suitable and promising open-source distributed data-processing software platforms are given.

  2. Comparison of maximum runup through analytical and numerical approaches for different fault parameters estimates

    Science.gov (United States)

    Kanoglu, U.; Wronna, M.; Baptista, M. A.; Miranda, J. M. A.

    2017-12-01

The one-dimensional analytical runup theory, in combination with near-shore synthetic waveforms, is a promising tool for tsunami rapid early warning systems. Its application to realistic cases with complex bathymetry and initial wave conditions from inverse modelling has shown that maximum runup values can be estimated reasonably well. In this study we generate simplified bathymetric domains that resemble realistic near-shore features. We investigate the sensitivity of the analytical runup formulae to variations in fault source parameters and near-shore bathymetric features. To do this, we systematically vary the fault plane parameters to compute the initial tsunami wave condition. Subsequently, we use the initial conditions to run a numerical tsunami model on a coupled system of four nested grids and compare the results to the analytical estimates. Varying the dip angle of the fault plane showed that the analytical estimates differ by less than 10% for angles of 5-45 degrees in a simple bathymetric domain. These results show that the use of analytical formulae for fast runup estimates is a very promising approach in a simple bathymetric domain and might be implemented in hazard mapping and early warning.
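The abstract does not reproduce the runup formulae it tests. One widely used closed-form result from the same one-dimensional theory is Synolakis' (1987) runup law for a non-breaking solitary wave on a plane beach; whether it is the formula used in this particular study is an assumption, and the numbers below are purely illustrative.

```python
import math

def solitary_runup(H, d, beta_deg):
    """Synolakis (1987) runup law for a non-breaking solitary wave of
    height H in offshore depth d climbing a plane beach of slope beta:
        R/d = 2.831 * sqrt(cot(beta)) * (H/d)**(5/4)
    Returns the maximum runup R in the same units as d."""
    cot_beta = 1.0 / math.tan(math.radians(beta_deg))
    return d * 2.831 * math.sqrt(cot_beta) * (H / d) ** 1.25

# Illustrative case: a 0.2 m wave in 10 m of water on a 10-degree beach.
R = solitary_runup(H=0.2, d=10.0, beta_deg=10.0)
print(round(R, 2))  # ≈ 0.51 (metres)
```

The appeal for early warning is exactly this: once an inverse model supplies the near-shore wave height, a formula of this kind gives a runup estimate in microseconds, versus minutes for the nested-grid numerical model used as the benchmark above.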

  3. Fast analytical scatter estimation using graphics processing units.

    Science.gov (United States)

    Ingleby, Harry; Lippuner, Jonas; Rickey, Daniel W; Li, Yue; Elbakri, Idris

    2015-01-01

To develop a fast patient-specific analytical estimator of first-order Compton and Rayleigh scatter in cone-beam computed tomography, implemented using graphics processing units. The authors developed an analytical estimator for first-order Compton and Rayleigh scatter in a cone-beam computed tomography geometry. The estimator was coded using NVIDIA's CUDA environment for execution on an NVIDIA graphics processing unit. Performance of the analytical estimator was validated by comparison with high-count Monte Carlo simulations for two different numerical phantoms. Monoenergetic analytical simulations were compared with monoenergetic and polyenergetic Monte Carlo simulations. Analytical and Monte Carlo scatter estimates were compared both qualitatively, from visual inspection of images and profiles, and quantitatively, using a scaled root-mean-square difference metric. Reconstruction of simulated cone-beam projection data of an anthropomorphic breast phantom illustrated the potential of this method as a component of a scatter correction algorithm. The monoenergetic analytical and Monte Carlo scatter estimates showed very good agreement. The monoenergetic analytical estimates showed good agreement for Compton single scatter and reasonable agreement for Rayleigh single scatter when compared with polyenergetic Monte Carlo estimates. For a voxelized phantom with dimensions 128 × 128 × 128 voxels and a detector with 256 × 256 pixels, the analytical estimator required 669 seconds for a single projection, using a single NVIDIA 9800 GX2 video card. Accounting for first-order scatter in cone-beam image reconstruction improves the contrast-to-noise ratio of the reconstructed images. The analytical scatter estimator, implemented using graphics processing units, provides rapid and accurate estimates of single scatter and, with further acceleration and a method to account for multiple scatter, may be useful for practical scatter correction schemes.

  4. Using Google Analytics to evaluate the impact of the CyberTraining project.

    Science.gov (United States)

    McGuckin, Conor; Crowley, Niall

    2012-11-01

    A focus on results and impact should be at the heart of every project's approach to research and dissemination. This article discusses the potential of Google Analytics (GA: http://google.com/analytics ) as an effective resource for measuring the impact of academic research output and understanding the geodemographics of users of specific Web 2.0 content (e.g., intervention and prevention materials, health promotion and advice). This article presents the results of GA analyses as a resource used in measuring the impact of the EU-funded CyberTraining project, which provided a well-grounded, research-based training manual on cyberbullying for trainers through the medium of a Web-based eBook ( www.cybertraining-project.org ). The training manual includes review information on cyberbullying, its nature and extent across Europe, analyses of current projects, and provides resources for trainers working with the target groups of pupils, parents, teachers, and other professionals. Results illustrate the promise of GA as an effective tool for measuring the impact of academic research and project output with real potential for tracking and understanding intra- and intercountry regional variations in the uptake of prevention and intervention materials, thus enabling precision focusing of attention to those regions.

  5. Use of analytical aids for accident management

    International Nuclear Information System (INIS)

    Ward, L.W.

    1991-01-01

The use of analytical aids by utility technical support teams can enhance the staff's ability to manage accidents. Since instrumentation is exposed to environments beyond design-basis conditions, instruments may provide ambiguous information or may even fail. While it is most likely that many instruments will remain operable, their ability to provide unambiguous information needed for the management of beyond-design-basis events and severe accidents is questionable. Furthermore, given these limitations in instrumentation, the need to ascertain and confirm current plant status and forecast future behavior to effectively manage accidents at nuclear facilities requires a computational capability to simulate the thermal and hydraulic behavior in the primary, secondary, and containment systems. With the need to extend the current preventive approach in accident management to include mitigative actions, analytical aids could be used to further enhance the current capabilities at nuclear facilities. This need for computational or analytical aids is supported by a review of the candidate accident management strategies discussed in NUREG/CR-5474. Based on that review, two major analytical aids are considered necessary to support the implementation and monitoring of many of the strategies in this document: (1) an analytical aid to predict reactor coolant and secondary system behavior under LOCA conditions, and (2) an analytical aid to predict containment pressure and temperature response with a steam, air, and noncondensable gas mixture present.

  6. Multifunctional nanoparticles: Analytical prospects

    International Nuclear Information System (INIS)

    Dios, Alejandro Simon de; Diaz-Garcia, Marta Elena

    2010-01-01

Multifunctional nanoparticles are among the most exciting nanomaterials with promising applications in analytical chemistry. These applications include (bio)sensing, (bio)assays, catalysis and separations. Although most of these applications are based on the magnetic, optical and electrochemical properties of multifunctional nanoparticles, other aspects such as the synergistic effect of the functional groups and the amplification effect associated with the nanoscale dimension have also been observed. Considering not only the nature of the raw material but also the shape, there is a huge variety of nanoparticles. In this review only magnetic nanoparticles, quantum dots, gold nanoparticles, carbon and inorganic nanotubes, as well as silica, titania and gadolinium oxide nanoparticles are addressed. This review presents a narrative summary of the use of multifunctional nanoparticles for analytical applications, along with a discussion of some critical challenges existing in the field and possible solutions that have been or are being developed to overcome these challenges.

  7. Bias Assessment of General Chemistry Analytes using Commutable Samples.

    Science.gov (United States)

    Koerbin, Gus; Tate, Jillian R; Ryan, Julie; Jones, Graham Rd; Sikaris, Ken A; Kanowski, David; Reed, Maxine; Gill, Janice; Koumantakis, George; Yen, Tina; St John, Andrew; Hickman, Peter E; Simpson, Aaron; Graham, Peter

    2014-11-01

Harmonisation of reference intervals for routine general chemistry analytes has been a goal for many years, but analytical bias may prevent it. To determine whether analytical bias is present when comparing methods, commutable samples, i.e. samples that have the same properties as the clinical samples routinely analysed, should be used as reference samples to eliminate the possibility of matrix effects. The use of commutable samples has improved the identification of unacceptable analytical performance in the Netherlands and Spain. The International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) has undertaken a pilot study using commutable samples in an attempt not only to determine country-specific reference intervals but also to make them comparable between countries. Australia and New Zealand, through the Australasian Association of Clinical Biochemists (AACB), have also undertaken an assessment of analytical bias using commutable samples and determined that, of the 27 general chemistry analytes studied, 19 showed between-method biases sufficiently small not to prevent harmonisation of reference intervals. Applying evidence-based approaches, including the determination of analytical bias using commutable material, is necessary when seeking to harmonise reference intervals.
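The bias assessment the abstract describes reduces to a simple comparison once commutable samples remove matrix effects: measure the same samples by each method and compare the mean relative difference against an allowable limit. The sketch below uses made-up results and an arbitrary 3% criterion; the actual AACB limits were analyte-specific.

```python
import numpy as np

# Hypothetical results for one analyte measured on commutable samples:
# a reference method and two field methods (same samples, same units).
reference = np.array([4.1, 5.6, 7.2, 9.0, 11.3])
method_a  = np.array([4.2, 5.7, 7.3, 9.2, 11.5])   # small positive bias
method_b  = np.array([4.8, 6.5, 8.3, 10.4, 13.0])  # large positive bias

def mean_relative_bias(method, reference):
    """Mean bias of a method vs. the reference, as a percentage."""
    return 100.0 * np.mean((method - reference) / reference)

# Illustrative harmonisation criterion: between-method bias under 3 %.
LIMIT_PCT = 3.0
for name, vals in [("A", method_a), ("B", method_b)]:
    bias = mean_relative_bias(vals, reference)
    verdict = "OK for harmonisation" if abs(bias) <= LIMIT_PCT else "too large"
    print(f"method {name}: bias {bias:+.1f}% ({verdict})")
```

In the study's terms, an analyte behaves like method A for most participating laboratories (bias small enough to share reference intervals) or like method B (bias blocks harmonisation), which is how 19 of the 27 analytes passed.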

  8. Analytical model for Stirling cycle machine design

    Energy Technology Data Exchange (ETDEWEB)

    Formosa, F. [Laboratoire SYMME, Universite de Savoie, BP 80439, 74944 Annecy le Vieux Cedex (France); Despesse, G. [Laboratoire Capteurs Actionneurs et Recuperation d' Energie, CEA-LETI-MINATEC, Grenoble (France)

    2010-10-15

In order to further study the promising free-piston Stirling engine architecture, there is a need for an analytical thermodynamic model that can be used in a dynamical analysis for preliminary design. To arrive at more realistic values, the model has to take into account the heat losses and irreversibilities of the engine. An analytical model has been developed which encompasses the critical flaws of the regenerator as well as the heat exchanger effectivenesses. This model has been validated using the whole range of experimental data available from the General Motors GPU-3 Stirling engine prototype. The effects of the technological and operating parameters on Stirling engine performance have been investigated. In addition to the influence of the regenerator, the effect of the cooler effectiveness is underlined. (author)

  9. Online Programs as Tools to Improve Parenting: A meta-analytic review

    NARCIS (Netherlands)

    Hermanns, J.M.A.; Fukkink, R.G.; Nieuwboer, C.C.

    2013-01-01

    Background. A number of parenting programs, aimed at improving parenting competencies, have recently been adapted or designed with the use of online technologies. Although web-based services have been claimed to hold promise for parent support, a meta-analytic review of online parenting interventions

  10. Online programs as tools to improve parenting: A meta-analytic review

    NARCIS (Netherlands)

    Nieuwboer, C.C.; Fukkink, R.G.; Hermanns, J.M.A.

    2013-01-01

    Background: A number of parenting programs, aimed at improving parenting competencies, have recently been adapted or designed with the use of online technologies. Although web-based services have been claimed to hold promise for parent support, a meta-analytic review of online parenting

  11. Analytical detection techniques for droplet microfluidics—A review

    International Nuclear Information System (INIS)

    Zhu, Ying; Fang, Qun

    2013-01-01

    Highlights: •This is the first review focused on analytical techniques for droplet-based microfluidics. •We summarize the analytical methods used in droplet-based microfluidic systems. •We discuss the advantages and disadvantages of each method through its applications. •We also discuss future development directions of analytical methods for droplet-based microfluidic systems. -- Abstract: In the last decade, droplet-based microfluidics has undergone rapid progress in the fields of single-cell analysis, digital PCR, protein crystallization, and high-throughput screening. It has proved to be a promising platform for performing chemical and biological experiments with ultra-small volumes (picoliter to nanoliter) and ultra-high throughput. The ability to analyze droplet contents qualitatively and quantitatively plays an increasing role in the development and application of droplet-based microfluidic systems. In this review, we summarize the analytical detection techniques used in droplet systems and discuss the advantages and disadvantages of each technique through its applications. The techniques covered include bright-field microscopy, fluorescence microscopy, laser-induced fluorescence, Raman spectroscopy, electrochemistry, capillary electrophoresis, mass spectrometry, nuclear magnetic resonance spectroscopy, absorption detection, chemiluminescence, and sample pretreatment techniques. The importance of analytical detection techniques in enabling new applications is highlighted. We also discuss future development directions of analytical detection techniques for droplet-based microfluidic systems.

  12. Let's Not Forget: Learning Analytics Are about Learning

    Science.gov (United States)

    Gašević, Dragan; Dawson, Shane; Siemens, George

    2015-01-01

    The analysis of data collected from the interaction of users with educational and information technology has attracted much attention as a promising approach for advancing our understanding of the learning process. This promise motivated the emergence of the new research field, learning analytics, and its closely related discipline, educational…

  13. Advances in analytical tools for high throughput strain engineering

    DEFF Research Database (Denmark)

    Marcellin, Esteban; Nielsen, Lars Keld

    2018-01-01

    The emergence of inexpensive, base-perfect genome editing is revolutionising biology. Modern industrial biotechnology exploits the advances in genome editing in combination with automation, analytics and data integration to build high-throughput automated strain engineering pipelines also known as biofoundries. Biofoundries replace the slow and inconsistent artisanal processes used to build microbial cell factories with an automated design–build–test cycle, considerably reducing the time needed to deliver commercially viable strains. Testing and hence learning remains relatively shallow, but recent advances in analytical chemistry promise to increase the depth of characterization possible. Analytics combined with models of cellular physiology in automated systems biology pipelines should enable deeper learning and hence a steeper pitch of the learning cycle. This review explores the progress...

  14. Analytical detection methods for irradiated foods

    International Nuclear Information System (INIS)

    1991-03-01

    The present publication reviews the scientific literature on the analytical identification of foods treated with ionizing radiation and the quantitative determination of the absorbed dose of radiation. Because the chemical changes resulting from irradiation are extremely small, or because no chemical change is specific to irradiation, only a few methods for the quantitative determination of absorbed dose have shown promise to date. On the other hand, the present review has identified several possible methods which could be used, following further research and testing, for the identification of irradiated foods. An IAEA Co-ordinated Research Programme on Analytical Detection Methods for Irradiation Treatment of Food ('ADMIT'), established in 1990, is currently investigating many of the methods cited in the present document. Refs and tab

  15. Manufacturing data analytics using a virtual factory representation.

    Science.gov (United States)

    Jain, Sanjay; Shao, Guodong; Shin, Seung-Jun

    2017-01-01

    Large manufacturers have been using simulation to support decision-making for design and production. However, with the advancement of technologies and the emergence of big data, simulation can be utilised to perform and support data analytics for associated performance gains. This requires not only significant model development expertise, but also huge data collection and analysis efforts. This paper presents an approach within the frameworks of Design Science Research Methodology and prototyping to address the challenge of increasing the use of modelling, simulation and data analytics in manufacturing via reduction of the development effort. Manufacturing simulation models are presented both as data analytics applications in themselves and as support for other data analytics applications, serving as data generators and as a tool for validation. The virtual factory concept is presented as the vehicle for manufacturing modelling and simulation. The virtual factory goes beyond traditional simulation models of factories to include multi-resolution modelling capabilities, thus allowing analysis at varying levels of detail. A path is proposed for implementation of the virtual factory concept that builds on developments in technologies and standards. A virtual machine prototype is provided as a demonstration of the use of a virtual representation for manufacturing data analytics.

  16. Competing on analytics.

    Science.gov (United States)

    Davenport, Thomas H

    2006-01-01

    We all know the power of the killer app. It's not just a support tool; it's a strategic weapon. Companies questing for killer apps generally focus all their firepower on the one area that promises to create the greatest competitive advantage. But a new breed of organization has upped the stakes: Amazon, Harrah's, Capital One, and the Boston Red Sox have all dominated their fields by deploying industrial-strength analytics across a wide variety of activities. At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the few remaining points of differentiation--and analytics competitors wring every last drop of value from those processes. Employees hired for their expertise with numbers or trained to recognize their importance are armed with the best evidence and the best quantitative tools. As a result, they make the best decisions. In companies that compete on analytics, senior executives make it clear--from the top down--that analytics is central to strategy. Such organizations launch multiple initiatives involving complex data and statistical analysis, and quantitative activity is managed at the enterprise (not departmental) level. In this article, professor Thomas H. Davenport lays out the characteristics and practices of these statistical masters and describes some of the very substantial changes other companies must undergo in order to compete on quantitative turf. As one would expect, the transformation requires a significant investment in technology, the accumulation of massive stores of data, and the formulation of company-wide strategies for managing the data. But, at least as important, it also requires executives' vocal, unswerving commitment and willingness to change the way employees think, work, and are treated.

  17. Aptamer-Based Analysis: A Promising Alternative for Food Safety Control

    Directory of Open Access Journals (Sweden)

    Sonia Amaya-González

    2013-11-01

    Ensuring food safety is nowadays a top priority of authorities and professional players in the food supply chain. One of the key challenges in determining the safety of food and guaranteeing a high level of consumer protection is the availability of fast, sensitive and reliable analytical methods to identify specific hazards associated with food before they become a health problem. The limitations of existing methods have encouraged the development of new technologies, among them biosensors. Success in biosensor design depends largely on the development of novel receptors with enhanced affinity for the target, while being stable and economical. Aptamers fulfill these characteristics, and thus have surfaced as promising alternatives to natural receptors. This review describes analytical strategies developed so far using aptamers for the control of pathogens, allergens, adulterants, toxins and other forbidden contaminants to ensure food safety. The main progress to date is presented, highlighting potential prospects for the future.

  18. Retail video analytics: an overview and survey

    Science.gov (United States)

    Connell, Jonathan; Fan, Quanfu; Gabbur, Prasad; Haas, Norman; Pankanti, Sharath; Trinh, Hoang

    2013-03-01

    Today, retail video analytics has gone beyond the traditional domain of security and loss prevention by providing retailers insightful business intelligence such as store traffic statistics and queue data. Such information allows for enhanced customer experience, optimized store performance, reduced operational costs, and ultimately higher profitability. This paper gives an overview of various camera-based applications in retail as well as the state-of-the-art computer vision techniques behind them. It also presents some of the promising technical directions for exploration in retail video analytics.

  19. Analytical methods used at model facility

    International Nuclear Information System (INIS)

    Wing, N.S.

    1984-01-01

    A description of analytical methods used at the model LEU Fuel Fabrication Facility is presented. The methods include gravimetric uranium analysis, isotopic analysis, fluorimetric analysis, and emission spectroscopy

  20. Untangling Slab Dynamics Using 3-D Numerical and Analytical Models

    Science.gov (United States)

    Holt, A. F.; Royden, L.; Becker, T. W.

    2016-12-01

    Increasingly sophisticated numerical models have enabled us to make significant strides in identifying the key controls on how subducting slabs deform. For example, 3-D models have demonstrated that subducting plate width, and the related strength of toroidal flow around the plate edge, exerts a strong control on both the curvature and the rate of migration of the trench. However, the results of numerical subduction models can be difficult to interpret, and many first order dynamics issues remain at least partially unresolved. Such issues include the dominant controls on trench migration, the interdependence of asthenospheric pressure and slab dynamics, and how nearby slabs influence each other's dynamics. We augment 3-D, dynamically evolving finite element models with simple, analytical force-balance models to distill the physics associated with subduction into more manageable parts. We demonstrate that for single, isolated subducting slabs much of the complexity of our fully numerical models can be encapsulated by simple analytical expressions. Rates of subduction and slab dip correlate strongly with the asthenospheric pressure difference across the subducting slab. For double subduction, an additional slab gives rise to more complex mantle pressure and flow fields, and significantly extends the range of plate kinematics (e.g., convergence rate, trench migration rate) beyond those present in single slab models. Despite these additional complexities, we show that much of the dynamics of such multi-slab systems can be understood using the physics illuminated by our single slab study, and that a force-balance method can be used to relate intra-plate stress to viscous pressure in the asthenosphere and coupling forces at plate boundaries. This method has promise for rapid modeling of large systems of subduction zones on a global scale.

  1. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty contribute to the total uncertainty. In particular, in segmental hair analysis, pre-analytical variation associated with sampling and segmentation may be a significant factor in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) with focus on the assessment of both the analytical and pre-analytical sampling variation. The validated method was specific, accurate (80-120%), and precise (CV≤20%) across a wide linear concentration range from 0.025-25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from genuine duplicate measurements of two bundles of hair collected from each subject, after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was 3-7-fold larger than the analytical variation (7-13%) and hence the dominant component in the total variation (29-70%). The present study demonstrated the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95%-uncertainty interval (±2CV_T). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
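    The variance-subtraction logic described in this record can be sketched in a few lines. The following Python snippet is an illustrative sketch, not the authors' actual computation: it estimates a total CV from duplicate hair-bundle measurements and then removes an assumed analytical CV to leave the pre-analytical component; all concentrations and CV values are hypothetical.

```python
import math

def cv_total_from_duplicates(pairs):
    """Estimate the total CV from paired duplicate measurements.

    Each duplicate pair (x1, x2) contributes (x1 - x2)^2 / (2 * mean^2)
    to the squared CV; averaging over subjects gives CV_T^2.
    """
    terms = [(x1 - x2) ** 2 / (2 * ((x1 + x2) / 2) ** 2) for x1, x2 in pairs]
    return math.sqrt(sum(terms) / len(terms))

def cv_preanalytical(cv_total, cv_analytical):
    """Pre-analytical CV by variance subtraction: CV_T^2 = CV_pre^2 + CV_A^2."""
    return math.sqrt(max(cv_total ** 2 - cv_analytical ** 2, 0.0))

# Hypothetical duplicate concentrations (ng/mg) from two hair bundles per subject
pairs = [(0.80, 1.20), (2.5, 1.9), (0.30, 0.45), (5.2, 3.8)]
cv_t = cv_total_from_duplicates(pairs)
cv_pre = cv_preanalytical(cv_t, cv_analytical=0.10)  # assume a 10% analytical CV
print(f"CV_T = {cv_t:.2f}, CV_pre = {cv_pre:.2f}")
```

    With a total CV of, say, 30% and an analytical CV of 10%, the subtraction leaves sqrt(0.30² - 0.10²) ≈ 28%, which illustrates why a dominant pre-analytical component barely shrinks the total budget.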

  2. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    Energy Technology Data Exchange (ETDEWEB)

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David; Wolverton, Michael J.; Bruce, Joseph R.; Burtner, Edwin R.; Endert, Alexander

    2015-12-07

    Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.

  3. Recent trends in analytical procedures in forensic toxicology.

    Science.gov (United States)

    Van Bocxlaer, Jan F

    2005-12-01

    Forensic toxicology is a very demanding discipline, heavily dependent on good analytical techniques, which is why new trends appear continuously. In recent years, LC-MS has revolutionized target compound analysis and has become the trend, also in toxicology. In LC-MS screening analysis, things are less straightforward and several approaches exist. One promising approach, based on accurate LC-TOF-MS mass measurements and elemental-formula-based library searches, is discussed. This way of screening has already proven its applicability, but at the same time it became obvious that a single accurate mass measurement lacks some specificity when large compound libraries are used. CE, too, is a re-emerging approach: the increasingly polar and ionic molecules encountered make it a worthwhile addition to e.g. LC, as illustrated for the analysis of GHB. A third recent trend is the use of MALDI mass spectrometry for small molecules, which is promising for its ease of use and high throughput. Unfortunately, reports of disappointment and of accomplishment, e.g. the quantitative analysis of LSD as discussed here, alternate, and it remains to be seen whether MALDI will really establish itself. Indeed, not all new trends will prove themselves, but the mere fact that many appear in the world of analytical toxicology nowadays is, in itself, encouraging for the future of (forensic) toxicology.

  4. Training the next generation analyst using red cell analytics

    Science.gov (United States)

    Graham, Meghan N.; Graham, Jacob L.

    2016-05-01

    We have seen significant change in the study and practice of human reasoning in recent years from both a theoretical and methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools have created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as pedagogy to train the next-gen analyst. A group of students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University have been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except that student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA is often realized not in the solution but in the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the area of security and risk and intelligence training.

  5. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  6. Pavement Performance : Approaches Using Predictive Analytics

    Science.gov (United States)

    2018-03-23

    Acceptable pavement condition is paramount to road safety. Using predictive analytics techniques, this project attempted to develop models that provide an assessment of pavement condition based on an array of indicators that include pavement distress,...

  7. Sampling and analyte enrichment strategies for ambient mass spectrometry.

    Science.gov (United States)

    Li, Xianjiang; Ma, Wen; Li, Hongmei; Ai, Wanpeng; Bai, Yu; Liu, Huwei

    2018-01-01

    Ambient mass spectrometry provides great convenience for fast screening, and has shown promising potential in analytical chemistry. However, its relatively low sensitivity seriously restricts its practical utility in trace compound analysis. In this review, we summarize the sampling and analyte enrichment strategies coupled with nine representative modes of ambient mass spectrometry (desorption electrospray ionization, paper spray ionization, wooden-tip spray ionization, probe electrospray ionization, coated blade spray ionization, direct analysis in real time, desorption corona beam ionization, dielectric barrier discharge ionization, and atmospheric-pressure solids analysis probe) that have dramatically increased detection sensitivity. We believe that these advances will promote routine use of ambient mass spectrometry. Graphical abstract: Scheme of sampling strategies for ambient mass spectrometry.

  8. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    Science.gov (United States)

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.

  9. Bio-analytical applications of mid-infrared spectroscopy using silver halide fiber-optic probes

    International Nuclear Information System (INIS)

    Heise, H.M.; Kuepper, L.; Butvina, L.N.

    2002-01-01

    Infrared spectroscopy has proved to be a powerful method for the study of various biomedical samples, in particular for in-vitro analysis in the clinical laboratory and for non-invasive diagnostics. In general, the analysis of biofluids such as whole blood, urine, microdialysates and bioreactor broth media takes advantage of the fact that a multitude of analytes can be quantified simultaneously and rapidly without the need for reagents. Progress in the quality of infrared silver halide fibers has enabled us to construct several flexible fiber-optic probes of different geometries, which are particularly suitable for the measurement of small biosamples. Recent trends show that dry-film measurements by mid-infrared spectroscopy could revolutionize analytical tools in the clinical chemistry laboratory, and an example is given. Infrared diagnostic tools show promising potential for patients; in particular, minimally invasive blood glucose assays and skin tissue pathology using mid-infrared fiber-based probes cannot be left out. Other applications include the measurement of skin samples, including penetration studies of vitamins and constituents of cosmetic cream formulations. A further field is the micro-domain analysis of biopsy samples from bog-mummified corpses, and recent results on the chemistry of dermis and hair samples are reported. Another field of application, for which results are reported, is food analysis and bioreactor monitoring.

  10. Monte Carlo and analytical model predictions of leakage neutron exposures from passively scattered proton therapy

    International Nuclear Information System (INIS)

    Pérez-Andújar, Angélica; Zhang, Rui; Newhauser, Wayne

    2013-01-01

    Purpose: Stray neutron radiation is of concern after radiation therapy, especially in children, because of the high risk it might carry for secondary cancers. Several previous studies predicted the stray neutron exposure from proton therapy, mostly using Monte Carlo simulations. Promising attempts to develop analytical models have also been reported, but these were limited to only a few proton beam energies. The purpose of this study was to develop an analytical model to predict leakage neutron equivalent dose from passively scattered proton beams in the 100-250 MeV interval. Methods: To develop and validate the analytical model, the authors used values of equivalent dose per therapeutic absorbed dose (H/D) predicted with Monte Carlo simulations. The authors also characterized the behavior of the mean neutron radiation-weighting factor, w_R, as a function of depth in a water phantom and distance from the beam central axis. Results: The simulated and analytical predictions agreed well. On average, the percentage difference between the analytical model and the Monte Carlo simulations was 10% for the energies and positions studied. The authors found that w_R was highest at the shallowest depth and decreased with depth until around 10 cm, where it started to increase slowly with depth. This was consistent among all energies. Conclusion: Simple analytical methods are promising alternatives to complex and slow Monte Carlo simulations for predicting H/D values. The authors' results also provide improved understanding of the behavior of w_R, which depends strongly on depth but is nearly independent of lateral distance from the beam central axis.

  11. Big data analytics as-a-service: Issues and challenges

    OpenAIRE

    Damiani, Ernesto; Ardagna, Claudio Agostino; Ceravolo, Paolo

    2017-01-01

    The Big Data domain is one of the most promising ICT sectors, with substantial expectations both in terms of market growth and of a design shift in the area of data storage management and analytics. However, today, the level of complexity achieved and the lack of standardisation of Big Data management architectures represent a huge barrier to the adoption and execution of analytics, especially for those organizations and SMEs that lack a sufficient amount of competences and knowledge. The fu...

  12. The Promises of Biology and the Biology of Promises

    DEFF Research Database (Denmark)

    Lee, Jieun

    2015-01-01

    commitments with differently imagined futures. I argue that promises are constitutive of the stem cell biology, rather than being derivative of it. Since the biological concept of stem cells is predicated on the future that they promise, the biological life of stem cells is inextricably intertwined...... patients’ bodies in anticipation of materializing the promises of stem cell biology, they are produced as a new form of biovaluable. The promises of biology move beyond the closed circuit of scientific knowledge production, and proliferate in the speculative marketplaces of promises. Part II looks at how...... of technologized biology and biological time can appear promising with the backdrop of the imagined intransigence of social, political, and economic order in the Korean society....

  13. Plasma-cavity ringdown spectroscopy for analytical measurement: Progress and prospectives

    Science.gov (United States)

    Zhang, Sida; Liu, Wei; Zhang, Xiaohe; Duan, Yixiang

    2013-07-01

    Plasma-cavity ringdown spectroscopy is a powerful absorption technique for analytical measurement. It combines the inherent advantages of the cavity ringdown technique, namely high sensitivity, absolute measurement, and relative insensitivity to light source intensity fluctuations, with the use of a plasma as the atomization/ionization source. In this review, we briefly describe the background and principles of plasma-cavity ringdown spectroscopy (CRDS) technology, the instrumental components, and various applications. Significant developments in plasma sources, lasers, and cavity optics are illustrated. Analytical applications of plasma-CRDS for elemental detection and isotopic measurement in atomic spectrometry are outlined in this review. Plasma-CRDS is shown to have a promising future for various analytical applications, although further efforts are still needed in areas such as cavity design, plasma source design, instrumental improvement and integration, as well as potential applications in radical and molecular measurements.
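    The absolute-measurement property mentioned in this record follows from the basic ringdown relation: the absorption coefficient is obtained from the change in the cavity decay time alone, independent of the light source intensity. A minimal sketch of that relation, assuming the absorber uniformly fills the cavity and using hypothetical decay times:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def absorption_coefficient(tau, tau_empty):
    """CRDS relation for a cavity uniformly filled with absorber:
    alpha = (1/c) * (1/tau - 1/tau_empty), returned in m^-1.

    tau       -- ringdown decay time with the absorber present, s
    tau_empty -- ringdown decay time of the empty cavity, s
    """
    return (1.0 / C) * (1.0 / tau - 1.0 / tau_empty)

# Hypothetical decay times: 10 us for the empty cavity, 8 us with analyte present
alpha = absorption_coefficient(8e-6, 10e-6)
print(f"alpha = {alpha:.3e} m^-1")
```

    Because only the two decay times enter the formula, fluctuations in laser intensity cancel out, which is the basis of the technique's absolute, calibration-light measurement capability.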

  14. Use of information technologies in teaching course "Analytical geometry" in higher schools on example of software "ANALYTICAL GEOMETRY"

    OpenAIRE

    V. B. Grigorieva

    2009-01-01

    This article considers methodological questions of using computer technologies, exemplified by the software "Analytical Geometry", in teaching a course on analytical geometry in higher education.

  15. Analytical Methods for Biomass Characterization during Pretreatment and Bioconversion

    Energy Technology Data Exchange (ETDEWEB)

    Pu, Yunqiao [ORNL; Meng, Xianzhi [University of Tennessee, Knoxville (UTK); Yoo, Chang Geun; Li, Mi; Ragauskas, Arthur J [ORNL

    2016-01-01

    Lignocellulosic biomass has been introduced as a promising resource for alternative fuels and chemicals because of its abundance and its potential to complement petroleum resources. Biomass is a complex biopolymer, and its compositional and structural characteristics vary widely depending on its species as well as its growth environment. Because of the complexity and variety of biomass, understanding its physicochemical characteristics is key to effective biomass utilization. Characterization of biomass not only provides critical information on biomass during pretreatment and bioconversion, but also gives valuable insights into how to utilize the biomass. For a better understanding of biomass characteristics, a good grasp and proper selection of analytical methods are necessary. This chapter introduces existing analytical approaches that are widely employed for biomass characterization during the pretreatment and conversion processes. Diverse analytical methods using Fourier transform infrared (FTIR) spectroscopy, gel permeation chromatography (GPC), and nuclear magnetic resonance (NMR) spectroscopy for biomass characterization are reviewed. In addition, methods for assessing biomass accessibility by analyzing the surface properties of biomass are also summarized in this chapter.

  16. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals.

    Science.gov (United States)

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G; Sandberg, Sverre; Sölétormos, György

    2018-01-01

    Background Many clinical decisions are based on comparison of patient results with reference intervals. Therefore, an estimate of the analytical performance specifications that define the quality required for sharing common reference intervals is needed. The International Federation of Clinical Chemistry (IFCC) recommended a minimum of 120 reference individuals to establish reference intervals. This number implies a certain level of quality, which could then be used to define analytical performance specifications as the maximum combination of analytical bias and imprecision that still permits sharing common reference intervals; deriving this combination is the aim of this investigation. Methods Two methods were investigated for defining the maximum combination of analytical bias and imprecision that would give the same quality of common reference intervals as the IFCC recommendation. Method 1 is based on a formula for the combination of analytical bias and imprecision, and Method 2 is based on the Microsoft Excel function NORMINV, incorporating the fractional probability of reference individuals outside each limit and the Gaussian parameters of mean and standard deviation. The combinations of normalized bias and imprecision are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results Method 2 gives the correct results, with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion The Microsoft Excel function NORMINV is useful for the estimation of analytical performance specifications for both Gaussian and log-Gaussian distributions of reference intervals.
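
    The kind of Gaussian calculation described in this abstract can be sketched in normalized units, where the reference limits sit at z = 1.96 population standard deviations: given a normalized analytical bias and an inflated total SD, compute the fraction of results falling outside the limits. This is only an illustration of the computation's shape, using the standard normal CDF (via the error function) in place of Excel's NORMINV; the example bias and imprecision values are invented, so the paper's specific constant 4.4% criterion is not reproduced here.

    ```python
    from math import erf, sqrt

    def phi(x):
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def fraction_outside(bias, s_ratio, z=1.96):
        """Fraction of results falling outside reference limits at +/- z
        population SDs, when measurements carry a normalized bias `bias` and
        a total SD inflated by the factor `s_ratio` (population-SD units)."""
        lower = phi((-z - bias) / s_ratio)
        upper = 1.0 - phi((z - bias) / s_ratio)
        return lower + upper

    # With no bias and no extra imprecision, ~5% fall outside +/-1.96 SD limits.
    baseline = fraction_outside(0.0, 1.0)

    # Adding bias or imprecision pushes more results outside the limits.
    with_bias = fraction_outside(0.25, 1.10)
    ```

    Tracing the contour of (bias, s_ratio) pairs that keep this fraction at a fixed target is exactly the sort of maximum bias/imprecision combination the abstract describes.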

  17. Configuration and validation of an analytical model predicting secondary neutron radiation in proton therapy using Monte Carlo simulations and experimental measurements.

    Science.gov (United States)

    Farah, J; Bonfrate, A; De Marzi, L; De Oliveira, A; Delacroix, S; Martinetti, F; Trompier, F; Clairand, I

    2015-05-01

    This study focuses on the configuration and validation of an analytical model predicting leakage neutron doses in proton therapy. Using Monte Carlo (MC) calculations, a facility-specific analytical model was built to reproduce out-of-field neutron doses while separately accounting for the contribution of intra-nuclear cascade, evaporation, epithermal and thermal neutrons. This model was first trained to reproduce in-water neutron absorbed doses and in-air neutron ambient dose equivalents, H*(10), calculated using MCNPX. Its capacity in predicting out-of-field doses at any position not involved in the training phase was also checked. The model was next expanded to enable a full 3D mapping of H*(10) inside the treatment room, tested in a clinically relevant configuration and finally consolidated with experimental measurements. Following the literature approach, the work first proved that it is possible to build a facility-specific analytical model that efficiently reproduces in-water neutron doses and in-air H*(10) values with a maximum difference less than 25%. In addition, the analytical model succeeded in predicting out-of-field neutron doses in the lateral and vertical direction. Testing the analytical model in clinical configurations proved the need to separate the contribution of internal and external neutrons. The impact of modulation width on stray neutrons was found to be easily adjustable while beam collimation remains a challenging issue. Finally, the model performance agreed with experimental measurements with satisfactory results considering measurement and simulation uncertainties. Analytical models represent a promising solution that substitutes for time-consuming MC calculations when assessing doses to healthy organs. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  18. Investigation of Using Analytics in Promoting Mobile Learning Support

    Science.gov (United States)

    Visali, Videhi; Swami, Niraj

    2013-01-01

    Learning analytics can promote pedagogically informed use of learner data, which can steer the progress of technology mediated learning across several learning contexts. This paper presents the application of analytics to a mobile learning solution and demonstrates how a pedagogical sense was inferred from the data. Further, this inference was…

  19. An Analysis of Earth Science Data Analytics Use Cases

    Science.gov (United States)

    Shie, Chung-Lin; Kempler, Steve

    2014-01-01

    The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data Information System (EOSDIS) archives doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; and https://earthdata.nasa.gov/about/system-performance). In addition, other US agencies, international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean non-obvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been gathered and analyzed for the purpose of extracting the types of Earth science data analytics employed, and the requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, ESDA use cases, use case types, and a preliminary use case analysis (a work in progress) will be presented.

  20. Analysis and analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Batuecas Rodriguez, T [Department of Chemistry and Isotopes, Junta de Energia Nuclear, Madrid (Spain)

    1967-01-01

    The technology associated with the use of organic coolants in nuclear reactors depends to a large extent on the determination and control of their physical and chemical properties, and particularly on the viability, speed, sensitivity, precision and accuracy (depending on the intended usage) of the methods employed in detection and analytical determination. This has led to the study and development of numerous techniques, some specially designed for the extreme conditions involved in working with the types of product in question and others adapted from existing techniques. In the specific case of polyphenyl and hydropolyphenyl mixtures, which have been the principal subjects of study to date and offer greatest promise, the analytical problems are broadly as follows: composition of the initial product or virgin coolant (macro components and amounts of organic and inorganic impurities); the coolant during and after operation, including determination of gases and organic compounds produced by pyrolysis and radiolysis (degradation and polymerization products); control of systems for purifying and regenerating the coolant after use, including dissolved pressurization gases; detection of intermediate products during decomposition, which are generally very unstable (free radicals); degree of fouling and film formation, with tests to determine the potential formation of films; corrosion of structural elements and canning materials; and health and safety, covering toxicity, inflammability and impurities that can be activated. Although some of the above problems are closely interrelated and entail similar techniques, they vary as to degree of difficulty. Another question is the difficulty of distinguishing clearly between techniques for determining physical and physico-chemical properties, on one hand, and analytical techniques on the other. Any classification is therefore somewhat arbitrary (for example, in the case of dosimetry and techniques for determining mean molecular weights or electrical conductivity).

  1. Using Linked Data in Learning Analytics

    NARCIS (Netherlands)

    d'Aquin, Mathieu; Dietze, Stefan; Herder, Eelco; Drachsler, Hendrik; Taibi, David

    2014-01-01

    Learning Analytics has a lot to do with data, and the way to make sense of raw data in terms of the learner’s experience, behaviour and knowledge. In this article, we argue about the need for a closer relationship between the field of Learning Analytics and the one of Linked Data, which in our view

  2. Using predictive analytics and big data to optimize pharmaceutical outcomes.

    Science.gov (United States)

    Hernandez, Inmaculada; Zhang, Yuting

    2017-09-15

    The steps involved, the resources needed, and the challenges associated with applying predictive analytics in healthcare are described, with a review of successful applications of predictive analytics in implementing population health management interventions that target medication-related patient outcomes. In healthcare, the term big data typically refers to large quantities of electronic health record, administrative claims, and clinical trial data as well as data collected from smartphone applications, wearable devices, social media, and personal genomics services; predictive analytics refers to innovative methods of analysis developed to overcome challenges associated with big data, including a variety of statistical techniques ranging from predictive modeling to machine learning to data mining. Predictive analytics using big data have been applied successfully in several areas of medication management, such as in the identification of complex patients or those at highest risk for medication noncompliance or adverse effects. Because predictive analytics can be used in predicting different outcomes, they can provide pharmacists with a better understanding of the risks for specific medication-related problems that each patient faces. This information will enable pharmacists to deliver interventions tailored to patients' needs. In order to take full advantage of these benefits, however, clinicians will have to understand the basics of big data and predictive analytics. Predictive analytics that leverage big data will become an indispensable tool for clinicians in mapping interventions and improving patient outcomes. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  3. Using Learning Analytics to Assess Student Learning in Online Courses

    Science.gov (United States)

    Martin, Florence; Ndoye, Abdou

    2016-01-01

    Learning analytics can be used to enhance student engagement and performance in online courses. Using learning analytics, instructors can collect and analyze data about students and improve the design and delivery of instruction to make it more meaningful for them. In this paper, the authors review different categories of online assessments and…

  4. Enantioselectivity of mass spectrometry: challenges and promises.

    Science.gov (United States)

    Awad, Hanan; El-Aneed, Anas

    2013-01-01

    With the fast growing market of pure enantiomer drugs and bioactive molecules, new chiral-selective analytical tools have been instigated including the use of mass spectrometry (MS). Even though MS is one of the best analytical tools that has efficiently been used in several pharmaceutical and biological applications, traditionally MS is considered as a "chiral-blind" technique. This limitation is due to the MS inability to differentiate between two enantiomers of a chiral molecule based merely on their masses. Several approaches have been explored to assess the potential role of MS in chiral analysis. The first approach depends on the use of MS-hyphenated techniques utilizing fast and sensitive chiral separation tools such as liquid chromatography (LC), gas chromatography (GC), and capillary electrophoresis (CE) coupled to MS detector. More recently, several alternative separation techniques have been evaluated such as supercritical fluid chromatography (SFC) and capillary electrochromatography (CEC); the latter being a hybrid technique that combines the efficiency of CE with the selectivity of LC. The second approach is based on using the MS instrument solely for the chiral recognition. This method depends on the behavioral differences between enantiomers towards a foreign molecule and the ability of MS to monitor such differences. These behavioral differences can be divided into three types: (i) differences in the enantiomeric affinity for association with the chiral selector, (ii) differences of the enantiomeric exchange rate with a foreign reagent, and (iii) differences in the complex MS dissociation behaviors of the enantiomers. Most recently, ion mobility spectrometry was introduced to qualitatively and quantitatively evaluate chiral compounds. This article provides an overview of MS role in chiral analysis by discussing MS based methodologies and presenting the challenges and promises associated with each approach. © 2013 Wiley Periodicals, Inc.

  5. Analytical modeling of worldwide medical radiation use

    International Nuclear Information System (INIS)

    Mettler, F.A. Jr.; Davis, M.; Kelsey, C.A.; Rosenberg, R.; Williams, A.

    1987-01-01

    An analytical model was developed to estimate the availability and frequency of medical radiation use on a worldwide basis. This model includes medical and dental x-ray, nuclear medicine, and radiation therapy. The development of an analytical model is necessary as the first step in estimating the radiation dose to the world's population from this source. Since there are no data about the frequency of medical radiation use in more than half the countries in the world, and only fragmentary data in an additional one-fourth of the world's countries, such a model can be used to predict the uses of medical radiation in these countries. The model indicates that there are approximately 400,000 medical x-ray machines worldwide and that approximately 1.2 billion diagnostic medical x-ray examinations are performed annually. Dental x-ray examinations are estimated at 315 million annually, and in-vivo diagnostic nuclear medicine examinations at approximately 22 million. Approximately 4 million radiation therapy procedures or courses of treatment are undertaken annually.

  6. Plasma-cavity ringdown spectroscopy for analytical measurement: Progress and prospectives

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Sida; Liu, Wei [Research Center of Analytical Instrumentation, Analytical and Testing Center, College of Chemistry, Sichuan University, Chengdu (China); Zhang, Xiaohe [College of Water Resources and Hydropower, Sichuan University, Chengdu (China); Duan, Yixiang, E-mail: yduan@scu.edu.cn [Research Center of Analytical Instrumentation, Analytical and Testing Center, College of Chemistry, Sichuan University, Chengdu (China)

    2013-07-01

    Plasma-cavity ringdown spectroscopy is a powerful absorption technique for analytical measurement. It combines the inherent advantages of the cavity ringdown technique, namely high sensitivity, absolute measurement, and relative insensitivity to light source intensity fluctuations, with the use of plasma as an atomization/ionization source. In this review, we briefly describe the background and principles of plasma-cavity ringdown spectroscopy (CRDS) technology, the instrumental components, and various applications. The significant developments of the plasma sources, lasers, and cavity optics are illustrated. Analytical applications of plasma-CRDS for elemental detection and isotopic measurement in atomic spectrometry are outlined. Plasma-CRDS is shown to have a promising future for various analytical applications, while further efforts are still needed in fields such as cavity design, plasma source design, instrumental improvement and integration, as well as potential applications in radical and molecular measurements. - Highlights: • Plasma-based cavity ringdown spectroscopy • High sensitivity and high resolution • Elemental and isotopic measurements.

  7. Seamless Digital Environment – Data Analytics Use Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-08-01

    Multiple research efforts in the U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program study the need for, and the design of, an underlying architecture to support the increased amount and use of data in the nuclear power plant. More specifically, the LWRS research efforts on Digital Architecture for an Automated Plant, Automated Work Packages, Computer-Based Procedures for Field Workers, and Online Monitoring have all identified the need for a digital architecture and, more importantly, for a Seamless Digital Environment (SDE). An SDE provides a means to access multiple applications, gather the data points needed, conduct the requested analysis, and present the result to the user with minimal or no effort on the user's part. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting, the nuclear utilities identified the need for research focused on data analytics. The effort was to develop and evaluate use cases for data mining and analytics that employ information from plant sensors and databases to develop improved business analytics. The goal of the study is to research potential approaches to building an analytics solution for equipment reliability, on a small scale, focusing on either a single piece of equipment or a single system. The analytics solution will likely consist of a data integration layer, a predictive and machine learning layer, and a user interface layer that displays the output of the analysis in a straightforward, easy-to-consume manner. This report describes the use case study initiated by NITSL and conducted in collaboration between Idaho National Laboratory, Arizona Public Service – Palo Verde Nuclear Generating Station, and NextAxiom Inc.

  8. Fuzzy promises

    DEFF Research Database (Denmark)

    Anker, Thomas Boysen; Kappel, Klemens; Eadie, Douglas

    2012-01-01

    as narrative material to communicate self-identity. Finally, (c) we propose that brands deliver fuzzy experiential promises through effectively motivating consumers to adopt and play a social role implicitly suggested and facilitated by the brand. A promise is an inherently ethical concept and the article...... concludes with an in-depth discussion of fuzzy brand promises as two-way ethical commitments that put requirements on both brands and consumers....

  9. Analytical research using synchrotron radiation based techniques

    International Nuclear Information System (INIS)

    Jha, Shambhu Nath

    2015-01-01

    There are many Synchrotron Radiation (SR) based techniques, such as X-ray Absorption Spectroscopy (XAS), X-ray Fluorescence Analysis (XRF), SR Fourier-transform Infrared (SR-FTIR) spectroscopy, and Hard X-ray Photoelectron Spectroscopy (HAXPS), which are increasingly being employed worldwide in analytical research. With the advent of modern synchrotron sources, these analytical techniques have been further revitalized, paving the way for new techniques such as microprobe XRF and XAS, FTIR microscopy, and HAXPS. The talk will mainly cover two techniques, XRF and XAS, illustrating their capability in analytical research. XRF spectroscopy: XRF spectroscopy is an analytical technique that involves the detection of emitted characteristic X-rays following excitation of the elements within the sample. While electron, particle (proton or alpha particle), or X-ray beams can be employed as the exciting source for this analysis, the use of X-ray beams from a synchrotron source has been instrumental in the advancement of the technique in the areas of microprobe XRF imaging and trace-level compositional characterization of samples. Synchrotron radiation induced X-ray emission spectroscopy has become competitive with the earlier microprobe and nanoprobe techniques following the advancements in manipulating and detecting these X-rays. There are two important features that contribute to the superb elemental sensitivities of microprobe SR-induced XRF: (i) the absence of the continuum (Bremsstrahlung) background radiation that is a feature of spectra obtained from charged particle beams, and (ii) the increased X-ray flux on the sample associated with the use of tunable third-generation synchrotron facilities. Detection sensitivities have been reported in the ppb range, with values of 10^-17 g to 10^-14 g (depending on the particular element and matrix). Keeping its demand in mind, a microprobe XRF beamline has been set up by RRCAT at the Indus-2 synchrotron

  10. Analytic 3D image reconstruction using all detected events

    International Nuclear Information System (INIS)

    Kinahan, P.E.; Rogers, J.G.

    1988-11-01

    We present the results of testing a previously presented algorithm for three-dimensional image reconstruction that uses all gamma-ray coincidence events detected by a PET volume-imaging scanner. By using two iterations of an analytic filter-backprojection method, the algorithm is not constrained by the requirement of a spatially invariant detector point spread function, which limits normal analytic techniques. Removing this constraint allows the incorporation of all detected events, regardless of orientation, which improves the statistical quality of the final reconstructed image

  11. Biosensors: Future Analytical Tools

    Directory of Open Access Journals (Sweden)

    Vikas

    2007-02-01

    Full Text Available Biosensors offer considerable promise for obtaining analytical information in a faster, simpler and cheaper manner compared to conventional assays. The biosensing approach is rapidly advancing, and applications ranging from metabolite, biological/chemical warfare agent, food pathogen and adulterant detection to genetic screening and programmed drug delivery have been demonstrated. Innovative efforts, coupling micromachining and nanofabrication, may lead to even more powerful devices that would accelerate the realization of large-scale and routine screening. With a gradual increase in commercialization, a wide range of new biosensors are thus expected to reach the market in the coming years.

  12. Online Learner Engagement: Opportunities and Challenges with Using Data Analytics

    Science.gov (United States)

    Bodily, Robert; Graham, Charles R.; Bush, Michael D.

    2017-01-01

    This article describes the crossroads between learning analytics and learner engagement. The authors do this by describing specific challenges of using analytics to support student engagement from three distinct perspectives: pedagogical considerations, technological issues, and interface design concerns. While engaging online learners presents a…

  13. Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis

    Science.gov (United States)

    Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay

    2018-01-01

    Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…

  14. Developing automated analytical methods for scientific environments using LabVIEW.

    Science.gov (United States)

    Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard

    2010-01-15

    The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software. Yet, it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program have been demonstrated by using it to control both a sequential injection analysis - capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.
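
    The script-command architecture this abstract describes can be sketched, purely for illustration, as a small command dispatcher: one generic runner parses text commands and routes them to registered instrument drivers, so new hardware only requires a new handler, not a new program. This is a Python sketch of the concept only; the actual work is implemented in LabVIEW, and the command names (VALVE, PUMP, WAIT) and handlers below are invented.

    ```python
    class ScriptRunner:
        """Dispatches simple text script commands to registered instrument drivers."""

        def __init__(self):
            self.handlers = {}
            self.log = []

        def register(self, command, handler):
            """Associate a script command word with a driver callable."""
            self.handlers[command] = handler

        def run(self, script):
            """Execute a script line by line, skipping blanks and comments."""
            for line in script.strip().splitlines():
                line = line.strip()
                if not line or line.startswith("#"):
                    continue
                command, *args = line.split()
                self.handlers[command](*args)
                self.log.append(line)

    # Hypothetical drivers: here they just record the requested state.
    runner = ScriptRunner()
    state = {}
    runner.register("VALVE", lambda pos: state.update(valve=pos))
    runner.register("PUMP", lambda ml: state.update(pumped=float(ml)))
    runner.register("WAIT", lambda s: state.update(waited=float(s)))

    runner.run("""
    # flush the sample loop, then aspirate 2.5 mL
    VALVE waste
    PUMP 2.5
    WAIT 1.0
    """)
    ```

    The design choice mirrors the abstract's point: because the runner code never changes, coupling a new sample-treatment step to a new detector is a matter of editing the script, not the program.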

  15. Long-Term Prediction of Satellite Orbit Using Analytical Method

    Directory of Open Access Journals (Sweden)

    Jae-Cheol Yoon

    1997-12-01

    Full Text Available A long-term prediction algorithm for geostationary orbits was developed using an analytical method. The perturbation force models include the geopotential up to fifth degree and order, luni-solar gravitation, and solar radiation pressure. All of the perturbation effects were analyzed in terms of secular variations, short-period variations, and long-period variations of the equinoctial elements: the semi-major axis, eccentricity vector, inclination vector, and mean longitude of the satellite. The results of the analytical orbit propagator were compared with those of a Cowell orbit propagator for KOREASAT. The comparison indicated that the analytical solution could predict the semi-major axis with an accuracy of better than ~35 meters over a period of 3 months.

  16. Accurate quantification of endogenous androgenic steroids in cattle's meat by gas chromatography mass spectrometry using a surrogate analyte approach

    International Nuclear Information System (INIS)

    Ahmadkhaniha, Reza; Shafiee, Abbas; Rastkari, Noushin; Kobarfard, Farzad

    2009-01-01

    Determination of endogenous steroids in complex matrices such as cattle meat is a challenging task. Since endogenous steroids always exist in animal tissues, no analyte-free matrices for constructing the standard calibration line are available, which is crucial for accurate quantification, especially at trace levels. Although some methods have been proposed to solve the problem, none has offered a complete solution. To this aim, a new quantification strategy was developed in this study, named the 'surrogate analyte approach', which is based on using isotope-labeled standards instead of the natural form of the endogenous steroids for preparing the calibration line. In comparison with the other methods currently in use for the quantitation of endogenous steroids, this approach provides improved simplicity and speed for analysis on a routine basis. The accuracy of this method is better than that of other methods at low concentrations and comparable to standard addition at medium and high concentrations. The method was also found to be valid according to the ICH criteria for bioanalytical methods. The developed method could be a promising approach in the field of compound residue analysis.
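
    The surrogate-analyte idea can be illustrated with a minimal calibration sketch: the calibration line is fitted from an isotope-labeled standard spiked into the real matrix (which contains no labeled form, so true zero points exist), and the endogenous analyte is then quantified against that line, assuming the labeled and natural forms give the same detector response. All concentrations and instrument responses below are invented for illustration and are not from the study.

    ```python
    def fit_line(x, y):
        """Ordinary least-squares slope and intercept for a calibration line."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
                 / sum((xi - mx) ** 2 for xi in x))
        return slope, my - slope * mx

    # Calibration: spiked labeled-standard concentrations (ng/g, illustrative)
    # vs. measured peak-area ratio. The zero point is truly analyte-free for
    # the labeled form, even though natural steroid is present in the matrix.
    spiked = [0.0, 5.0, 10.0, 20.0, 40.0]
    response = [0.02, 0.52, 1.01, 2.03, 4.05]
    slope, intercept = fit_line(spiked, response)

    # Quantify an unknown sample from its measured response, assuming equal
    # detector response for the labeled surrogate and the natural steroid.
    unknown_response = 1.55
    concentration = (unknown_response - intercept) / slope
    ```

    In a real GC-MS workflow the labeled and natural forms are distinguished by their mass shift, which is what lets the same matrix serve both as calibration blank (for the label) and as sample (for the natural analyte).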

  17. Developing Learning Analytics Design Knowledge in the "Middle Space": The Student Tuning Model and Align Design Framework for Learning Analytics Use

    Science.gov (United States)

    Wise, Alyssa Friend; Vytasek, Jovita Maria; Hausknecht, Simone; Zhao, Yuting

    2016-01-01

    This paper addresses a relatively unexplored area in the field of learning analytics: how analytics are taken up and used as part of teaching and learning processes. Initial steps are taken towards developing design knowledge for this "middle space," with a focus on students as analytics users. First, a core set of challenges for…

  18. The Ethics of Using Learning Analytics to Categorize Students on Risk

    Science.gov (United States)

    Scholes, Vanessa

    2016-01-01

    There are good reasons for higher education institutions to use learning analytics to risk-screen students. Institutions can use learning analytics to better predict which students are at greater risk of dropping out or failing, and use the statistics to treat "risky" students differently. This paper analyses this practice using…

  19. Analytical characterization using surface-enhanced Raman scattering (SERS) and microfluidic sampling

    International Nuclear Information System (INIS)

    Wang, Chao; Yu, Chenxu

    2015-01-01

    With the rapid development of analytical techniques, it has become much easier to detect chemical and biological analytes, even at very low detection limits. In recent years, techniques based on vibrational spectroscopy, such as surface-enhanced Raman spectroscopy (SERS), have been developed for non-destructive detection of pathogenic microorganisms. SERS is a highly sensitive analytical tool that can be used to characterize chemical and biological analytes interacting with SERS-active substrates. However, it has always been a challenge to obtain consistent and reproducible SERS spectroscopic results under complicated experimental conditions. Microfluidics, a tool for highly precise manipulation of small-volume liquid samples, can be used to overcome the major drawbacks of SERS-based techniques. High reproducibility of SERS measurements can be obtained in the continuous flow generated inside microfluidic devices. This article provides a thorough review of the principles, concepts and methods of SERS-microfluidic platforms, and of the applications of such platforms in trace analysis of chemical and biological analytes. (topical review)

  20. Promising More Information

    Science.gov (United States)

    2003-01-01

    When NASA needed a real-time, online database system capable of tracking documentation changes in its propulsion test facilities, engineers at Stennis Space Center joined with ECT International, of Brookfield, Wisconsin, to create a solution. Through NASA's Dual-Use Program, ECT developed Exdata, a software program that works within the company's existing Promise software. Exdata not only satisfied NASA's requirements, but also expanded ECT's commercial product line. Promise, ECT's primary product, is an intelligent software program with specialized functions for designing and documenting electrical control systems. An add-on to AutoCAD software, Promise generates control system schematics, panel layouts, bills of material, wire lists, and terminal plans. The drawing functions include symbol libraries, macros, and automatic line breaking. Primary Promise customers include manufacturing companies, utilities, and other organizations with complex processes to control.

  1. Predictive analytics in mental health: applications, guidelines, challenges and perspectives.

    Science.gov (United States)

    Hahn, T; Nierenberg, A A; Whitfield-Gabrieli, S

    2017-01-01

    The emerging field of 'predictive analytics in mental health' has recently generated tremendous interest with the bold promise to revolutionize clinical practice in psychiatry paralleling similar developments in personalized and precision medicine. Here, we provide an overview of the key questions and challenges in the field, aiming to (1) propose general guidelines for predictive analytics projects in psychiatry, (2) provide a conceptual introduction to core aspects of predictive modeling technology, and (3) foster a broad and informed discussion involving all stakeholders including researchers, clinicians, patients, funding bodies and policymakers.

  2. Improving acute kidney injury diagnostics using predictive analytics.

    Science.gov (United States)

    Basu, Rajit K; Gist, Katja; Wheeler, Derek S

    2015-12-01

    Acute kidney injury (AKI) is a multifactorial syndrome affecting an alarming proportion of hospitalized patients. Although early recognition may expedite management, the ability to identify at-risk patients and those suffering real-time injury is inconsistent. This review summarizes recent reports describing advancements in the area of AKI epidemiology, specifically focusing on risk scoring and predictive analytics. In the critical care population, the primary underlying factors limiting prediction models include an inability to properly account for patient heterogeneity and underperforming metrics used to assess kidney function. Severity-of-illness scores demonstrate limited AKI predictive performance. Recent evidence suggests traditional methods for detecting AKI may be leveraged and ultimately replaced by newer, more sophisticated analytical tools capable of prediction and identification: risk stratification, novel AKI biomarkers, and clinical information systems. Additionally, the utility of novel biomarkers may be optimized through targeting based on patient context, and may provide more granular information about the injury phenotype. Finally, manipulation of the electronic health record allows for real-time recognition of injury. Integrating a high-functioning clinical information system with risk stratification methodology and novel biomarkers yields a predictive analytic model for AKI diagnostics.

  3. A Model-Driven Methodology for Big Data Analytics-as-a-Service

    OpenAIRE

    Damiani, Ernesto; Ardagna, Claudio Agostino; Ceravolo, Paolo; Bellandi, Valerio; Bezzi, Michele; Hebert, Cedric

    2017-01-01

    The Big Data revolution has promised to build a data-driven ecosystem where better decisions are supported by enhanced analytics and data management. However, critical issues still need to be solved on the road that leads to the commoditization of Big Data Analytics, such as the management of Big Data complexity and the protection of data security and privacy. In this paper, we focus on the first issue and propose a methodology based on Model Driven Engineering (MDE) that aims to substantially lower...

  4. The Use and Abuse of Limits of Detection in Environmental Analytical Chemistry

    Directory of Open Access Journals (Sweden)

    Richard J. C. Brown

    2008-01-01

    The limit of detection (LoD) serves as an important method performance measure that is useful for the comparison of measurement techniques and the assessment of likely signal-to-noise performance, especially in environmental analytical chemistry. However, the LoD is only truly related to the precision characteristics of the analytical instrument employed for the analysis and the content of analyte in the blank sample. This article discusses how other criteria, such as sampling volume, can serve to distort the quoted LoD artificially and make comparison between various analytical methods inequitable. In order to compare LoDs between methods properly, it is necessary to state clearly all of the input parameters relating to the measurements that have been used in the calculation of the LoD. Additionally, the article argues that the use of LoDs in contexts other than the comparison of the attributes of analytical methods, in particular when reporting analytical results, may be confusing, may be less informative than quoting the actual result with an accompanying statement of uncertainty, and may act to bias descriptive statistics.
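    The precision-and-blank dependence the article describes is easiest to see in the common 3σ convention, under which the LoD follows from the blank's standard deviation and the calibration slope alone. The numbers below are illustrative, not taken from the article:

```python
import statistics

def limit_of_detection(blank_signals, slope):
    """Estimate the LoD from replicate blank measurements.

    Uses the common 3-sigma convention, LoD = 3 * s_blank / slope,
    where s_blank is the sample standard deviation of the blank signal
    and slope is the sensitivity of the calibration line.
    """
    s_blank = statistics.stdev(blank_signals)
    return 3 * s_blank / slope

# Ten replicate blank readings (instrument counts) and a calibration
# slope of 2.0 counts per ng/L -- illustrative numbers only.
blanks = [4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 4.1, 4.0, 4.2, 3.9]
lod = limit_of_detection(blanks, slope=2.0)   # concentration units, ng/L
```

    Stating the replicate count, the blank data, and the slope alongside the quoted LoD is exactly the kind of input-parameter disclosure the article calls for.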

  5. MIP-Based Sensors: Promising New Tools for Cancer Biomarker Determination

    Directory of Open Access Journals (Sweden)

    Giulia Selvolini

    2017-03-01

    Detecting cancer at an early stage is one of the most important issues for increasing the survival rate of patients. Cancer biomarker detection helps to provide a diagnosis before the disease becomes incurable in later stages. Biomarkers can also be used to evaluate the progression of therapies and surgical treatments. In recent years, molecularly imprinted polymer (MIP)-based sensors have been intensely investigated as promising analytical devices in several fields, including clinical analysis, offering the desired portability, fast response, specificity, and low cost. The aim of this review is to provide readers with an overview of recent important achievements in MIP-based sensors coupled to various transducers (e.g., electrochemical, optical, and piezoelectric) for the determination of cancer biomarkers, drawing on selected publications from 2012 to 2016.

  6. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  7. Data Analytics of Hydraulic Fracturing Data

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jovan Yang [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Viswanathan, Hari [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hyman, Jeffery [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Middleton, Richard [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-11

    These slides cover the data analytics of hydraulic fracturing data. The conclusions from this research are the following: the authors proposed permeability evolution as a new mechanism to explain hydraulic fracturing trends; they created a model that includes this mechanism, which showed promising results; the paper from this research is ready for submission; and they devised a way to identify and sort refractures in order to study their effects, with that paper currently being written.

  8. Analytic Methods Used in Quality Control in a Compounding Pharmacy.

    Science.gov (United States)

    Allen, Loyd V

    2017-01-01

    Analytical testing will no doubt become a more important part of pharmaceutical compounding as the public and regulatory agencies demand increasing documentation of the quality of compounded preparations. Compounding pharmacists must decide what types of testing and what amount of testing to include in their quality-control programs, and whether testing should be done in-house or outsourced. Like pharmaceutical compounding, analytical testing should be performed only by those who are appropriately trained and qualified. This article discusses the analytical methods that are used in quality control in a compounding pharmacy. Copyright© by International Journal of Pharmaceutical Compounding, Inc.

  9. Optimal design of supply chain network under uncertainty environment using hybrid analytical and simulation modeling approach

    Science.gov (United States)

    Chiadamrong, N.; Piyathanavong, V.

    2017-12-01

    Models that aim to optimize the design of supply chain networks have gained increasing interest in the supply chain literature. Mixed-integer linear programming and discrete-event simulation are widely used for such optimization problems. We present a hybrid approach to support decisions for supply chain network design using a combination of analytical and discrete-event simulation models. The proposed approach is based on iterative procedures that continue until the difference between subsequent solutions satisfies a pre-determined termination criterion. The effectiveness of the proposed approach is illustrated by an example, which yields results closer to the optimum, with much faster solving times, than those obtained from the conventional simulation-based optimization model. The efficacy of this hybrid approach is promising, and it can be applied as a powerful tool in designing real supply chain networks. It also provides the possibility to model and solve more realistic problems, which incorporate dynamism and uncertainty.
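    The iterative procedure described above can be sketched as a generic fixed-point loop. The function names and the toy models below are hypothetical stand-ins for the paper's MILP and discrete-event simulation:

```python
def hybrid_optimize(analytic_solve, simulate, params0, tol=1e-3, max_iter=50):
    """Alternate between an analytical optimization and a simulation that
    re-estimates the model parameters, stopping when successive analytic
    solutions differ by less than `tol` (the termination criterion)."""
    params = params0
    prev_solution = None
    for _ in range(max_iter):
        solution = analytic_solve(params)   # e.g., solve the MILP design model
        params = simulate(solution)         # refine parameters by simulation
        if prev_solution is not None and abs(solution - prev_solution) < tol:
            break
        prev_solution = solution
    return solution

# Toy stand-in: the "analytic model" halves the gap to a simulated target,
# so the loop converges geometrically toward 10.0.
target = 10.0
sol = hybrid_optimize(lambda p: (p + target) / 2, lambda s: s, 0.0)
```

    The real scheme would replace the lambdas with an MILP solver call and a discrete-event simulation run; the loop structure and termination test are the part the abstract describes.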

  10. Using Learning Analytics for Preserving Academic Integrity

    Science.gov (United States)

    Amigud, Alexander; Arnedo-Moreno, Joan; Daradoumis, Thanasis; Guerrero-Roldan, Ana-Elena

    2017-01-01

    This paper presents the results of integrating learning analytics into the assessment process to enhance academic integrity in the e-learning environment. The goal of this research is to evaluate the computational-based approach to academic integrity. The machine-learning-based framework learns students' patterns of language use from data,…
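    As one illustration of the computational-based approach (a toy stylometric sketch, not the paper's actual framework), a new submission can be compared against a student's historical language-use profile:

```python
from collections import Counter
import math

def profile(text):
    """Word-frequency profile of a piece of writing (a toy feature set;
    real stylometry would use richer features such as character n-grams)."""
    return Counter(text.lower().split())

def cosine(p, q):
    """Cosine similarity between two frequency profiles (Counters return 0
    for missing words, so no key handling is needed)."""
    dot = sum(p[w] * q[w] for w in p)
    norm = (math.sqrt(sum(v * v for v in p.values()))
            * math.sqrt(sum(v * v for v in q.values())))
    return dot / norm if norm else 0.0

# Compare a new submission against the student's historical writing; a low
# similarity could flag the submission for manual review. The threshold and
# the texts are illustrative only.
history = profile("the experiment shows that the data support the model")
submission = profile("the data support the model and the experiment shows this")
score = cosine(history, submission)
flagged = score < 0.5
```

    In this example the two texts share most of their vocabulary, so the similarity is high and nothing is flagged.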

  11. Analytical Evaluation of the Performance of Proportional Fair Scheduling in OFDMA-Based Wireless Systems

    Directory of Open Access Journals (Sweden)

    Mohamed H. Ahmed

    2012-01-01

    This paper provides an analytical evaluation of the performance of proportional fair (PF) scheduling in Orthogonal Frequency-Division Multiple Access (OFDMA) wireless systems. OFDMA represents a promising multiple access scheme for transmission over wireless channels, as it combines orthogonal frequency-division multiplexing (OFDM) modulation with subcarrier allocation. PF scheduling, in turn, is an efficient resource allocation scheme with good fairness characteristics. Consequently, OFDMA with PF scheduling represents an attractive solution for delivering high data rate services to multiple users simultaneously with a high degree of fairness. We investigate a two-dimensional (time slot and frequency subcarrier) PF scheduling algorithm for OFDMA systems and evaluate its performance analytically and by simulations. We derive approximate closed-form expressions for the average throughput, throughput fairness index, and packet delay. Computer simulations are used for verification. The analytical results agree well with the simulation results, which shows the good accuracy of the analytical expressions.
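    The core PF decision analyzed in such work can be sketched per resource unit: serve the user maximizing the ratio of instantaneous rate to smoothed average throughput, then update the averages. The smoothing factor `beta` and the example numbers are illustrative:

```python
def pf_schedule(instant_rates, avg_throughput, beta=0.1):
    """One proportional fair (PF) scheduling decision.

    instant_rates[k]  : achievable rate of user k on this resource unit
    avg_throughput[k] : exponentially smoothed throughput of user k (> 0)

    Selects the user maximizing r_k / T_k, then updates every user's
    average with an EWMA (served users add their rate, others add 0).
    """
    selected = max(range(len(instant_rates)),
                   key=lambda k: instant_rates[k] / avg_throughput[k])
    for k in range(len(avg_throughput)):
        served = instant_rates[k] if k == selected else 0.0
        avg_throughput[k] = (1 - beta) * avg_throughput[k] + beta * served
    return selected

# Two users: user 0 has the better channel right now, but user 1 has been
# starved, so the PF metric favors user 1.
rates = [10.0, 6.0]
avg = [9.0, 1.0]
user = pf_schedule(rates, avg)
```

    This is the fairness mechanism in miniature: raw rate alone would pick user 0, but normalizing by historical throughput picks user 1.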

  12. Analytical local electron-electron interaction model potentials for atoms

    International Nuclear Information System (INIS)

    Neugebauer, Johannes; Reiher, Markus; Hinze, Juergen

    2002-01-01

    Analytical local potentials for modeling the electron-electron interaction in an atom significantly reduce the computational effort in electronic structure calculations. The development of such potentials has a long history, but some promising ideas have not yet been taken into account for further improvements. We determine a local electron-electron interaction potential akin to those suggested by Green et al. [Phys. Rev. 184, 1 (1969)], which are widely used in atom-ion scattering calculations, electron-capture processes, and electronic structure calculations. Generalized Yukawa-type model potentials are introduced. This leads, however, to shell-dependent local potentials, because the origin behavior of such potentials is different for different shells, as has been explicated analytically [J. Neugebauer, M. Reiher, and J. Hinze, Phys. Rev. A 65, 032518 (2002)]. It is found that the parameters that characterize these local potentials can be interpolated and extrapolated reliably for different nuclear charges and different numbers of electrons. The analytical behavior of the corresponding localized Hartree-Fock potentials at the origin and at long distances is utilized in order to reduce the number of fit parameters. It turns out that the shell-dependent form of Green's potential, which we also derive, yields results of comparable accuracy using only one shell-dependent parameter.

  13. Let's Talk... Analytics

    Science.gov (United States)

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  14. Analytical Thinking, Analytical Action: Using Prelab Video Demonstrations and e-Quizzes to Improve Undergraduate Preparedness for Analytical Chemistry Practical Classes

    Science.gov (United States)

    Jolley, Dianne F.; Wilson, Stephen R.; Kelso, Celine; O'Brien, Glennys; Mason, Claire E.

    2016-01-01

    This project utilizes visual and critical thinking approaches to develop a higher-education synergistic prelab training program for a large second-year undergraduate analytical chemistry class, directing more of the cognitive learning to the prelab phase. This enabled students to engage in more analytical thinking prior to engaging in the…

  15. Analytics for Education

    Science.gov (United States)

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  16. Prioritization of Programmer's Productivity Using Analytic Hierarchy ...

    African Journals Online (AJOL)

    This paper focuses on the application of Analytic Hierarchy Process (AHP) model in the context of prioritizing programmer's productivity in University of Benin, Benin City Nigeria. This is achieved by evaluating the way in which the AHP model can be used to select the best programmer for the purpose of developing software ...
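    The AHP priority computation underlying such a selection can be sketched with a hypothetical 3×3 pairwise comparison matrix on Saaty's 1-9 scale (not the paper's data):

```python
import numpy as np

# Pairwise comparison matrix for three candidate programmers: entry (i, j)
# says how strongly programmer i is preferred over programmer j.
A = np.array([
    [1.0,   3.0, 5.0],
    [1/3,   1.0, 2.0],
    [1/5,   1/2, 1.0],
])

# AHP priority vector = principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()

# Consistency check: CR = CI / RI, with RI = 0.58 for a 3x3 matrix.
# CR < 0.1 is the usual acceptability threshold for the judgments.
lam_max = eigvals.real.max()
ci = (lam_max - 3) / (3 - 1)
cr = ci / 0.58
```

    Here programmer 1 receives the largest weight, and the consistency ratio confirms the (synthetic) judgments are acceptably coherent.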

  17. A shipboard comparison of analytic methods for ballast water compliance monitoring

    Science.gov (United States)

    Bradie, Johanna; Broeg, Katja; Gianoli, Claudio; He, Jianjun; Heitmüller, Susanne; Curto, Alberto Lo; Nakata, Akiko; Rolke, Manfred; Schillak, Lothar; Stehouwer, Peter; Vanden Byllaardt, Julie; Veldhuis, Marcel; Welschmeyer, Nick; Younan, Lawrence; Zaake, André; Bailey, Sarah

    2018-03-01

    Promising approaches for indicative analysis of ballast water samples have been developed that require study in the field to examine their utility for determining compliance with the International Convention for the Control and Management of Ships' Ballast Water and Sediments. To address this gap, a voyage was undertaken on board the RV Meteor, sailing the North Atlantic Ocean from Mindelo (Cape Verde) to Hamburg (Germany) during June 4-15, 2015. Trials were conducted on local sea water taken up by the ship's ballast system at multiple locations along the trip, including open ocean, North Sea, and coastal water, to evaluate a number of analytic methods that measure the numeric concentration or biomass of viable organisms in two size categories (≥ 50 μm in minimum dimension: 7 techniques; ≥ 10 μm and < 50 μm), alongside scientific approaches (e.g., flow cytometry). Several promising indicative methods were identified that showed high correlation with microscopy, but allow much quicker processing and require less expert knowledge. This study is the first to concurrently use a large number of analytic tools to examine a variety of ballast water samples on board an operational ship in the field. The results are useful for identifying the merits of each method and can serve as a basis for further improvement and development of tools and methodologies for ballast water compliance monitoring.

  18. Analytical solution of electrohydrodynamic flow and transport in rectangular channels: inclusion of double layer effects

    KAUST Repository

    Joekar-Niasar, V.

    2013-01-25

    Upscaling electroosmosis in porous media is a challenge due to the complexity and scale-dependent nonlinearities of this coupled phenomenon. "Pore-network modeling" for upscaling electroosmosis from pore scale to Darcy scale can be considered as a promising approach. However, this method requires analytical solutions for flow and transport at pore scale. This study concentrates on the development of analytical solutions of flow and transport in a single rectangular channel under combined effects of electrohydrodynamic forces. These relations will be used in future works for pore-network modeling. The analytical solutions are valid for all regimes of overlapping electrical double layers and have the potential to be extended to nonlinear Boltzmann distribution. The innovative aspects of this study are (a) contribution of overlapping of electrical double layers to the Stokes flow as well as Nernst-Planck transport has been carefully included in the analytical solutions. (b) All important transport mechanisms including advection, diffusion, and electromigration have been included in the analytical solutions. (c) Fully algebraic relations developed in this study can be easily employed to upscale electroosmosis to Darcy scale using pore-network modeling. © 2013 Springer Science+Business Media Dordrecht.
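    For orientation, the much simpler classical slit-channel result that the paper generalizes can be stated compactly. Under the Debye-Hückel (linearized Boltzmann) approximation for a channel of half-height $h$ (a standard textbook limit, not the paper's general solution), the electroosmotic velocity profile is

```latex
u(y) = -\frac{\varepsilon \zeta E_x}{\mu}
       \left( 1 - \frac{\cosh(\kappa y)}{\cosh(\kappa h)} \right),
```

    where $\zeta$ is the zeta potential, $E_x$ the applied axial field, $\mu$ the viscosity, and $\kappa^{-1}$ the Debye length. The paper's contribution goes beyond this linearized, one-dimensional form: rectangular geometry, all regimes of double-layer overlap, and coupling to Nernst-Planck transport.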

  19. Analytical solution of electrohydrodynamic flow and transport in rectangular channels: inclusion of double layer effects

    KAUST Repository

    Joekar-Niasar, V.; Schotting, R.; Leijnse, A.

    2013-01-01

    Upscaling electroosmosis in porous media is a challenge due to the complexity and scale-dependent nonlinearities of this coupled phenomenon. "Pore-network modeling" for upscaling electroosmosis from pore scale to Darcy scale can be considered as a promising approach. However, this method requires analytical solutions for flow and transport at pore scale. This study concentrates on the development of analytical solutions of flow and transport in a single rectangular channel under combined effects of electrohydrodynamic forces. These relations will be used in future works for pore-network modeling. The analytical solutions are valid for all regimes of overlapping electrical double layers and have the potential to be extended to nonlinear Boltzmann distribution. The innovative aspects of this study are (a) contribution of overlapping of electrical double layers to the Stokes flow as well as Nernst-Planck transport has been carefully included in the analytical solutions. (b) All important transport mechanisms including advection, diffusion, and electromigration have been included in the analytical solutions. (c) Fully algebraic relations developed in this study can be easily employed to upscale electroosmosis to Darcy scale using pore-network modeling. © 2013 Springer Science+Business Media Dordrecht.

  20. Making Mass Spectrometry See the Light: The Promises and Challenges of Cryogenic Infrared Ion Spectroscopy as a Bioanalytical Technique.

    Science.gov (United States)

    Cismesia, Adam P; Bailey, Laura S; Bell, Matthew R; Tesler, Larry F; Polfer, Nicolas C

    2016-05-01

    The detailed chemical information contained in the vibrational spectrum of a cryogenically cooled analyte ion would, in principle, make infrared (IR) ion spectroscopy a gold standard technique for molecular identification in mass spectrometry. Despite this immense potential, there are considerable challenges in both instrumentation and methodology to overcome before the technique is analytically useful. Here, we discuss the promise of IR ion spectroscopy for small molecule analysis in the context of metabolite identification. Experimental strategies to address sensitivity constraints, poor overall duty cycle, and speed of the experiment are intimately tied to the development of a mass-selective cryogenic trap. Therefore, the most likely avenues for success, in the authors' opinion, are presented here, alongside alternative approaches and some thoughts on data interpretation.

  1. When a desired home death does not occur: the consequences of broken promises.

    Science.gov (United States)

    Topf, Lorrianne; Robinson, Carole A; Bottorff, Joan L

    2013-08-01

    Evidence shows that most people prefer to die at home; however, the majority of expected deaths occur away from home. Although home deaths require family caregiver (FCG) commitment and care, we understand very little about their experiences in this context. The study's aim was to gain a better understanding of the experiences of FCGs when circumstances prevented a desired home death for a family member with advanced cancer. An interpretive description approach was used. Data collection involved semistructured interviews. Field notes and reflective journaling aided interpretive and analytical processes. The study was conducted in western Canada and included 18 bereaved FCGs. FCGs were committed to the promises made to care for their family member at home until death. These promises were challenged by a lack of preparedness for caregiving, difficulty accessing professional support and information, and frustration with the inadequate help they received. The events that precipitated dying family members leaving their home for hospital or hospice were unexpected and often influenced by FCGs' lack of situation-specific knowledge and ability to cope with complex caregiving responsibilities. FCGs found it extremely challenging to reconcile with breaking their promise to care at home until death and many were unable to do so. FCGs' despair about not being able to keep their promise for a home death was related to complicated bereavement. Prospective studies of the experiences of FCGs who are aiming for home deaths are needed to identify both short- and long-term interventions to effectively support death at home.

  2. Analytical Implications of Using Practice Theory in Workplace Information Literacy Research

    Science.gov (United States)

    Moring, Camilla; Lloyd, Annemaree

    2013-01-01

    Introduction: This paper considers practice theory and the analytical implications of using this theoretical approach in information literacy research. More precisely the aim of the paper is to discuss the translation of practice theoretical assumptions into strategies that frame the analytical focus and interest when researching workplace…

  3. Recovery of environmental analytes from clays and soils by supercritical fluid extraction/gas chromatography

    International Nuclear Information System (INIS)

    Emery, A.P.; Chesler, S.N.; MacCrehan, W.A.

    1992-01-01

    This paper reports on Supercritical Fluid Extraction (SFE), which promises to provide rapid extraction of organic analytes from environmental sample types without the use of hazardous solvents. In addition, SFE protocols using commercial instrumentation can be automated, lowering analysis costs. Because of these benefits, we are investigating SFE as an alternative to the solvent extraction (e.g., Soxhlet and sonication) techniques required in many EPA test procedures. SFE, using non-polar carbon dioxide as well as more polar supercritical fluids, was used to determine n-alkane hydrocarbons and polynuclear aromatic hydrocarbons (PAHs) in solid samples. The extraction behavior of these analyte classes from environmentally contaminated soil matrices and model soil and clay matrices was investigated using an SFE apparatus in which the extracted analytes were collected on a solid-phase trap and then selectively eluted with a solvent. The SFE conditions for quantitative recovery of n-alkane hydrocarbons in diesel fuel from a series of clays and soils were determined using materials prepared at the 0.02% level with diesel fuel oil, in order to simplify analyte collection and analysis after extraction. The effects of extraction parameters, including temperature, fluid flow rate, and modifier addition, were investigated by monitoring the amount of diesel fuel extracted as a function of time.

  4. Machine learning for Big Data analytics in plants.

    Science.gov (United States)

    Ma, Chuang; Zhang, Hao Helen; Wang, Xiangfeng

    2014-12-01

    Rapid advances in high-throughput genomic technology have enabled biology to enter the era of 'Big Data' (large datasets). The plant science community not only needs to build its own Big-Data-compatible parallel computing and data management infrastructures, but also to seek novel analytical paradigms to extract information from the overwhelming amounts of data. Machine learning offers promising computational and analytical solutions for the integrative analysis of large, heterogeneous and unstructured datasets on the Big-Data scale, and is gradually gaining popularity in biology. This review introduces the basic concepts and procedures of machine-learning applications and envisages how machine learning could interface with Big Data technology to facilitate basic research and biotechnology in the plant sciences. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Sample diagnosis using indicator elements and non-analyte signals for inductively coupled plasma mass spectrometry

    International Nuclear Information System (INIS)

    Antler, Margaret; Ying Hai; Burns, David H.; Salin, Eric D.

    2003-01-01

    A sample diagnosis procedure that uses both non-analyte and analyte signals to estimate matrix effects in inductively coupled plasma mass spectrometry is presented. Non-analyte signals are those of background species in the plasma (e.g., N+, ArO+), and changes in these signals can indicate changes in plasma conditions. Matrix effects of Al, Ba, Cs, K and Na on 19 non-analyte signals and 15 element signals were monitored. Multiple linear regression was used to build the prediction models, using a genetic algorithm for objective feature selection. Element signals and non-analyte signals were compared for diagnosing matrix effects, and both were found to be suitable for estimating matrix effects. Individual analyte matrix effect estimation was compared with overall matrix effect prediction, and the models used to diagnose overall matrix effects were more accurate than the individual analyte models. In previous work [Spectrochim. Acta Part B 57 (2002) 277], we tested models for analytical decision making. The current models were tested in the same way and were able to successfully diagnose matrix effects with at least an 80% success rate.
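    The multiple-linear-regression step can be sketched as follows. The data are synthetic stand-ins for the monitored signal intensities, and the genetic-algorithm feature selection is omitted:

```python
import numpy as np

# Synthetic example: predict the matrix-induced change in an analyte signal
# from a handful of background ("non-analyte") plasma signal intensities.
rng = np.random.default_rng(0)
n_samples, n_signals = 30, 5
X = rng.normal(size=(n_samples, n_signals))       # non-analyte intensities
true_coef = np.array([0.8, -0.3, 0.5, 0.0, 0.1])  # hidden relationship
y = X @ true_coef + rng.normal(scale=0.05, size=n_samples)  # matrix effect

# Ordinary least squares fit with an intercept column appended.
A = np.column_stack([X, np.ones(n_samples)])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
predicted = A @ coef   # diagnosed matrix effect for each sample
```

    A new sample's non-analyte signals would be fed through the fitted model to decide, before quantification, whether matrix effects are severe enough to require corrective action.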

  6. Sharing the Data along with the Responsibility: Examining an Analytic Scale-Based Model for Assessing School Climate.

    Science.gov (United States)

    Shindler, John; Taylor, Clint; Cadenas, Herminia; Jones, Albert

    This study was a pilot effort to examine the efficacy of an analytic trait scale school climate assessment instrument and democratic change system in two urban high schools. Pilot study results indicate that the instrument shows promising soundness in that it exhibited high levels of validity and reliability. In addition, the analytic trait format…

  7. Accurate quantification of endogenous androgenic steroids in cattle's meat by gas chromatography mass spectrometry using a surrogate analyte approach

    Energy Technology Data Exchange (ETDEWEB)

    Ahmadkhaniha, Reza; Shafiee, Abbas [Department of Medicinal Chemistry, Faculty of Pharmacy and Pharmaceutical Sciences Research Center, Tehran University of Medical Sciences, Tehran 14174 (Iran, Islamic Republic of); Rastkari, Noushin [Center for Environmental Research, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Kobarfard, Farzad [Department of Medicinal Chemistry, School of Pharmacy, Shaheed Beheshti University of Medical Sciences, Tavaneer Ave., Valieasr St., Tehran (Iran, Islamic Republic of)], E-mail: farzadkf@yahoo.com

    2009-01-05

    Determination of endogenous steroids in complex matrices such as cattle's meat is a challenging task. Since endogenous steroids always exist in animal tissues, no analyte-free matrices for constructing the standard calibration line are available, which is crucial for accurate quantification, especially at trace levels. Although some methods have been proposed to solve the problem, none has offered a complete solution. To this aim, a new quantification strategy was developed in this study, named the 'surrogate analyte approach', which is based on using isotope-labeled standards instead of the natural form of endogenous steroids for preparing the calibration line. In comparison with the other methods currently in use for the quantitation of endogenous steroids, this approach provides improved simplicity and speed for analysis on a routine basis. The accuracy of this method is better than that of other methods at low concentrations and comparable to that of standard addition at medium and high concentrations. The method was also found to be valid according to the ICH criteria for bioanalytical methods. The developed method could be a promising approach in the field of compound residue analysis.
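    The surrogate-analyte idea can be sketched numerically: build the calibration line with an isotope-labeled analogue (which is absent from real tissue, so blank matrix can be spiked), then quantify the natural analyte against it. The concentrations and responses below are illustrative, and equal response factors for the labeled and unlabeled forms are assumed:

```python
import numpy as np

# Spiked concentrations of the isotope-labeled (surrogate) standard, ng/g,
# and the instrument response measured at the labeled analyte's m/z.
conc_labeled = np.array([0.0, 1.0, 2.0, 5.0, 10.0])
resp_labeled = np.array([0.02, 1.05, 2.01, 4.97, 10.10])

# Ordinary least-squares calibration line: response = slope * conc + intercept
slope, intercept = np.polyfit(conc_labeled, resp_labeled, 1)

# Response of the natural (unlabeled) endogenous analyte in a meat sample,
# quantified against the surrogate calibration line.
resp_sample = 3.5
conc_sample = (resp_sample - intercept) / slope   # ng/g
```

    Because the labeled form never occurs naturally, the zero-concentration calibration point is a true blank, which is what makes the standard line possible despite the analyte being endogenous.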

  8. Medical big data: promise and challenges.

    Science.gov (United States)

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis such as hypothesis generation rather than hypothesis testing. Big data focuses on the temporal stability of associations rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are distinct not only from the big data of other disciplines but also from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality owing to residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of the practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.

  9. Medical big data: promise and challenges

    Directory of Open Access Journals (Sweden)

    Choong Ho Lee

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes aspects of data analysis such as hypothesis generation rather than hypothesis testing. Big data focuses on the temporal stability of associations rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are distinct not only from the big data of other disciplines but also from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, the curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality owing to residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of the practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.

  10. Affordances and Limitations of Learning Analytics for Computer-Assisted Language Learning: A Case Study of the VITAL Project

    Science.gov (United States)

    Gelan, Anouk; Fastré, Greet; Verjans, Martine; Martin, Niels; Janssenswillen, Gert; Creemers, Mathijs; Lieben, Jonas; Depaire, Benoît; Thomas, Michael

    2018-01-01

    Learning analytics (LA) has emerged as a field that offers promising new ways to prevent drop-out and aid retention. Research also suggests that large datasets of learner activity can be used to understand online learning behaviour and improve pedagogy. While the use of LA in language learning has received little attention to date,…

  11. Neuroimaging in psychiatric pharmacogenetics research: the promise and pitfalls.

    Science.gov (United States)

    Falcone, Mary; Smith, Ryan M; Chenoweth, Meghan J; Bhattacharjee, Abesh Kumar; Kelsoe, John R; Tyndale, Rachel F; Lerman, Caryn

    2013-11-01

    The integration of research on neuroimaging and pharmacogenetics holds promise for improving treatment for neuropsychiatric conditions. Neuroimaging may provide a more sensitive early measure of treatment response in genetically defined patient groups, and could facilitate development of novel therapies based on an improved understanding of pathogenic mechanisms underlying pharmacogenetic associations. This review summarizes progress in efforts to incorporate neuroimaging into genetics and treatment research on major psychiatric disorders, such as schizophrenia, major depressive disorder, bipolar disorder, attention-deficit/hyperactivity disorder, and addiction. Methodological challenges include: performing genetic analyses in small study populations used in imaging studies; inclusion of patients with psychiatric comorbidities; and the extensive variability across studies in neuroimaging protocols, neurobehavioral task probes, and analytic strategies. Moreover, few studies use pharmacogenetic designs that permit testing of genotype × drug effects. As a result of these limitations, few findings have been fully replicated. Future studies that pre-screen participants for genetic variants selected a priori based on drug metabolism and targets have the greatest potential to advance the science and practice of psychiatric treatment.

  12. Evaluating Modeling Sessions Using the Analytic Hierarchy Process

    NARCIS (Netherlands)

    Ssebuggwawo, D.; Hoppenbrouwers, S.J.B.A.; Proper, H.A.; Persson, A.; Stirna, J.

    2008-01-01

    In this paper, which is methodological in nature, we propose to use an established method from the field of Operations Research, the Analytic Hierarchy Process (AHP), in the integrated, stakeholder-oriented evaluation of enterprise modeling sessions: their language, process, tool (medium), and

  13. Trace detection of analytes using portable raman systems

    Science.gov (United States)

    Alam, M. Kathleen; Hotchkiss, Peter J.; Martin, Laura E.; Jones, David Alexander

    2015-11-24

    Apparatuses and methods for in situ detection of a trace amount of an analyte are disclosed herein. In a general embodiment, the present disclosure provides a surface-enhanced Raman spectroscopy (SERS) insert including a passageway therethrough, where the passageway has a SERS surface positioned therein. The SERS surface is configured to adsorb molecules of an analyte of interest. A concentrated sample is caused to flow over the SERS surface. The SERS insert is then provided to a portable Raman spectroscopy system, where it is analyzed for the analyte of interest.

  14. Mastering JavaScript promises

    CERN Document Server

    Hussain, Muzzamil

    2015-01-01

    This book is for all software and web engineers who want to apply the promises paradigm to their next project and get the best outcome from it. It also serves as a reference for engineers who are already using promises in their projects and want to improve their current knowledge to reach the next level. To get the most benefit from this book, you should know basic programming concepts, be familiar with JavaScript, and have a good understanding of HTML.
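
    The promises paradigm the book covers can be illustrated with a minimal sketch: a chain that transforms a resolved value and recovers from rejection in a single place. The names and values below are invented for illustration and are not taken from the book.

```typescript
// Minimal sketch of the promises paradigm: chaining, error
// propagation, and recovery. All names/values are illustrative.
function fetchScore(user: string): Promise<number> {
  // Simulate an asynchronous lookup that fails for unknown users.
  return new Promise((resolve, reject) => {
    if (user === "alice") resolve(40);
    else reject(new Error(`unknown user: ${user}`));
  });
}

function scoreWithBonus(user: string): Promise<number> {
  return fetchScore(user)
    .then((score) => score + 2) // transform the resolved value
    .catch(() => 0);            // recover from any earlier rejection
}
```

    Here `scoreWithBonus("alice")` resolves to 42, while an unknown user's rejection falls through the chain to the single `catch` and resolves to 0.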

  15. Homogenized blocked arcs for multicriteria optimization of radiotherapy: Analytical and numerical solutions

    International Nuclear Information System (INIS)

    Fenwick, John D.; Pardo-Montero, Juan

    2010-01-01

    appropriate and useful for computing homogenized blocked arcs, as it produces better dose-distributions than the analytic approaches and their obvious extensions, and can more straightforwardly be used to generate homogenized arcs for concave OARs. However, the analytical solutions provide promising starting points for the iterative algorithm, leading to fast convergence.

  16. Spectral interference of zirconium on 24 analyte elements using CCD based ICP-AES technique

    International Nuclear Information System (INIS)

    Adya, V.C.; Sengupta, Arijit; Godbole, S.V.

    2014-01-01

    In the present study, the spectral interference of zirconium on the analytical lines of 24 critical analytes is described using a CCD-based ICP-AES technique. Suitable analytical lines for zirconium were identified along with their detection limits. The sensitivity and detection limits of the analytical channels for different elements in the presence of a Zr matrix were calculated. Subsequently, analytical lines with the least interference from Zr and better detection limits were selected for the determinations. (author)

  17. Pitfalls and Promises: The Use of Secondary Data Analysis in Educational Research

    Science.gov (United States)

    Smith, Emma

    2008-01-01

    This paper considers the use of secondary data analysis in educational research. It addresses some of the promises and potential pitfalls that influence its use and explores a possible role for the secondary analysis of numeric data in the "new" political arithmetic tradition of social research. Secondary data analysis is a relatively under-used…

  18. Do promises matter? An exploration of the role of promises in psychological contract breach.

    Science.gov (United States)

    Montes, Samantha D; Zweig, David

    2009-09-01

    Promises are positioned centrally in the study of psychological contract breach and are argued to distinguish psychological contracts from related constructs, such as employee expectations. However, because the effects of promises and delivered inducements are confounded in most research, the role of promises in perceptions of, and reactions to, breach remains unclear. If promises are not an important determinant of employee perceptions, emotions, and behavioral intentions, this would suggest that the psychological contract breach construct might lack utility. To assess the unique role of promises, the authors manipulated promises and delivered inducements separately in hypothetical scenarios in Studies 1 (558 undergraduates) and 2 (441 employees), and they measured them separately (longitudinally) in Study 3 (383 employees). The authors' results indicate that breach perceptions do not represent a discrepancy between what employees believe they were promised and were given. In fact, breach perceptions can exist in the absence of promises. Further, promises play a negligible role in predicting feelings of violation and behavioral intentions. Contrary to the extant literature, the authors' findings suggest that promises may matter little; employees are concerned primarily with what the organization delivers.

  19. Mapping debris flow susceptibility using analytical network process ...

    Indian Academy of Sciences (India)

    Evangelin Ramani Sujatha

    2017-11-23

    The analytical network process (ANP) is used to map debris flow susceptibility.

  20. Factors Influencing Attitudes Towards the Use of CRM’s Analytical Tools in Organizations

    Directory of Open Access Journals (Sweden)

    Šebjan Urban

    2016-02-01

    Full Text Available Background and Purpose: Information solutions for analytical customer relationship management (aCRM IS) that include the use of analytical tools are becoming increasingly important, due to organizations’ need for knowledge of their customers and the ability to manage big data. The objective of the research is, therefore, to determine how the organizations’ orientations (process, innovation, and technology) as critical organizational factors affect the attitude towards the use of the analytical tools of aCRM IS.

  1. Analytical characterization of high-level mixed wastes using multiple sample preparation treatments

    International Nuclear Information System (INIS)

    King, A.G.; Baldwin, D.L.; Urie, M.W.; McKinley, S.G.

    1994-01-01

    The Analytical Chemistry Laboratory at the Pacific Northwest Laboratory in Richland, Washington, is actively involved in performing analytical characterization of high-level mixed waste from Hanford's single shell and double shell tank characterization programs. A full suite of analyses is typically performed on homogenized tank core samples. These analytical techniques include inductively-coupled plasma-atomic emission spectroscopy, total organic carbon methods and radiochemistry methods, as well as many others, all requiring some type of remote sample-preparation treatment to solubilize the tank sludge material for analysis. Most of these analytical methods typically use a single sample-preparation treatment, inherently providing elemental information only. To better understand and interpret tank chemistry and assist in identifying chemical compounds, selected analytical methods are performed using multiple sample-preparation treatments. The sample-preparation treatments used at Pacific Northwest Laboratory for this work with high-level mixed waste include caustic fusion, acid digestion, and water leach. The type of information available by comparing results from different sample-preparation treatments includes evidence for the presence of refractory compounds, acid-soluble compounds, or water-soluble compounds. Problems unique to the analysis of Hanford tank wastes are discussed. Selected results from the Hanford single shell ferrocyanide tank, 241-C-109, are presented, and the resulting conclusions are discussed.

  2. Combination of Cyclodextrin and Ionic Liquid in Analytical Chemistry: Current and Future Perspectives.

    Science.gov (United States)

    Hui, Boon Yih; Raoov, Muggundha; Zain, Nur Nadhirah Mohamad; Mohamad, Sharifah; Osman, Hasnah

    2017-09-03

    The growing popularity of cyclodextrins (CDs) and ionic liquids (ILs) as promising materials in the field of analytical chemistry has resulted in an exponential increase in their exploitation and production. CDs belong to the family of cyclic oligosaccharides composed of α-(1,4)-linked glucopyranose subunits and possess a cage-like supramolecular structure. This structure enables chemical reactions to proceed between interacting ions, radicals, or molecules in the absence of covalent bonds. ILs, by contrast, are ionic fluids comprising only cations and anions, often with immeasurably low vapor pressure, making them green or "designer" solvents. The cooperative effect between CDs and ILs, arising from their fascinating properties, has contributed to important developments in analytical chemistry. This comprehensive review gives an overview of recent studies and analytical trends in the application of CDs combined with ILs, which have shown beneficial and remarkable effects in analytical chemistry, including their use in various sample preparation techniques such as solid-phase extraction, magnetic solid-phase extraction, cloud point extraction, and microextraction, and in separation techniques including gas chromatography, high-performance liquid chromatography, and capillary electrophoresis, as well as applications of electrochemical sensors as electrode modifiers, with references to recent applications. This review highlights the nature of the interactions and synergic effects between CDs, ILs, and analytes. It is hoped that this review will stimulate further research in analytical chemistry.

  3. PROGRESSIVE DATA ANALYTICS IN HEALTH INFORMATICS USING AMAZON ELASTIC MAPREDUCE (EMR)

    Directory of Open Access Journals (Sweden)

    J S Shyam Mohan

    2016-04-01

    Full Text Available Identifying, diagnosing, and treating cancer involves a thorough investigation and the collection of data, called big data, from multiple and different sources, which supports effective and quick decision making. Similarly, data analytics is used to find remedial actions for newly arriving diseases spread across multiple warehouses. Analytics can be performed on collected or available data from various data clusters that contain pieces of data. We provide an effective framework for decision making using Amazon EMR. Through various experiments on different biological datasets, we reveal the advantages of the proposed model and present numerical results. These results indicate that the proposed framework can efficiently perform analytics over any biological dataset and obtain results in optimal time, thereby maintaining the quality of the results.

  4. 77 FR 56176 - Analytical Methods Used in Periodic Reporting

    Science.gov (United States)

    2012-09-12

    ... informal rulemaking proceeding to consider changes in analytical principles (Proposals Six and Seven) used... (Proposals Six and Seven), September 4, 2012 (Petition). Proposal Six: Use of Foreign Postal Settlement System as Sole Source for Reporting of Inbound International Revenue, Pieces, and Weights. The Postal...

  5. Biofluid infrared spectro-diagnostics: pre-analytical considerations for clinical applications.

    Science.gov (United States)

    Lovergne, L; Bouzy, P; Untereiner, V; Garnotel, R; Baker, M J; Thiéfin, G; Sockalingum, G D

    2016-06-23

    Several proof-of-concept studies on the vibrational spectroscopy of biofluids have demonstrated that the methodology has promising potential as a clinical diagnostic tool. However, these studies also show that there is a lack of a standardised protocol in sample handling and preparation prior to spectroscopic analysis. One of the most important sources of analytical errors is the pre-analytical phase. For the technique to be translated into clinics, it is clear that a very strict protocol needs to be established for such biological samples. This study focuses on some of the aspects of the pre-analytical phase in the development of the high-throughput Fourier Transform Infrared (FTIR) spectroscopy of some of the most common biofluids such as serum, plasma and bile. Pre-analytical considerations that can impact either the samples (solvents, anti-coagulants, freeze-thaw cycles…) and/or spectroscopic analysis (sample preparation such as drying, deposit methods, volumes, substrates, operators dependence…) and consequently the quality and the reproducibility of spectral data will be discussed in this report.

  6. BIG DATA ANALYTICS USE IN CUSTOMER RELATIONSHIP MANAGEMENT: ANTECEDENTS AND PERFORMANCE IMPLICATIONS

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas

    2016-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data analytics use (BD use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study aims to (1) determine whether organizational BD use improves customer-centric and financial outcomes, and (2) identify the factors influencing BD use. Drawing primarily from market...

  7. Seamless Digital Environment - Plan for Data Analytics Use Case Study

    International Nuclear Information System (INIS)

    Oxstrand, Johanna Helene; Bly, Aaron Douglas

    2016-01-01

    The U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program initiated research into what is needed to provide a roadmap or model for nuclear power plants to reference when building an architecture that can support the growing data supply and demand flowing through their networks. The Digital Architecture project's report, Digital Architecture Planning Model (Oxstrand et al., 2016), discusses what to consider when building an architecture to support the increasing needs and demands for data throughout the plant. Once the plant is able to support the data demands, it still needs to be able to provide the data in an easy, quick, and reliable manner. A common method is to create a "one stop shop" application that a user can go to for all the data they need. This leads to the need to create a Seamless Digital Environment (SDE) to integrate all the "siloed" data. An SDE is the desired perception that should be presented to users by gathering the data from any data source (e.g., legacy applications and work management systems) without effort by the user. The goal for FY16 was to complete a feasibility study on data mining and analytics for employing information from computer-based procedure enabled technologies for use in developing improved business analytics. The research team collaborated with multiple organizations to identify use cases or scenarios that could be beneficial to investigate in a feasibility study. Many interesting potential use cases were identified throughout the FY16 activity. Unfortunately, due to factors outside the research team's control, none of the studies were initiated this year. However, the insights gained and the relationships built with both PVNGS and NextAxiom will be valuable when moving forward with future research. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting it was identified would be very beneficial to the industry to

  8. Elicited vs. voluntary promises

    NARCIS (Netherlands)

    Ismayilov, H.; Potters, Jan

    2017-01-01

    We set up an experiment with pre-play communication to study the impact of promise elicitation by trustors from trustees on trust and trustworthiness. When given the opportunity, a majority of trustors solicit a promise from the trustee. This drives up the promise-making rate by trustees to almost

  9. The use of analytical procedures in the internal audit of the restaurant business expenses

    Directory of Open Access Journals (Sweden)

    T.Yu. Kopotienko

    2015-06-01

    Full Text Available An important task in carrying out the internal audit of expenses is to obtain sufficient and reliable audit evidence. This can be achieved by using analytical procedures in the audit process. Identifying the analytical procedures with the financial analysis of business activities prevents their efficient use in the internal audit of restaurant business expenses. Internal auditors' knowledge of the techniques of analytical procedures and their tasks, depending on the verification steps, is insufficient. The purpose of the article is to develop methods for the internal audit of restaurant business expenses based on the integrated application of analytical procedures. The nature and purpose of analytical procedures are investigated in the article. The factors influencing the auditor's choice of a complex of analytical procedures have been identified. It is recommended to consider among them the purpose of the analytical procedures, the type and structure of the enterprise, the sources of available information, the existence of financial and non-financial information, and the reliability and comparability of the available information. The tasks of analytical procedures have been identified depending on the verification steps. A complex of analytical procedures is offered as part of the internal audit of restaurant business expenses. This complex contains a list of the analytical procedures, the analysis techniques used in each procedure, and a brief overview of the content of each procedure.

  10. Big Data Analytics Solutions: The Implementation Challenges in the Financial Services Industry

    Science.gov (United States)

    Ojo, Michael O.

    2016-01-01

    The challenges of Big Data (BD) and Big Data Analytics (BDA) have attracted disproportionately less attention than the overwhelmingly espoused benefits and game-changing promises. While many studies have examined BD challenges across multiple industry verticals, very few have focused on the challenges of implementing BDA solutions. Fewer of these…

  11. Phosphorus-containing azo compounds as analytical reagents for beryllium

    International Nuclear Information System (INIS)

    Lisenko, N.F.; Dolzhnikova, E.N.; Petrova, G.S.; Tsvetkov, E.N.; Vsesoyuznyj Nauchno-Issledovatel'skij Inst. Khimicheskikh Reaktivov i Osobo Chistykh Veshchestv, Moscow; AN SSSR, Moscow. Inst. Ehlementoorganicheskikh Soedinenij)

    1979-01-01

    The interaction of beryllium with six new azo compounds based on chromotropic or R-acids and o-aminophenyl-phenylphosphonic acids is studied. A sharp difference in the detection limit for beryllium between the two groups of compounds is found. Azo derivatives based on chromotropic acid are promising agents for beryllium due to their sufficiently high selectivity. The introduction of a methyl group into the o-position of the phosphorus-containing group improves the analytical properties of the agents. Techniques are developed for the determination of beryllium in bronze, sewage water, and an artificial mixture using a sodium salt of 1.8-dioxi-2 [2' - (oxi- (o-methylphenyl)-phosphenyl)-phenilazo]-naphtalene-3.6-disulfoacid

  12. GANViz: A Visual Analytics Approach to Understand the Adversarial Game.

    Science.gov (United States)

    Wang, Junpeng; Gou, Liang; Yang, Hao; Shen, Han-Wei

    2018-06-01

    Generative models hold promise for learning data representations in an unsupervised fashion with deep learning. Generative Adversarial Nets (GAN) is one of the most popular frameworks in this arena. Despite the promising results from different types of GANs, in-depth understanding of the adversarial training process of the models remains a challenge to domain experts. The complexity and the potentially long training process of the models make them hard to evaluate, interpret, and optimize. In this work, guided by practical needs from domain experts, we design and develop a visual analytics system, GANViz, aiming to help experts understand the adversarial process of GANs in depth. Specifically, GANViz evaluates the model performance of the two subnetworks of GANs, provides evidence and interpretations of the models' performance, and empowers comparative analysis with that evidence. Through our case studies with two real-world datasets, we demonstrate that GANViz can provide useful insight into helping domain experts understand, interpret, evaluate, and potentially improve GAN models.

  13. Metal-organic frameworks for analytical chemistry: from sample collection to chromatographic separation.

    Science.gov (United States)

    Gu, Zhi-Yuan; Yang, Cheng-Xiong; Chang, Na; Yan, Xiu-Ping

    2012-05-15

    In modern analytical chemistry researchers pursue novel materials to meet analytical challenges such as improvements in sensitivity, selectivity, and detection limit. Metal-organic frameworks (MOFs) are an emerging class of microporous materials, and their unusual properties such as high surface area, good thermal stability, uniform structured nanoscale cavities, and the availability of in-pore functionality and outer-surface modification are attractive for diverse analytical applications. This Account summarizes our research on the analytical applications of MOFs ranging from sampling to chromatographic separation. MOFs have been either directly used or engineered to meet the demands of various analytical applications. Bulk MOFs with microsized crystals are convenient sorbents for direct application to in-field sampling and solid-phase extraction. Quartz tubes packed with MOF-5 have shown excellent stability, adsorption efficiency, and reproducibility for in-field sampling and trapping of atmospheric formaldehyde. The 2D copper(II) isonicotinate packed microcolumn has demonstrated large enhancement factors and good shape- and size-selectivity when applied to on-line solid-phase extraction of polycyclic aromatic hydrocarbons in water samples. We have explored the molecular sieving effect of MOFs for the efficient enrichment of peptides with simultaneous exclusion of proteins from biological fluids. These results show promise for the future of MOFs in peptidomics research. Moreover, nanosized MOFs and engineered thin films of MOFs are promising materials as novel coatings for solid-phase microextraction. We have developed an in situ hydrothermal growth approach to fabricate thin films of MOF-199 on etched stainless steel wire for solid-phase microextraction of volatile benzene homologues with large enhancement factors and wide linearity. Their high thermal stability and easy-to-engineer nanocrystals make MOFs attractive as new stationary phases to fabricate MOF

  14. Learning Analytics and Digital Badges: Potential Impact on Student Retention in Higher Education

    Science.gov (United States)

    Mah, Dana-Kristin

    2016-01-01

    Learning analytics and digital badges are emerging research fields in educational science. They both show promise for enhancing student retention in higher education, where withdrawals prior to degree completion remain at about 30% in Organisation for Economic Cooperation and Development member countries. This integrative review provides an…

  15. Review of analytical models to stream depletion induced by pumping: Guide to model selection

    Science.gov (United States)

    Huang, Ching-Sheng; Yang, Tao; Yeh, Hund-Der

    2018-06-01

    Stream depletion due to groundwater extraction by wells may impact aquatic ecosystems in streams, cause conflict over water rights, and contaminate water drawn from irrigation wells near polluted streams. A variety of studies have been devoted to the issue of stream depletion, but a fundamental framework for analytical modeling developed from the aquifer viewpoint has not yet been established. This review shows key differences among existing models of the stream depletion problem and provides some guidelines for choosing a proper analytical model for the problem of concern. We introduce commonly used models composed of flow equations, boundary conditions, well representations, and stream treatments for confined, unconfined, and leaky aquifers. They are briefly evaluated and classified according to six categories: aquifer type, flow dimension, aquifer domain, stream representation, stream channel geometry, and well type. Finally, we recommend promising analytical approaches that can solve the stream depletion problem in reality, with aquifer heterogeneity and irregular stream channel geometry. Several unsolved stream depletion problems are also identified.
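
    One classical member of the model family reviewed here, the Glover–Balmer-type solution for a fully penetrating stream in a homogeneous aquifer, can be sketched as follows. The erfc term is evaluated with a standard Abramowitz–Stegun approximation of erf; function names and parameter values are illustrative, not from the review.

```typescript
// Glover–Balmer-type analytical solution: the fraction of well
// pumping supplied by stream depletion for a fully penetrating
// stream in a homogeneous aquifer is
//   q/Q = erfc( sqrt( d^2 * S / (4 * T * t) ) )
// d: well-to-stream distance, S: storativity [-],
// T: transmissivity, t: time since pumping began (consistent units).

// Abramowitz–Stegun 7.1.26 approximation of erf (|error| < 1.5e-7).
function erf(x: number): number {
  const sign = x < 0 ? -1 : 1;
  x = Math.abs(x);
  const t = 1 / (1 + 0.3275911 * x);
  const poly =
    ((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t -
      0.284496736) * t + 0.254829592) * t;
  return sign * (1 - poly * Math.exp(-x * x));
}

function depletionFraction(d: number, S: number, T: number, t: number): number {
  return 1 - erf(Math.sqrt((d * d * S) / (4 * T * t))); // erfc(u)
}
```

    The fraction starts near zero at early time and approaches one as pumping continues, which is the qualitative behaviour all the reviewed stream-depletion models share.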

  16. Procedure for hazards analysis of plutonium gloveboxes used in analytical chemistry operations

    International Nuclear Information System (INIS)

    Delvin, W.L.

    1977-06-01

    A procedure is presented to identify and assess hazards associated with gloveboxes used for analytical chemistry operations involving plutonium. This procedure is based upon analytic tree methodology, and it has been adapted from the US Energy Research and Development Administration's safety program, the Management Oversight and Risk Tree.

  17. Business analytics a practitioner's guide

    CERN Document Server

    Saxena, Rahul

    2013-01-01

    This book provides a guide to businesses on how to use analytics to help drive from ideas to execution. Analytics used in this way provides "full lifecycle support" for business and helps during all stages of management decision-making and execution.The framework presented in the book enables the effective interplay of business, analytics, and information technology (business intelligence) both to leverage analytics for competitive advantage and to embed the use of business analytics into the business culture. It lays out an approach for analytics, describes the processes used, and provides gu

  18. Binding assays with streptavidin-functionalized superparamagnetic nanoparticles and biotinylated analytes using fluxgate magnetorelaxometry

    International Nuclear Information System (INIS)

    Heim, Erik; Ludwig, Frank; Schilling, Meinhard

    2009-01-01

    Binding assays based on the magnetorelaxation of superparamagnetic nanoparticles as markers are presented utilizing a differential fluxgate system. As ligand and receptor, streptavidin and biotin, respectively, are used. Superparamagnetic nanoparticles are functionalized with streptavidin and bound to two types of biotinylated analytes: agarose beads and bovine serum (BSA) proteins. The size difference of the two analytes causes a different progress of the reaction. As a consequence, the analysis of the relaxation signal is carried out dissimilarly for the two analytes. In addition, we studied the reaction kinetics of the two kinds of analytes with the fluxgate system.

  19. Understanding Business Analytics

    Science.gov (United States)

    2015-01-05

    Analytics have been used in organizations for a variety of reasons for quite some time, ranging from the simple (generating and understanding business analytics) ... How well these two components are orchestrated will determine the level of success an organization has in

  20. Multiattribute Supplier Selection Using Fuzzy Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Serhat Aydin

    2010-11-01

    Full Text Available Supplier selection is a multiattribute decision making (MADM) problem which contains both qualitative and quantitative factors. Supplier selection has vital importance for most companies. The aim of this paper is to provide an AHP-based analytical tool for decision support, enabling an effective multicriteria supplier selection process in an air conditioner seller firm under fuzziness. In this article, the Analytic Hierarchy Process (AHP) under fuzziness is employed because it permits an evaluation scale including linguistic expressions, crisp numerical values, fuzzy numbers, and range numerical values. This scale provides a more flexible evaluation compared with other fuzzy AHP methods. In this study, the modified AHP was used for supplier selection in an air conditioner firm. Three experts evaluated the suppliers according to the proposed model, and the most appropriate supplier was selected. The proposed model enables decision makers to select the best supplier among supplier firms effectively. We confirm that the modified fuzzy AHP is appropriate for group decision making in supplier selection problems.
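
    The crisp AHP computation that fuzzy variants extend can be sketched with the geometric-mean method for deriving a priority vector from a reciprocal pairwise-comparison matrix. The criteria and Saaty-scale judgments below are invented for illustration and are not taken from the article.

```typescript
// Crisp AHP priority vector via the geometric-mean (row) method.
// A[i][j] holds the pairwise judgment "criterion i vs criterion j";
// the matrix must be reciprocal: A[j][i] = 1 / A[i][j].
function ahpPriorities(A: number[][]): number[] {
  const n = A.length;
  // The geometric mean of each row approximates the principal
  // eigenvector of a near-consistent comparison matrix.
  const gm = A.map((row) =>
    Math.pow(row.reduce((prod, x) => prod * x, 1), 1 / n)
  );
  const total = gm.reduce((sum, x) => sum + x, 0);
  return gm.map((x) => x / total); // normalize so weights sum to 1
}

// Illustrative judgments: price 3x as important as quality and
// 5x as important as delivery; quality 2x as important as delivery.
const weights = ahpPriorities([
  [1, 3, 5],
  [1 / 3, 1, 2],
  [1 / 5, 1 / 2, 1],
]);
```

    For these judgments the weights come out to roughly 0.65, 0.23, and 0.12; in fuzzy AHP the crisp judgments are replaced by fuzzy numbers before a similar aggregation step.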

  1. Determination of uranium in ground water using different analytical techniques

    International Nuclear Information System (INIS)

    Sahu, S.K.; Maity, Sukanta; Bhangare, R.C.; Pandit, G.G.; Sharma, D.N.

    2014-10-01

    The concern over presence of natural radionuclides like uranium in drinking water is growing recently. The contamination of aquifers with radionuclides depends on number of factors. The geology of an area is the most important factor along with anthropogenic activities like mining, coal ash disposal from thermal power plants, use of phosphate fertilizers etc. Whatever may be the source, the presence of uranium in drinking waters is a matter of great concern for public health. Studies show that uranium is a chemo-toxic and nephrotoxic heavy metal. This chemotoxicity affects the kidneys and bones in particular. Seeing the potential health hazards from natural radionuclides in drinking water, many countries worldwide have adopted the guideline activity concentration for drinking water quality recommended by the WHO (2011). For uranium, WHO has set a limit of 30μgL-1 in drinking water. The geological distribution of uranium and its migration in environment is of interest because the element is having environmental and exposure concerns. It is of great interest to use an analytical technique for uranium analysis in water which is highly sensitive especially at trace levels, specific and precise in presence of other naturally occurring major and trace metals and needs small amount of sample. Various analytical methods based on the use of different techniques have been developed in the past for the determination of uranium in the geological samples. The determination of uranium requires high selectivity due to its strong association with other elements. Several trace level wet chemistry analytical techniques have been reported for uranium determination, but most of these involve tedious and pain staking procedures, high detection limits, interferences etc. Each analytical technique has its own merits and demerits. Comparative assessment by different techniques can provide better quality control and assurance. 
In the present study, uranium was analysed in ground water samples

  2. Analytical use of electron accelerators

    International Nuclear Information System (INIS)

    Kapitsa, S.P.; Chapyzhnikov, B.A.; Firsov, V.I.; Samosyuk, V.N.; Tsipenyuk, Y.M.

    1985-01-01

    After detailed investigation, the authors conclude that the newest electron accelerators provide good scope for gamma activation and also for producing neutrons for neutron activation. These accelerators are simpler and safer than reactors, provide fairly homogeneous irradiation of substantial volumes, and offer determination speed and sensitivity as their main advantages. The limits of detection and the reproducibility are sufficient to handle a wide range of tasks, and the selectivity provides exceptional analysis capabilities for the demanding problems analysts currently face. The record-setting examples should not be taken as exceptions: activation analysis based on electron accelerators opens up essentially universal scope for analyzing all elements at the concentrations and accuracies currently required, which should lead to its extensive use in analytical practice in the foreseeable future. The authors indicate that recognition of these possibilities governs the general use of these methods and the employment of current efficient fast-electron sources to implement them

  3. Analyticity without Differentiability

    Science.gov (United States)

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…
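The starting point of such a treatment can be sketched as follows (a standard formulation of the power-series definition, not quoted from the article):

```latex
% f is analytic at a if it is locally the sum of a convergent power series:
f(x) \;=\; \sum_{n=0}^{\infty} c_n (x-a)^n, \qquad |x-a| < \rho .
% The "derivative" is then introduced purely formally, term by term,
f'(x) \;:=\; \sum_{n=1}^{\infty} n\, c_n (x-a)^{n-1},
% so sums, products, compositions and inverses of analytic functions can
% be handled by combinatorial manipulation of the coefficients alone.
```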

  4. Transforming Undergraduate Education Through the use of Analytical Reasoning (TUETAR)

    Science.gov (United States)

    Bishop, M. P.; Houser, C.; Lemmons, K.

    2015-12-01

    Traditional learning limits the potential for self-discovery and for using data and knowledge to understand Earth system relationships, processes, feedback mechanisms, and system coupling. It is extremely difficult for undergraduate students to analyze, synthesize, and integrate quantitative information related to complex systems, as many concepts may not be mathematically tractable or may not yet have been formalized. Conceptual models have long served as a means for Earth scientists to organize their understanding of Earth's dynamics, and as a basis for human analytical reasoning and landscape interpretation. Consequently, we evaluated the use of conceptual modeling, knowledge representation, and analytical reasoning to give undergraduate students an opportunity to develop and test geocomputational conceptual models based upon their understanding of Earth science concepts. This study describes the use of geospatial technologies and fuzzy cognitive maps to predict desertification across the South-Texas Sandsheet in an upper-level geomorphology course. Students developed conceptual models based on their understanding of aeolian processes from lectures, and then compared and evaluated their modeling results against an expert conceptual model and its spatial predictions, as well as the observed distribution of dune activity in 2010. Students perceived the analytical reasoning approach as significantly better for understanding desertification than traditional lecture, and found that it promoted reflective learning, working with data, teamwork, student interaction, innovation, and creative thinking. Student evaluations support the notion that adopting knowledge representation and analytical reasoning in the classroom has the potential to transform undergraduate education by enabling students to formalize and test their conceptual understanding of Earth science. A model for developing and utilizing this geospatial technology approach in Earth science is presented.
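As a rough illustration of the fuzzy cognitive map (FCM) technique the abstract refers to, the sketch below iterates concept activations through signed causal weights until they settle. The concepts and weights are invented for illustration and are not taken from the study.

```python
# Minimal fuzzy cognitive map (FCM) sketch: concepts are nodes with
# activation in (0, 1); signed edge weights encode causal influence.
import math

def squash(x):
    # Logistic squashing keeps activations inside (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def fcm_step(state, weights):
    # One synchronous update: each concept adds its weighted inputs
    # to its own previous activation, then squashes.
    n = len(state)
    return [squash(state[j] + sum(state[i] * weights[i][j] for i in range(n)))
            for j in range(n)]

def fcm_run(state, weights, steps=50, tol=1e-6):
    for _ in range(steps):
        nxt = fcm_step(state, weights)
        if max(abs(a - b) for a, b in zip(nxt, state)) < tol:
            return nxt
        state = nxt
    return state

# Hypothetical concepts: [drought, vegetation cover, dune activity].
W = [
    [0.0, -0.7, 0.4],   # drought suppresses vegetation, drives activity
    [0.0,  0.0, -0.8],  # vegetation stabilises dunes
    [0.0,  0.0,  0.0],
]
final = fcm_run([0.9, 0.5, 0.1], W)
```

In a classroom setting, students would encode their own causal weights and compare the settled activations against expert predictions.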

  5. Heat Conduction Analysis Using Semi Analytical Finite Element Method

    International Nuclear Information System (INIS)

    Wargadipura, A. H. S.

    1997-01-01

    Heat conduction problems are very often found in science and engineering fields. It is of crucial importance to obtain quantitative descriptions of this important physical phenomenon. This paper discusses the development and application of a numerical formulation and computation that can be used to analyze heat conduction problems. The mathematical equation governing the physical behaviour of heat conduction takes the form of a second-order partial differential equation. The numerical solution in this paper is obtained using the finite element method combined with Fourier series, an approach known as the semi-analytical finite element method. The numerical procedure results in a system of simultaneous algebraic equations, which is solved using Gauss elimination. The computer implementation is carried out in FORTRAN. In the final part of the paper, a heat conduction problem in a rectangular plate domain with isothermal boundary conditions on its edges is solved to demonstrate the computer program developed, and a comparison with the analytical solution is discussed to assess the accuracy of the numerical solution obtained
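The solution step described above, a system of simultaneous algebraic equations solved by Gauss elimination, can be sketched on a toy 1D conduction problem; the geometry and boundary temperatures below are illustrative, not from the paper.

```python
# A 1D steady conduction problem discretised into a tridiagonal system
# K·T = f, solved by naive Gauss elimination with partial pivoting
# (the same solution step the paper describes, in miniature).

def gauss_solve(A, b):
    # Forward elimination with partial pivoting, then back substitution.
    n = len(b)
    A = [row[:] for row in A]
    b = b[:]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(A[i][k]))
        A[k], A[p] = A[p], A[k]
        b[k], b[p] = b[p], b[k]
        for i in range(k + 1, n):
            m = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= m * A[k][j]
            b[i] -= m * b[k]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(A[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / A[i][i]
    return x

# Five interior nodes of a rod with ends held at 100 °C and 0 °C:
# -T[i-1] + 2·T[i] - T[i+1] = 0, boundary values moved into f.
n = 5
K = [[2.0 if i == j else -1.0 if abs(i - j) == 1 else 0.0
      for j in range(n)] for i in range(n)]
f = [100.0] + [0.0] * (n - 1)
T = gauss_solve(K, f)
```

The exact solution is a linear temperature profile, which makes the toy case easy to verify.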

  6. Magnetically-driven medical robots: An analytical magnetic model for endoscopic capsules design

    Science.gov (United States)

    Li, Jing; Barjuei, Erfan Shojaei; Ciuti, Gastone; Hao, Yang; Zhang, Peisen; Menciassi, Arianna; Huang, Qiang; Dario, Paolo

    2018-04-01

    Magnetic-based approaches are highly promising for providing innovative solutions in the design of medical devices for diagnostic and therapeutic procedures, such as in endoluminal districts. Due to their intrinsic magnetic properties (no current needed) and high strength-to-size ratio compared with electromagnetic solutions, permanent magnets are usually embedded in medical devices. In this paper, a set of analytical formulas is derived to model the magnetic forces and torques exerted by an arbitrary external magnetic field on a permanent magnetic source embedded in a medical robot. In particular, the authors model cylindrical permanent magnets, the general solution most often embedded in magnetically-driven medical devices. The analytical model can be applied to axially and diametrically magnetized, solid and annular cylindrical permanent magnets without severe calculation complexity. Using a cylindrical permanent magnet as the selected solution, the model is applied to a robotic endoscopic capsule as a pilot study in the design of magnetically-driven robots.
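A point-dipole sketch conveys the kind of force/torque relations such formulas encode: τ = m × B and F = ∇(m · B). The moment and field values below are invented; the paper's full cylindrical-magnet model is considerably more elaborate.

```python
# Point-dipole sketch of the torque and force on a capsule magnet.
# Torque: tau = m × B.  Force: F = grad(m · B), here taken numerically.

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def torque(m, B):
    return cross(m, B)

def force(m, B_field, pos, h=1e-6):
    # F_k = d/dx_k (m · B), central differences around pos.
    F = []
    for k in range(3):
        p_plus = list(pos); p_plus[k] += h
        p_minus = list(pos); p_minus[k] -= h
        F.append((dot(m, B_field(p_plus)) - dot(m, B_field(p_minus))) / (2 * h))
    return F

# Illustrative field with a uniform gradient: B = (0, 0, g·z), g = 2 T/m.
B = lambda p: [0.0, 0.0, 2.0 * p[2]]
m = [0.0, 0.0, 0.5]              # A·m², axially magnetised capsule
tau = torque(m, B([0.0, 0.0, 0.1]))
F = force(m, B, [0.0, 0.0, 0.1])
```

With m parallel to B the torque vanishes, while the field gradient still pulls the magnet along z.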

  7. Big Data for Global History: The Transformative Promise of Digital Humanities

    Directory of Open Access Journals (Sweden)

    Joris van Eijnatten

    2013-12-01

    Full Text Available This article discusses the promises and challenges of digital humanities methodologies for historical inquiry. In order to address the great outstanding question whether big data will re-invigorate macro-history, a number of research projects are described that use cultural text mining to explore big data repositories of digitised newspapers. The advantages of quantitative analysis, visualisation and named entity recognition in both exploration and analysis are illustrated in the study of public debates on drugs, drug trafficking, and drug users in the early twentieth century (WAHSP), the comparative study of discourses about heredity, genetics, and eugenics in Dutch and German newspapers, 1863-1940 (BILAND), and the study of trans-Atlantic discourses (Translantis). While many technological and practical obstacles remain, advantages over traditional hermeneutic methodology are found in heuristics, analytics, quantitative trans-disciplinarity, and reproducibility, offering a quantitative and trans-national perspective on the history of mentalities.

  8. Keeping the Promise

    Science.gov (United States)

    Whissemore, Tabitha

    2016-01-01

    Since its launch in September 2015, Heads Up America has collected information on nearly 125 promise programs across the country, many of which were instituted long before President Barack Obama announced the America's College Promise (ACP) plan in 2015. At least 27 new free community college programs have launched in states, communities, and at…

  9. Analytical Characterisation of Nanoscale Zero-Valent Iron: A ...

    Science.gov (United States)

    Zero-valent iron nanoparticles (nZVI) have been widely tested because they show significant promise for environmental remediation. However, many recent studies have demonstrated that their mobility and reactivity in subsurface environments are significantly affected by their tendency to aggregate. Both the mobility and reactivity of nZVI depend mainly on properties such as particle size, surface chemistry and bulk composition. In order to ensure efficient remediation, it is crucial to accurately assess and understand the implications of these properties before deploying these materials into contaminated environments. Many analytical techniques are now available to determine these parameters, and this paper provides a critical review of their usefulness and limitations for nZVI characterisation. These analytical techniques include microscopy and light scattering techniques for the determination of particle size, size distribution and aggregation state, and X-ray techniques for the characterisation of surface chemistry and bulk composition. Example characterisation data derived from commercial nZVI materials are used to further illustrate method strengths and limitations. Finally, some important challenges with respect to the characterisation of nZVI in groundwater samples are discussed. In recent years, manufactured nanoparticles (MNPs) have attracted increasing interest for their potential applications in the treatment of contaminated soil and water. In compar

  10. Analytical and between-subject variation of thrombin generation measured by calibrated automated thrombography on plasma samples.

    Science.gov (United States)

    Kristensen, Anne F; Kristensen, Søren R; Falkmer, Ursula; Münster, Anna-Marie B; Pedersen, Shona

    2018-05-01

    The Calibrated Automated Thrombography (CAT) is an in vitro thrombin generation (TG) assay that holds promise as a valuable tool within clinical diagnostics. However, the technique has a considerable analytical variation, and we therefore investigated the analytical and between-subject variation of CAT systematically. Moreover, we assessed the use of an internal standard for normalization to diminish variation. 20 healthy volunteers each donated one blood sample, which was subsequently centrifuged, aliquoted and stored at -80 °C prior to analysis. The analytical variation was determined over eight runs, in which plasma from the same seven volunteers was processed in triplicate, and for the between-subject variation, TG analysis was performed on plasma from all 20 volunteers. The trigger reagents used for the TG assays included both PPP reagent containing 5 pM tissue factor (TF) and PPPlow with 1 pM TF. Plasma drawn from a single donor was applied to all plates as an internal standard for each TG analysis and subsequently used for normalization. The total analytical variation for TG analysis performed with PPPlow reagent is 3-14%, and 9-13% for PPP reagent. This variation can be slightly reduced by using an internal standard, but mainly for ETP (endogenous thrombin potential). The between-subject variation is higher when using PPPlow than PPP, and this variation is considerably higher than the analytical variation. TG thus has a rather high inherent analytical variation, which is nevertheless considerably lower than the between-subject variation when using PPPlow as reagent.
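The kind of computation involved, a coefficient of variation (CV) over replicate runs with optional normalization by an internal standard, can be sketched as follows; the numbers are made up, not study data.

```python
# Total analytical CV from replicate runs, with internal-standard (IS)
# normalisation: each run's results are divided by that run's IS value.
import statistics

def cv_percent(values):
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# ETP (nM·min) for one donor, triplicates on three runs (hypothetical):
runs = [[1510.0, 1490.0, 1500.0],
        [1450.0, 1470.0, 1460.0],
        [1550.0, 1530.0, 1540.0]]
internal_standard = [1500.0, 1450.0, 1550.0]   # single-donor plasma per run

all_raw = [v for run in runs for v in run]
raw_cv = cv_percent(all_raw)

# Normalise each run by that run's internal-standard result:
normalised = [v / s for run, s in zip(runs, internal_standard) for v in run]
norm_cv = cv_percent(normalised)
```

When run-to-run drift dominates, dividing by the per-run internal standard shrinks the apparent CV, which is the effect the study examines for ETP.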

  11. Design Evaluation of Wind Turbine Spline Couplings Using an Analytical Model: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; Keller, J.; Wallen, R.; Errichello, R.; Halse, C.; Lambert, S.

    2015-02-01

    Articulated splines are commonly used in the planetary stage of wind turbine gearboxes for transmitting the driving torque and improving load sharing. Direct measurement of spline loads and performance is extremely challenging because of limited accessibility. This paper presents an analytical model for the analysis of articulated spline coupling designs. For a given torque and shaft misalignment, this analytical model quickly yields insights into relationships between the spline design parameters and resulting loads; bending, contact, and shear stresses; and safety factors considering various heat treatment methods. Comparisons of this analytical model against previously published computational approaches are also presented.

  12. Library analytics and metrics using data to drive decisions and services

    CERN Document Server

    2015-01-01

    This book will enable libraries to make informed decisions, develop new services and improve user experience by collecting, analysing and utilising data. With the wealth of data available to library and information services, analytics are the key to understanding your users and your field of operations better and improving the services that you offer. This book sets out the opportunities that analytics present to libraries, and provides inspiration for how they can use the data within their systems to help inform decisions and drive services. Using case studies to provide real-life examples of current developments and services, and packed full of practical advice and guidance for libraries looking to realise the value of their data, this will be an essential guide for librarians and information professionals. This volume will bring together a group of internationally recognised experts to explore some of the key issues in the exploitation of data analytics and metrics in the library and cultural heritage sect...

  13. Leveraging data rich environments using marketing analytics

    OpenAIRE

    Holtrop, Niels

    2017-01-01

    With the onset of what is popularly known as “big data”, increased attention is being paid to creating value from these data rich environments. Within the field of marketing, the analysis of customer and market data supported by models is known as marketing analytics. The goal of these analyses is to enhance managerial decision making regarding marketing problems. However, before these data rich environments can be used to guide managerial decision making, firms need to grasp the process of d...

  14. Analytic geometry

    CERN Document Server

    Burdette, A C

    1971-01-01

    Analytic Geometry covers several fundamental aspects of analytic geometry needed for advanced subjects, including calculus. This book is composed of 12 chapters that review the principles, concepts, and analytic proofs of geometric theorems, families of lines, the normal equation of the line, and related matters. Other chapters highlight the application of graphing, foci, directrices, eccentricity, and conic-related topics. The remaining chapters deal with the concepts of polar and rectangular coordinates, surfaces and curves, and planes. This book will prove useful to undergraduate trigonometric st

  15. Assessing Adult Learning Preferences Using the Analytic Hierarchy Process.

    Science.gov (United States)

    Lee, Doris; McCool, John; Napieralski, Laura

    2000-01-01

    Graduate students (n=134) used the analytic hierarchy process, which weights expressed preferences, to rate four learning activities: lectures, discussion/reflection, individual projects, and group projects. Their preferences for discussion/reflection and individual projects were independent of auditory, visual, and kinesthetic learning styles.…
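A minimal sketch of the analytic hierarchy process (AHP) used in the study: a pairwise comparison matrix of expressed preferences is reduced to priority weights, here via the power method. The judgments in the matrix are invented for illustration.

```python
# AHP sketch: extract the principal eigenvector of a positive pairwise
# comparison matrix by repeated multiplication and renormalisation.

def ahp_weights(M, iters=100):
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Activities: lecture, discussion/reflection, individual project, group
# project.  M[i][j] = strength of preference for i over j on Saaty's
# 1-9 scale (reciprocal below the diagonal); values are hypothetical.
M = [
    [1.0, 1/3, 1/2, 2.0],
    [3.0, 1.0, 2.0, 4.0],
    [2.0, 1/2, 1.0, 3.0],
    [1/2, 1/4, 1/3, 1.0],
]
w = ahp_weights(M)
```

In this made-up matrix, discussion/reflection dominates the comparisons, so it receives the largest priority weight.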

  16. FORECASTING PILE SETTLEMENT ON CLAYSTONE USING NUMERICAL AND ANALYTICAL METHODS

    Directory of Open Access Journals (Sweden)

    Ponomarev Andrey Budimirovich

    2016-06-01

    Full Text Available In this article the problem of designing pile foundations on claystones is reviewed. The purpose of this paper is a comparative analysis of analytical and numerical methods for forecasting the settlement of piles on claystones. The following tasks were solved during the study: (1) the existing research on pile settlement was analyzed; (2) the characteristics of the experimental studies and the parameters for numerical modeling are presented, and methods of field research on the operation of single piles are described; (3) the settlement of a single pile is calculated using numerical methods in the software package Plaxis 2D and an analytical method according to the requirements of SP 24.13330.2011; (4) the experimental data are compared with the results of the analytical and numerical calculations; (5) based on these results, recommendations for forecasting pile settlement on claystone are presented. Much attention is paid to the calculation of pile settlement considering the impacted areas in the ground space beside the pile, and to the comparison with the results of field experiments. Based on the obtained results, for predicting the settlement of a single pile on claystone the authors recommend using the analytical method of SP 24.13330.2011 with account for the impacted areas in the ground space beside the driven pile. When forecasting the settlement of a single pile on claystone by numerical methods in Plaxis 2D, the authors recommend using the Hardening Soil model, again considering the impacted areas beside the driven pile. The analyses of the results and calculations are presented for examination and verification; it is therefore necessary to continue the research on deep foundations at other experimental sites to improve the reliability of pile foundation settlement calculations. The work is of great interest for geotechnical engineers engaged in research, design and construction of pile foundations.

  17. Analytic confidence level calculations using the likelihood ratio and fourier transform

    International Nuclear Information System (INIS)

    Hu Hongbo; Nielsen, J.

    2000-01-01

    The interpretation of new particle search results involves a confidence level calculation on either the discovery hypothesis or the background-only ('null') hypothesis. A typical approach uses toy Monte Carlo experiments to build an expected experiment estimator distribution against which an observed experiment's estimator may be compared. In this note, a new approach is presented which calculates analytically the experiment estimator distribution via a Fourier transform, using the likelihood ratio as an ordering estimator. The analytic approach enjoys an enormous speed advantage over the toy Monte Carlo method, making it possible to quickly and precisely calculate confidence level results
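A simplified sketch of the idea, under the assumption of independent Poisson counting channels with a weighted-count test statistic: each channel's statistic distribution is built on a grid, and the channels are combined by FFT convolution instead of toy Monte Carlo. The channel rates and weights below are illustrative, not from the note.

```python
# Background-only distribution of a weighted-count statistic, built by
# FFT convolution of per-channel Poisson distributions on a grid.
import math
import numpy as np

GRID = 1 << 13          # statistic grid; must exceed any reachable sum
SCALE = 100             # grid cells per unit of test statistic

def channel_pmf(b, w, n_max=50):
    # Distribution of w * N on the grid, for N ~ Poisson(b).
    p = np.zeros(GRID)
    for n in range(n_max + 1):
        prob = math.exp(-b + n * math.log(b) - math.lgamma(n + 1))
        p[min(GRID - 1, round(n * w * SCALE))] += prob
    return p

# Two channels: (expected background b, per-event weight w), where the
# weight plays the role of a likelihood-ratio ordering weight.
channels = [(3.0, math.log(1 + 2.0 / 3.0)),
            (1.5, math.log(1 + 1.0 / 1.5))]

spectrum = np.ones(GRID, dtype=complex)
for b, w in channels:
    spectrum *= np.fft.fft(channel_pmf(b, w))
pdf = np.fft.ifft(spectrum).real

# CL_b for an observed statistic: P(X <= x_obs | background only).
x_obs = 2.0
cl_b = pdf[: round(x_obs * SCALE) + 1].sum()
```

The convolution replaces thousands of toy experiments with a single FFT product, which is the speed advantage the note reports.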

  18. Towards actionable learning analytics using dispositions

    NARCIS (Netherlands)

    Tempelaar, Dirk; Rienties, Bart; Nguyen, Quan

    2017-01-01

    Studies in the field of learning analytics (LA) have shown students’ demographics and learning management system (LMS) data to be effective identifiers of “at risk” performance. However, insights generated by these predictive models may not be suitable for pedagogically informed interventions due to

  19. Big data analytics : predicting traffic flow regimes from simulated connected vehicle messages using data analytics and machine learning.

    Science.gov (United States)

    2016-12-25

    The key objectives of this study were to: 1. Develop advanced analytical techniques that make use of a dynamically configurable connected vehicle message protocol to predict traffic flow regimes in near-real time in a virtual environment and examine ...

  20. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    Science.gov (United States)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission-critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provides insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor is simulated, and we use big data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site, and builds on a big data architecture that has previously proven valuable in detecting anomalies. This paper outlines our methodology for implementing an operational analytics solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determining practical analytic, visualization, and predictive technologies.
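A toy version of the alerting idea, flagging readings whose rolling z-score passes a critical threshold, can be sketched as follows; the data stream and threshold are simulated, not taken from the study.

```python
# Rolling z-score anomaly detector: each new reading is compared
# against the mean/stdev of the preceding window.
import statistics

def rolling_anomalies(readings, window=10, z_crit=3.0):
    alerts = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu = statistics.mean(baseline)
        sigma = statistics.stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_crit:
            alerts.append(i)
    return alerts

# Stable sensor oscillating around 20.0 with one injected degradation:
stream = [20.0 + 0.1 * ((-1) ** i) for i in range(30)]
stream[25] = 27.5
alerts = rolling_anomalies(stream)
```

A production system would run detectors like this over streaming data and feed the alerts into the visualization layer described above.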

  1. Understanding the promises and premises of online health platforms

    Directory of Open Access Journals (Sweden)

    José Van Dijck

    2016-06-01

    Full Text Available This article investigates the claims and complexities involved in the platform-based economics of health and fitness apps. We examine a double-edged logic inscribed in these platforms, which promise to offer personal solutions to medical problems while also contributing to the public good. On the one hand, online platforms serve as personalized data-driven services for their customers. On the other hand, they allegedly serve public interests, such as medical research or health education. In doing so, many apps employ a diffuse discourse, hinging on terms like “sharing,” “open,” and “reuse” when they talk about data extraction and distribution. The analytical approach we adopt in this article is situated at the nexus of science and technology studies, political economy, and the sociology of health and illness. The analysis concentrates on two aspects: datafication (the use and reuse of data) and commodification (a platform’s deployment of governance and business models). We apply these analytical categories to three specific platforms: 23andMe, PatientsLikeMe, and Parkinson mPower. The last section connects these individual examples to the wider implications of health apps’ data flows, governance policies, and business models. Regulatory bodies commonly focus on the (medical) safety and security of apps, but pay scarce attention to health apps’ techno-economic governance. Who owns user-generated health data and who gets to benefit? We argue that it is important to reflect on the societal implications of health data markets. Governments have the duty to provide conceptual clarity in the grand narrative of transforming health care and health research.

  2. Spin-Stabilized Spacecrafts: Analytical Attitude Propagation Using Magnetic Torques

    Directory of Open Access Journals (Sweden)

    Roberta Veloso Garcia

    2009-01-01

    Full Text Available An analytical approach to spin-stabilized satellite attitude propagation is presented, considering the influence of the residual magnetic torque and the eddy currents torque. Two approaches are considered to examine the influence of external torques acting during the motion of the satellite, with the Earth's magnetic field described by the quadrupole model. The first approach includes only the residual magnetic torque in the motion equations, with the satellite in circular or elliptical orbit. The second approach analyzes only the eddy currents torque, with the satellite in circular orbit. The inclusion of these torques in the dynamic equations of spin-stabilized satellites yields the conditions to derive an analytical solution. The solutions show that the residual torque does not affect the spin velocity magnitude, contributing only to the precession and drift of the spacecraft's spin axis, and that the eddy currents torque causes an exponential decay of the angular velocity magnitude. Numerical simulations performed with data from the Brazilian satellites SCD1 and SCD2 show the period over which the analytical solution can be used for attitude propagation, within the dispersion range of the attitude determination system performance of the Satellite Control Center of the Brazilian National Research Institute.
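The exponential decay attributed to the eddy currents torque corresponds to a first-order model of the following form (a schematic sketch, not the paper's full quadrupole-field solution):

```latex
% Eddy currents exert a braking torque proportional to the spin rate,
I\,\frac{d\omega}{dt} = -k\,\omega
\quad\Longrightarrow\quad
\omega(t) = \omega_0\, e^{-(k/I)\,t},
% where I is the spin-axis moment of inertia and k an eddy-current
% coupling coefficient; the residual magnetic torque, by contrast,
% leaves |\omega| unchanged and only precesses and drifts the spin axis.
```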

  3. Seamless Digital Environment – Plan for Data Analytics Use Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna Helene [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bly, Aaron Douglas [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    The U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program initiated research into what is needed to provide a roadmap or model for nuclear power plants to reference when building an architecture that can support the growing data supply and demand flowing through their networks. The Digital Architecture project report Digital Architecture Planning Model (Oxstrand et al., 2016) discusses considerations when building an architecture to support the increasing needs and demands for data throughout the plant. Once the plant is able to support the data demands, it still needs to be able to provide the data in an easy, quick and reliable manner. A common method is to create a “one stop shop” application that a user can go to for all the data they need. This leads to the need for a Seamless Digital Environment (SDE) to integrate all the “siloed” data. An SDE is the desired perception that should be presented to users by gathering the data from any data source (e.g., legacy applications and work management systems) without effort by the user. The goal for FY16 was to complete a feasibility study for data mining and analytics, employing information from computer-based procedure-enabled technologies for use in developing improved business analytics. The research team collaborated with multiple organizations to identify use cases or scenarios that could be beneficial to investigate in a feasibility study. Many interesting potential use cases were identified throughout the FY16 activity. Unfortunately, due to factors outside the research team’s control, none of the studies were initiated this year. However, the insights gained and the relationships built with both PVNGS and NextAxiom will be valuable when moving forward with future research. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting it was identified that it would be very beneficial to the industry to

  4. Analytical method for the identification and assay of 12 phthalates in cosmetic products: application of the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques".

    Science.gov (United States)

    Gimeno, Pascal; Maggio, Annie-Françoise; Bousquet, Claudine; Quoirez, Audrey; Civade, Corinne; Bonnet, Pierre-Antoine

    2012-08-31

    Esters of phthalic acid, more commonly named phthalates, may be present in cosmetic products as ingredients or contaminants. Their presence as contaminants can be due to the manufacturing process, to the raw materials used, or to the migration of phthalates from packaging when plastic (polyvinyl chloride, PVC) is used. Eight phthalates (DBP, DEHP, BBP, DMEP, DnPP, DiPP, DPP, and DiBP), classified H360 or H361, are forbidden in cosmetics according to the European regulation on cosmetics 1223/2009. A GC/MS method was developed for the assay of 12 phthalates in cosmetics, including the 8 regulated phthalates. Analyses are carried out on a GC/MS system in electron impact ionization mode (EI). The separation of phthalates is obtained on a cross-linked 5%-phenyl/95%-dimethylpolysiloxane capillary column, 30 m × 0.25 mm (i.d.) × 0.25 μm film thickness, using a temperature gradient. Phthalate quantification is performed by external calibration using an internal standard. Validation elements obtained on standard solutions highlight a satisfactory system conformity (resolution > 1.5), a common quantification limit of 0.25 ng injected, an acceptable linearity between 0.5 μg mL⁻¹ and 5.0 μg mL⁻¹, as well as a precision and an accuracy in agreement with in-house specifications. Cosmetic samples ready for analytical injection are analyzed after a dilution in ethanol, whereas more complex cosmetic matrices, like milks and creams, are assayed after a liquid/liquid extraction using tert-butyl methyl ether (TBME). Depending on the type of cosmetics analyzed, the common limits of quantification for the 12 phthalates were set at 0.5 or 2.5 μg g⁻¹. All samples were assayed using the analytical approach described in the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques". This analytical protocol is particularly well adapted when it is not possible to make reconstituted sample matrices. Copyright © 2012
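The quantification scheme described, external calibration on the analyte/internal-standard area ratio, amounts to the following sketch; all areas, concentrations, and the dilution factor are invented for illustration.

```python
# External calibration with an internal standard (IS): fit the line of
# (analyte area / IS area) vs. standard concentration, then invert it
# for the sample and correct for dilution.

def fit_line(xs, ys):
    # Ordinary least-squares slope and intercept.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Calibration standards: concentration (ug/mL) vs. area ratio analyte/IS.
conc = [0.5, 1.0, 2.0, 5.0]
ratio = [0.24, 0.51, 1.01, 2.49]
slope, intercept = fit_line(conc, ratio)

# Hypothetical sample measured at area ratio 1.30 after a 10x dilution
# in ethanol; back-calculate the original concentration.
sample_conc = (1.30 - intercept) / slope * 10
```

Using the area ratio rather than the raw analyte area compensates for injection-to-injection variability, which is the point of the internal standard.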

  5. Towards Actionable Learning Analytics Using Dispositions

    Science.gov (United States)

    Tempelaar, Dirk T.; Rienties, Bart; Nguyen, Quan

    2017-01-01

    Studies in the field of learning analytics (LA) have shown students' demographics and learning management system (LMS) data to be effective identifiers of "at risk" performance. However, insights generated by these predictive models may not be suitable for pedagogically informed interventions due to the inability to explain why students…

  6. The Analytic Information Warehouse (AIW): a platform for analytics using electronic health record data.

    Science.gov (United States)

    Post, Andrew R; Kurc, Tahsin; Cholleti, Sharath; Gao, Jingjing; Lin, Xia; Bornstein, William; Cantrell, Dedra; Levine, David; Hohmann, Sam; Saltz, Joel H

    2013-06-01

    To create an analytics platform for specifying and detecting clinical phenotypes and other derived variables in electronic health record (EHR) data for quality improvement investigations. We have developed an architecture for an Analytic Information Warehouse (AIW). It supports transforming data represented in different physical schemas into a common data model, specifying derived variables in terms of the common model to enable their reuse, computing derived variables while enforcing invariants and ensuring correctness and consistency of data transformations, long-term curation of derived data, and export of derived data into standard analysis tools. It includes software that implements these features and a computing environment that enables secure high-performance access to and processing of large datasets extracted from EHRs. We have implemented and deployed the architecture in production locally. The software is available as open source. We have used it as part of hospital operations in a project to reduce rates of hospital readmission within 30 days. The project examined the association of over 100 derived variables representing disease and co-morbidity phenotypes with readmissions in 5 years of data from our institution's clinical data warehouse and the UHC Clinical Database (CDB). The CDB contains administrative data from over 200 hospitals that are in academic medical centers or affiliated with such centers. A widely available platform for managing and detecting phenotypes in EHR data could accelerate the use of such data in quality improvement and comparative effectiveness studies. Copyright © 2013 Elsevier Inc. All rights reserved.
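An example of the kind of derived variable such a platform computes, a 30-day readmission flag built from encounter records, might look like the sketch below; the records and field layout are hypothetical, not the AIW's actual data model.

```python
# Derived-variable sketch: flag patients readmitted within 30 days of
# a prior discharge, from (patient_id, admit_date, discharge_date) rows.
from datetime import date

encounters = [
    ("p1", date(2012, 1, 3), date(2012, 1, 7)),
    ("p1", date(2012, 1, 20), date(2012, 1, 22)),   # 13 days after discharge
    ("p2", date(2012, 3, 1), date(2012, 3, 4)),
    ("p2", date(2012, 5, 1), date(2012, 5, 2)),     # 58 days after discharge
]

def readmissions_within(encounters, days=30):
    # Group stays per patient in admit-date order, then scan pairs.
    by_patient = {}
    for pid, admit, discharge in sorted(encounters, key=lambda e: e[1]):
        by_patient.setdefault(pid, []).append((admit, discharge))
    flagged = set()
    for pid, stays in by_patient.items():
        for (_, d1), (a2, _) in zip(stays, stays[1:]):
            if (a2 - d1).days <= days:
                flagged.add(pid)
    return flagged

flagged = readmissions_within(encounters)
```

In the AIW, a definition like this would be expressed once against the common data model and reused across source schemas.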

  7. An analytically resolved model of a potato's thermal processing using Heun functions

    Science.gov (United States)

    Vargas Toro, Agustín.

    2014-05-01

A potato's thermal processing model is solved analytically. The model is formulated using the equation of heat diffusion in the case of a spherical potato processed in a furnace, and assuming that the potato's thermal conductivity is radially modulated. The model is solved using the method of the Laplace transform, applying the Bromwich integral and the residue theorem. The temperature profile in the potato is presented as an infinite series of Heun functions. All computations are performed with computer algebra software, specifically Maple. Using the numerical values of the thermal parameters of the potato and the geometric and thermal parameters of the processing furnace, the time evolution of the temperature in different regions inside the potato is presented analytically and graphically. The duration of thermal processing required to achieve a specified effect on the potato is computed. It is expected that the obtained analytical results will be important in food engineering and cooking engineering.
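The constant-conductivity limit of this problem has a classic closed-form series that is easy to evaluate numerically. A minimal Python sketch, assuming constant thermal diffusivity (the radially modulated case treated in the paper requires Heun functions instead); the radius, diffusivity, and temperatures are illustrative values, not the paper's:

```python
import math

def center_temperature(t, T0, Ts, alpha, R, n_terms=50):
    """Center temperature of a sphere with uniform initial temperature T0
    and surface held at Ts, from the classic separation-of-variables series
    T(0, t) = Ts + (T0 - Ts) * 2 * sum_n (-1)**(n+1) * exp(-(n*pi)**2 * Fo),
    with Fourier number Fo = alpha * t / R**2 (valid for t > 0)."""
    Fo = alpha * t / R ** 2
    s = sum((-1) ** (n + 1) * math.exp(-(n * math.pi) ** 2 * Fo)
            for n in range(1, n_terms + 1))
    return Ts + (T0 - Ts) * 2.0 * s

# Illustrative "potato": R = 4 cm, alpha ~ 1.4e-7 m^2/s, oven at 180 C.
for t in (1000, 2000, 4000):
    print(round(center_temperature(t, 20.0, 180.0, 1.4e-7, 0.04), 1))
```

The series converges rapidly for any appreciable Fourier number, which is why a few dozen terms suffice to track the center temperature approaching the furnace temperature.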

  8. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    Science.gov (United States)

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD)-based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and process analytical technology (PAT).

  9. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    Science.gov (United States)

    Płotka-Wasylka, J

    2018-05-01

A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact of each step of an analytical methodology, colored from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.
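The traffic-light idea behind such step-by-step greenness assessment can be illustrated with a toy tally. A minimal Python sketch; the step names, colours, and numeric weights below are our own illustrative choices, not the published GAPI rubric:

```python
# Toy GAPI-style traffic-light tally (illustrative weights, not the
# published rubric): lower total score means a greener procedure.
IMPACT = {"green": 0, "yellow": 1, "red": 2}

def assess(procedure):
    """Return (total impact score, per-colour counts) for a mapping of
    analytical steps to traffic-light colours."""
    counts = {colour: 0 for colour in IMPACT}
    for colour in procedure.values():
        counts[colour] += 1
    score = sum(IMPACT[c] * n for c, n in counts.items())
    return score, counts

score, counts = assess({
    "sample collection": "yellow",
    "preservation": "green",
    "transport": "green",
    "sample preparation": "red",
    "final determination": "yellow",
})
print(score, counts)  # 4 {'green': 2, 'yellow': 2, 'red': 1}
```

Two procedures for the same analyte can then be ranked by comparing their scores, mirroring how GAPI pictograms let a reader compare methods at a glance.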

  10. Changes in Visual/Spatial and Analytic Strategy Use in Organic Chemistry with the Development of Expertise

    Science.gov (United States)

    Vlacholia, Maria; Vosniadou, Stella; Roussos, Petros; Salta, Katerina; Kazi, Smaragda; Sigalas, Michael; Tzougraki, Chryssa

    2017-01-01

    We present two studies that investigated the adoption of visual/spatial and analytic strategies by individuals at different levels of expertise in the area of organic chemistry, using the Visual Analytic Chemistry Task (VACT). The VACT allows the direct detection of analytic strategy use without drawing inferences about underlying mental…

  11. Using the Analytic Hierarchy Process to Analyze Multiattribute Decisions.

    Science.gov (United States)

    Spires, Eric E.

    1991-01-01

    The use of the Analytic Hierarchy Process (AHP) in assisting researchers to analyze decisions is discussed. The AHP is compared with other decision-analysis techniques, including multiattribute utility measurement, conjoint analysis, and general linear models. Insights that AHP can provide are illustrated with data gathered in an auditing context.…
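The core AHP computation, deriving priority weights from the principal eigenvector of a reciprocal pairwise-comparison matrix and checking Saaty's consistency ratio, can be sketched briefly. The 3×3 matrix below is an illustrative example, not data from the study:

```python
import numpy as np

# Illustrative pairwise comparison matrix on Saaty's 1-9 scale:
# A[i, j] is the importance of criterion i relative to criterion j,
# so the matrix is reciprocal (A[j, i] = 1 / A[i, j]).
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 3.0],
              [1 / 5, 1 / 3, 1.0]])

# Priority weights = normalized principal eigenvector.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()          # priority vector, sums to 1

# Consistency index and ratio (random index RI = 0.58 for n = 3);
# CR < 0.1 is the usual acceptability threshold.
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
CR = CI / 0.58
```

For this matrix the first criterion dominates, and the consistency ratio falls well under 0.1, so the judgments would be accepted as consistent.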

  12. Optimizing an Immersion ESL Curriculum Using Analytic Hierarchy Process

    Science.gov (United States)

    Tang, Hui-Wen Vivian

    2011-01-01

    The main purpose of this study is to fill a substantial knowledge gap regarding reaching a uniform group decision in English curriculum design and planning. A comprehensive content-based course criterion model extracted from existing literature and expert opinions was developed. Analytical hierarchy process (AHP) was used to identify the relative…

  13. Preliminary analytical study on the feasibility of using reinforced concrete pile foundations for renewable energy storage by compressed air energy storage technology

    Science.gov (United States)

    Tulebekova, S.; Saliyev, D.; Zhang, D.; Kim, J. R.; Karabay, A.; Turlybek, A.; Kazybayeva, L.

    2017-11-01

Compressed air energy storage technology is one of the promising methods that have high reliability, economic feasibility and low environmental impact. Current applications of the technology are mainly limited to energy storage for power plants using large-scale underground caverns. This paper explores the possibility of making use of reinforced concrete pile foundations to store renewable energy generated from solar panels or windmills attached to building structures. The energy will be stored inside the pile foundation with hollow sections via compressed air. Given the relatively small volume of storage provided by the foundation, the required storage pressure is expected to be higher than that in the large-scale underground cavern. The high air pressure, typically associated with a large temperature increase, combined with structural loads will put the pile foundation in a complicated loading condition, which might raise structural and geotechnical safety issues. This paper presents a preliminary analytical study on the performance of the pile foundation subjected to high pressure, a large temperature increase and structural loads. Finite element analyses on pile foundation models, which are built from selected prototype structures, have been conducted. The analytical study identifies maximum stresses in the concrete of the pile foundation under combined pressure, temperature change and structural loads. Recommendations have been made for the use of reinforced concrete pile foundations for renewable energy storage.

  14. Use of scientometrics to assess nuclear and other analytical methods

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1986-01-01

Scientometrics involves the use of quantitative methods to investigate science viewed as an information process. Scientometric studies can be useful in ascertaining which methods have been most employed for various analytical determinations, as well as for predicting which methods will continue to be used in the immediate future and which appear to be losing favor with the analytical community. Published papers in the technical literature are the primary source materials for scientometric studies; statistical methods and computer techniques are the tools. Recent studies have included growth and trends in prompt nuclear analysis; the impact of research published in a technical journal; and the institutional and national representation of speakers and topics at several IAEA conferences, at modern trends in activation analysis conferences, and at other non-nuclear-oriented conferences. Attempts have also been made to predict the future growth of various topics and techniques. 13 refs., 4 figs., 17 tabs

  15. Use of scientometrics to assess nuclear and other analytical methods

    Energy Technology Data Exchange (ETDEWEB)

    Lyon, W.S.

    1986-01-01

Scientometrics involves the use of quantitative methods to investigate science viewed as an information process. Scientometric studies can be useful in ascertaining which methods have been most employed for various analytical determinations, as well as for predicting which methods will continue to be used in the immediate future and which appear to be losing favor with the analytical community. Published papers in the technical literature are the primary source materials for scientometric studies; statistical methods and computer techniques are the tools. Recent studies have included growth and trends in prompt nuclear analysis; the impact of research published in a technical journal; and the institutional and national representation of speakers and topics at several IAEA conferences, at modern trends in activation analysis conferences, and at other non-nuclear-oriented conferences. Attempts have also been made to predict the future growth of various topics and techniques. 13 refs., 4 figs., 17 tabs.

  16. Analytical solution using computer algebra of a biosensor for detecting toxic substances in water

    Science.gov (United States)

Rúa Taborda, María Isabel

    2014-05-01

In a relatively recent paper, an electrochemical biosensor for water toxicity detection based on a bio-chip as a whole cell was proposed, numerically solved and analyzed. In that paper, the kinetic processes in a miniaturized electrochemical biosensor system were described using the equations for a specific enzymatic reaction and the diffusion equation. The numerical solution showed excellent agreement with the measured data, but such a numerical solution is not enough to efficiently design the corresponding bio-chip. For this reason, an analytical solution is needed. The objective of the present work is to provide such an analytical solution and to give algebraic guides for designing the bio-sensor. The analytical solution is obtained using computer algebra software, specifically Maple. The method of solution is the Laplace transform, with the Bromwich integral and the residue theorem. The final solution is given as a series of Bessel functions, and the effective time for the bio-sensor is computed. It is claimed that the analytical solutions obtained will be very useful for predicting further current variations in similar systems with different geometries, materials and biological components. Besides this, the analytical solution provided is very useful for investigating the relationship between different chamber parameters, such as cell radius and height, and electrode radius.

  17. Directed transport by surface chemical potential gradients for enhancing analyte collection in nanoscale sensors.

    Science.gov (United States)

    Sitt, Amit; Hess, Henry

    2015-05-13

    Nanoscale detectors hold great promise for single molecule detection and the analysis of small volumes of dilute samples. However, the probability of an analyte reaching the nanosensor in a dilute solution is extremely low due to the sensor's small size. Here, we examine the use of a chemical potential gradient along a surface to accelerate analyte capture by nanoscale sensors. Utilizing a simple model for transport induced by surface binding energy gradients, we study the effect of the gradient on the efficiency of collecting nanoparticles and single and double stranded DNA. The results indicate that chemical potential gradients along a surface can lead to an acceleration of analyte capture by several orders of magnitude compared to direct collection from the solution. The improvement in collection is limited to a relatively narrow window of gradient slopes, and its extent strongly depends on the size of the gradient patch. Our model allows the optimization of gradient layouts and sheds light on the fundamental characteristics of chemical potential gradient induced transport.
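The acceleration mechanism can be illustrated with a toy one-dimensional walker: a surface binding-energy gradient biases each step toward the sensor, and the mean capture time drops sharply relative to unbiased surface diffusion. A minimal Monte Carlo sketch (our own simplification; the paper's transport model is continuous), with illustrative lattice size and bias:

```python
import random

def mean_capture_steps(p_toward, L=20, trials=2000, seed=1):
    """Mean number of steps for a walker started at distance L from the
    sensor (absorbing site 0, reflecting edge at L) to be captured, when
    each step moves toward the sensor with probability p_toward.
    p = 0.5 is unbiased surface diffusion; p > 0.5 mimics the drift
    induced by a binding-energy gradient."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        x, steps = L, 0
        while x > 0:
            x += -1 if rng.random() < p_toward else 1
            x = min(x, L)          # reflect at the outer edge
            steps += 1
        total += steps
    return total / trials

diffusive = mean_capture_steps(0.5)   # no gradient
directed = mean_capture_steps(0.8)    # gradient-driven drift
print(diffusive, directed)
```

Unbiased capture time scales as L², while drifted capture scales as L, which is the qualitative origin of the orders-of-magnitude acceleration the model predicts.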

  18. On-chip bio-analyte detection utilizing the velocity of magnetic microparticles in a fluid

    KAUST Repository

    Giouroudi, Ioanna

    2011-03-22

    A biosensing principle utilizing the motion of suspended magnetic microparticles in a microfluidic system is presented. The system utilizes the innovative concept of the velocity dependence of magnetic microparticles (MPs) due to their volumetric change when analyte is attached to their surface via antibody–antigen binding. When the magnetic microparticles are attracted by a magnetic field within a microfluidic channel their velocity depends on the presence of analyte. Specifically, their velocity decreases drastically when the magnetic microparticles are covered by (nonmagnetic) analyte (LMPs) due to the increased drag force in the opposite direction to that of the magnetic force. Experiments were carried out as a proof of concept. A promising 52% decrease in the velocity of the LMPs in comparison to that of the MPs was measured when both of them were accelerated inside a microfluidic channel using an external permanent magnet. The presented biosensing methodology offers a compact and integrated solution for a new kind of on-chip analysis with potentially high sensitivity and shorter acquisition time than conventional laboratory based systems.
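The reported velocity drop can be rationalized with a low-Reynolds-number force balance. A minimal sketch, assuming the magnetic force is unchanged by analyte binding and drag follows Stokes' law; the force and radii below are illustrative values, not the paper's:

```python
import math

def terminal_velocity(F_m, eta, r):
    """Terminal velocity where the magnetic force balances Stokes drag:
    F_m = 6 * pi * eta * r * v."""
    return F_m / (6 * math.pi * eta * r)

F_m = 2e-12        # magnetic force on the particle, N (illustrative)
eta = 1e-3         # viscosity of water, Pa*s
r_bare = 1.0e-6    # bare magnetic particle radius, m (illustrative)
r_loaded = 2.1e-6  # hydrodynamic radius after analyte binding, m

v_bare = terminal_velocity(F_m, eta, r_bare)
v_loaded = terminal_velocity(F_m, eta, r_loaded)
slowdown = 1 - v_loaded / v_bare   # fractional velocity decrease
```

In this picture velocity scales as 1/r, so roughly doubling the hydrodynamic radius yields a slowdown of about 50%, the same order as the 52% decrease measured in the proof-of-concept experiment.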

  19. Analytical Propagation of Uncertainty in Life Cycle Assessment Using Matrix Formulation

    DEFF Research Database (Denmark)

    Imbeault-Tétreault, Hugues; Jolliet, Olivier; Deschênes, Louise

    2013-01-01

Uncertainty assessment is not a regular step in LCA. An analytical approach based on Taylor series expansion constitutes an effective means to overcome the drawbacks of the Monte Carlo method. This project aimed to test the approach on a real case study, and the resulting analytical uncertainty was compared with Monte Carlo results. The sensitivity and contribution of input parameters to output uncertainty were also analytically calculated. This article outlines an uncertainty analysis of the comparison between two case study scenarios. We conclude that the analytical method provides a good approximation. This article shows the importance of the analytical method in uncertainty calculation, which could lead to a more complete uncertainty analysis in LCA practice.
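The Taylor-series idea can be sketched for a single product term y = a·b (a toy example, not the article's matrix formulation): the first-order variance is (∂y/∂a)²σ_a² + (∂y/∂b)²σ_b² with the derivatives evaluated at the means, and it closely matches a Monte Carlo estimate:

```python
import math
import random

# Toy first-order (Taylor) uncertainty propagation for y = a * b with
# independent normal inputs (illustrative numbers, not the case study).
mu_a, sd_a = 10.0, 1.0
mu_b, sd_b = 5.0, 0.5

# Analytical: var(y) ~ (dy/da)^2 var(a) + (dy/db)^2 var(b),
# with dy/da = b and dy/db = a taken at the means.
sd_taylor = math.sqrt((mu_b * sd_a) ** 2 + (mu_a * sd_b) ** 2)

# Monte Carlo reference for comparison.
rng = random.Random(0)
ys = [rng.gauss(mu_a, sd_a) * rng.gauss(mu_b, sd_b) for _ in range(200_000)]
mean_mc = sum(ys) / len(ys)
sd_mc = math.sqrt(sum((y - mean_mc) ** 2 for y in ys) / (len(ys) - 1))
```

The analytical estimate is essentially free once the derivatives are known, which is the practical appeal over repeated Monte Carlo sampling of a full LCA matrix model.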

  20. Green analytical chemistry - the use of surfactants as a replacement of organic solvents in spectroscopy

    Science.gov (United States)

    Pharr, Daniel Y.

    2017-07-01

This chapter gives an introduction to the many practical uses of surfactants in analytical chemistry as replacements for organic solvents in the pursuit of greener chemistry. Taking a holistic approach, it covers some background on surfactants as chemical solvents, their properties as green chemicals, and their environmental effects. The achievements of green analytical chemistry with micellar systems are reviewed in all the major areas of analytical chemistry where these reagents have been found to be useful.

  1. Analytical modeling of glucose biosensors based on carbon nanotubes.

    Science.gov (United States)

    Pourasl, Ali H; Ahmadi, Mohammad Taghi; Rahmani, Meisam; Chin, Huei Chaeng; Lim, Cheng Siong; Ismail, Razali; Tan, Michael Loong Peng

    2014-01-15

In recent years, carbon nanotubes have received widespread attention as promising carbon-based nanoelectronic devices. Due to their exceptional physical, chemical, and electrical properties, namely a high surface-to-volume ratio, their enhanced electron transfer properties, and their high thermal conductivity, carbon nanotubes can be used effectively as electrochemical sensors. The integration of carbon nanotubes with a functional group provides a good and solid support for the immobilization of enzymes. The determination of glucose levels using biosensors, particularly in the medical diagnostics and food industries, is gaining mass appeal. Glucose biosensors detect the glucose molecule by catalyzing glucose to gluconic acid and hydrogen peroxide in the presence of oxygen. This action provides high accuracy and a quick detection rate. In this paper, a single-wall carbon nanotube field-effect transistor biosensor for glucose detection is analytically modeled. In the proposed model, the glucose concentration is presented as a function of gate voltage. Subsequently, the proposed model is compared with existing experimental data. Good agreement between the model and the experimental data is reported. The simulated data demonstrate that the analytical model can be employed with an electrochemical glucose sensor to predict the behavior of the sensing mechanism in biosensors.

  2. Environmental vulnerability assessment using Grey Analytic Hierarchy Process based model

    International Nuclear Information System (INIS)

    Sahoo, Satiprasad; Dhar, Anirban; Kar, Amlanjyoti

    2016-01-01

Environmental management of an area describes a policy for its systematic and sustainable environmental protection. In the present study, regional environmental vulnerability assessment in the Hirakud command area of Odisha, India is envisaged based on the Grey Analytic Hierarchy Process method (Grey–AHP) using integrated remote sensing (RS) and geographic information system (GIS) techniques. Grey–AHP combines the advantages of the classical analytic hierarchy process (AHP) and the grey clustering method for accurate estimation of weight coefficients. It is a new method for environmental vulnerability assessment. The environmental vulnerability index (EVI) uses natural, environmental and human impact related factors, e.g., soil, geology, elevation, slope, rainfall, temperature, wind speed, normalized difference vegetation index, drainage density, crop intensity, agricultural DRASTIC value, population density and road density. The EVI map has been classified into four environmental vulnerability zones (EVZs), namely ‘low’, ‘moderate’, ‘high’, and ‘extreme’, encompassing 17.87%, 44.44%, 27.81% and 9.88% of the study area, respectively. The EVI map indicates that the northern part of the study area is more vulnerable from an environmental point of view. The EVI map shows close correlation with elevation. The effectiveness of the zone classification is evaluated using the grey clustering method. General effectiveness lies between the “better” and “common” classes. This analysis demonstrates the potential applicability of the methodology. - Highlights: • Environmental vulnerability zone identification based on Grey Analytic Hierarchy Process (AHP) • The effectiveness evaluation by means of a grey clustering method with support from AHP • Use of grey approach eliminates the excessive dependency on the experience of experts.

  3. Environmental vulnerability assessment using Grey Analytic Hierarchy Process based model

    Energy Technology Data Exchange (ETDEWEB)

    Sahoo, Satiprasad [School of Water Resources, Indian Institute of Technology Kharagpur (India); Dhar, Anirban, E-mail: anirban.dhar@gmail.com [Department of Civil Engineering, Indian Institute of Technology Kharagpur (India); Kar, Amlanjyoti [Central Ground Water Board, Bhujal Bhawan, Faridabad, Haryana (India)

    2016-01-15

Environmental management of an area describes a policy for its systematic and sustainable environmental protection. In the present study, regional environmental vulnerability assessment in the Hirakud command area of Odisha, India is envisaged based on the Grey Analytic Hierarchy Process method (Grey–AHP) using integrated remote sensing (RS) and geographic information system (GIS) techniques. Grey–AHP combines the advantages of the classical analytic hierarchy process (AHP) and the grey clustering method for accurate estimation of weight coefficients. It is a new method for environmental vulnerability assessment. The environmental vulnerability index (EVI) uses natural, environmental and human impact related factors, e.g., soil, geology, elevation, slope, rainfall, temperature, wind speed, normalized difference vegetation index, drainage density, crop intensity, agricultural DRASTIC value, population density and road density. The EVI map has been classified into four environmental vulnerability zones (EVZs), namely ‘low’, ‘moderate’, ‘high’, and ‘extreme’, encompassing 17.87%, 44.44%, 27.81% and 9.88% of the study area, respectively. The EVI map indicates that the northern part of the study area is more vulnerable from an environmental point of view. The EVI map shows close correlation with elevation. The effectiveness of the zone classification is evaluated using the grey clustering method. General effectiveness lies between the “better” and “common” classes. This analysis demonstrates the potential applicability of the methodology. - Highlights: • Environmental vulnerability zone identification based on Grey Analytic Hierarchy Process (AHP) • The effectiveness evaluation by means of a grey clustering method with support from AHP • Use of grey approach eliminates the excessive dependency on the experience of experts.

  4. Applications of Spatial Data Using Business Analytics Tools

    Directory of Open Access Journals (Sweden)

    Anca Ioana ANDREESCU

    2011-12-01

Full Text Available This paper addresses the possibilities of using spatial data in business analytics tools, with emphasis on SAS software. Various kinds of map data sets containing spatial data are presented and discussed. Examples of map charts illustrating macroeconomic parameters demonstrate the application of spatial data for the creation of map charts in SAS Enterprise Guide. Extended features of map charts are exemplified by producing charts via SAS programming procedures.

  5. Analytical SN solutions in heterogeneous slabs using symbolic algebra computer programs

    International Nuclear Information System (INIS)

    Warsa, J.S.

    2002-01-01

A modern symbolic algebra computer program, MAPLE, is used to compute the well-known analytical discrete ordinates (SN) solutions in one-dimensional slab geometry. Symbolic algebra programs compute the solutions with arbitrary precision and are free of spatial discretization error, so they can be used to investigate new discretizations for one-dimensional, slab-geometry SN methods. Pointwise scalar flux solutions are computed for several sample calculations of interest. Sample MAPLE command scripts are provided to illustrate how easily the theory can be translated into a working solution and serve as a complete tool capable of computing analytical SN solutions for mono-energetic, one-dimensional transport problems

  6. Using analytic continuation for the hadronic vacuum polarization computation

    Energy Technology Data Exchange (ETDEWEB)

Feng, Xu; Hashimoto, Shoji; Hotzel, Grit; Jansen, Karl; Petschlies, Marcus; Renner, Dru B.

    2014-11-01

    We present two examples of applications of the analytic continuation method for computing the hadronic vacuum polarization function in space- and time-like momentum regions. These examples are the Adler function and the leading order hadronic contribution to the muon anomalous magnetic moment. We comment on the feasibility of the analytic continuation method and provide an outlook for possible further applications.

  7. The Analytic Information Warehouse (AIW): a Platform for Analytics using Electronic Health Record Data

    Science.gov (United States)

    Post, Andrew R.; Kurc, Tahsin; Cholleti, Sharath; Gao, Jingjing; Lin, Xia; Bornstein, William; Cantrell, Dedra; Levine, David; Hohmann, Sam; Saltz, Joel H.

    2013-01-01

    Objective To create an analytics platform for specifying and detecting clinical phenotypes and other derived variables in electronic health record (EHR) data for quality improvement investigations. Materials and Methods We have developed an architecture for an Analytic Information Warehouse (AIW). It supports transforming data represented in different physical schemas into a common data model, specifying derived variables in terms of the common model to enable their reuse, computing derived variables while enforcing invariants and ensuring correctness and consistency of data transformations, long-term curation of derived data, and export of derived data into standard analysis tools. It includes software that implements these features and a computing environment that enables secure high-performance access to and processing of large datasets extracted from EHRs. Results We have implemented and deployed the architecture in production locally. The software is available as open source. We have used it as part of hospital operations in a project to reduce rates of hospital readmission within 30 days. The project examined the association of over 100 derived variables representing disease and co-morbidity phenotypes with readmissions in five years of data from our institution’s clinical data warehouse and the UHC Clinical Database (CDB). The CDB contains administrative data from over 200 hospitals that are in academic medical centers or affiliated with such centers. Discussion and Conclusion A widely available platform for managing and detecting phenotypes in EHR data could accelerate the use of such data in quality improvement and comparative effectiveness studies. PMID:23402960

  8. Prompt nuclear analytical techniques for material research in accelerator driven transmutation technologies: Prospects and quantitative analyses

    International Nuclear Information System (INIS)

    Vacik, J.; Hnatowicz, V.; Cervena, J.; Perina, V.; Mach, R.

    1998-01-01

Accelerator driven transmutation technology (ADTT) is a promising route to the elimination of spent nuclear fuel, nuclear waste and weapons-grade Pu. An ADTT facility comprises a high-current (proton) accelerator supplying a sub-critical reactor assembly with spallation neutrons. The reactor part is supposed to be cooled by molten fluorides or metals which serve, at the same time, as a carrier of nuclear fuel. The assumed high working temperature (400-600 C) and high radiation load in the subcritical reactor and spallation neutron source put forward the problem of the optimal choice of ADTT construction materials, especially from the point of view of their radiation and corrosion resistance when in contact with liquid working media. The use of prompt nuclear analytical techniques in ADTT-related material research is considered, and examples of preliminary analytical results obtained using the neutron depth profiling method are shown for illustration. (orig.)

  9. Analytic degree distributions of horizontal visibility graphs mapped from unrelated random series and multifractal binomial measures

    Science.gov (United States)

    Xie, Wen-Jie; Han, Rui-Qi; Jiang, Zhi-Qiang; Wei, Lijian; Zhou, Wei-Xing

    2017-08-01

Complex networks are not only a powerful tool for the analysis of complex systems, but also a promising way to analyze time series. The horizontal visibility graph (HVG) algorithm maps time series into graphs, whose degree distributions are numerically and analytically investigated for certain time series. We derive the degree distributions of HVGs through an iterative construction process. The degree distributions of the HVG and the directed HVG for random series are derived to be exponential, which confirms the analytical results from other methods. We also obtain the analytical expressions of the degree distributions of HVGs, and the in-degree and out-degree distributions of directed HVGs, transformed from multifractal binomial measures, which agree excellently with numerical simulations.
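The HVG mapping itself is simple to implement. A minimal Python sketch (our own implementation of the standard horizontal visibility criterion, not the authors' code), which also checks the well-known i.i.d.-series benchmark that the mean degree tends to 4:

```python
import random

def hvg_degrees(series):
    """Degrees of the horizontal visibility graph: nodes i < j are linked
    iff every value strictly between them is lower than both endpoints."""
    n = len(series)
    deg = [0] * n
    for i in range(n - 1):
        blocker = -float("inf")     # tallest value seen between i and j
        for j in range(i + 1, n):
            if blocker < min(series[i], series[j]):
                deg[i] += 1
                deg[j] += 1
            if series[j] >= series[i]:
                break               # nothing beyond j is visible from i
            blocker = max(blocker, series[j])
    return deg

rng = random.Random(0)
deg = hvg_degrees([rng.random() for _ in range(5000)])
mean_degree = sum(deg) / len(deg)   # tends to 4 for long i.i.d. series
print(mean_degree)
```

Because the inner scan stops at the first value exceeding the left endpoint, the construction runs in near-linear time for typical series, and histogramming `deg` reproduces the exponential decay the paper derives for random series.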

  10. Affordances and limitations of learning analytics for computer-assisted language learning: a case study of the VITAL project

    OpenAIRE

    Gelan, Anouk; Fastré, Greet; Verjans, Martine; Martin, Niels; Janssenswillen, Gert; Creemers, Mathijs; Lieben, Jonas; Depaire, Benoît; Thomas, Michael

    2018-01-01

    Learning analytics (LA) has emerged as a field that offers promising new ways to prevent drop-out and aid retention. However, other research suggests that large datasets of learner activity can be used to understand online learning behaviour and improve pedagogy. While the use of LA in language learning has received little attention to date, available research suggests that LA could provide valuable insights into task design for instructors and materials designers, as well as help students wi...

  11. Using the Technology of the Confessional as an Analytical Resource: Four Analytical Stances Towards Research Interviews in Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Brendan K. O'Rourke

    2007-05-01

Full Text Available Among the various approaches that have developed from FOUCAULT's work is an Anglophone discourse analysis that has attempted to combine FOUCAULTian insights with the techniques of Conversation Analysis. An important current methodological issue in this discourse analytical approach is its theoretical preference for "naturally occurring" rather than research interview data. A FOUCAULTian perspective on the interview as a research instrument questions the idea of "naturally-occurring discourse". The "technology of the confessional" operates not only within research interviews but permeates other interactions as well. Drawing on FOUCAULT does not dismiss the problems of the interview as a research instrument; rather, it shows they cannot be escaped by simply switching to more "natural" interactions. Combining these insights with recent developments within discourse analysis can provide analytical resources for, rather than barriers to, the discourse analysis of research interviews. To aid such an approach, we develop a four-way categorisation of analytical stances towards the research interview in discourse analysis. A demonstration of how a research interview might be subjected to a discourse analysis using elements of this approach is then provided. URN: urn:nbn:de:0114-fqs070238

  12. Analytical implications of using practice theory in workplace information literacy research

    DEFF Research Database (Denmark)

    Moring, Camilla Elisabeth; Lloyd, Annemaree

    2013-01-01

Introduction: This paper considers practice theory and the analytical implications of using this theoretical approach in information literacy research. More precisely, the aim of the paper is to discuss the translation of practice theoretical assumptions into strategies that frame the analytical focus and interest when researching workplace information literacy. Two practice theoretical perspectives are selected, one by Theodore Schatzki and one by Etienne Wenger, and their general commonalities and differences are analysed and discussed. Analysis: The two practice theories and their main ideas of what constitutes practices, how practices frame social life, and the central concepts used to explain this are presented. Then the application of the theories within workplace information literacy research is briefly explored. Results and Conclusion: The two theoretical perspectives share some…

  13. Establishing an ISO 10001-based promise in inpatients care.

    Science.gov (United States)

    Khan, Mohammad Ashiqur Rahman; Karapetrovic, Stanislav

    2015-01-01

The purpose of this paper is to explore ISO 10001:2007 in planning, designing and developing a customer satisfaction promise (CSP) intended for inpatient care. Through meetings and interviews with research participants, who included a program manager, unit managers and registered nurses, information about potential promises and their implementation was obtained and analyzed. A number of promises were drafted, and one was finally selected to be developed as a CSP. Applying the standard required adaptation and novel interpretation. Additionally, ISO 10002:2004 (Clause 7) was used to design the feedback handling activities. A promise initially chosen for development turned out to be difficult to implement, an experience that helped in selecting and developing the final promise. Research participants found the ISO 10001-based method useful and comprehensible. This paper presents a specific health care example of how to adapt a standard's guideline in establishing customer promises. The authors show how a promise can be used in alleviating an existing issue (i.e. communication between carers and patients). These lessons can be beneficial in various health care settings. To the authors' knowledge, this paper shows the first example of applying ISO 10001:2007 in a health care case. A few activities suggested by the standard are further detailed, and a new activity is introduced. The integrated use of ISO 10001:2007 and 10002:2004 is presented, and how one can be "augmented" by the other is demonstrated.

  14. Investing in America's Data Science and Analytics Talent: The Case for Action

    Science.gov (United States)

    Business-Higher Education Forum, 2017

    2017-01-01

    Increasingly US jobs require data science and analytics skills. Can we meet the demand? The current shortage of skills in the national job pool demonstrates that business-as-usual strategies won't satisfy the growing need. If we are to unlock the promise and potential of data and all the technologies that depend on it, employers and educators will…

  15. eAnalytics: Dynamic Web-based Analytics for the Energy Industry

    Directory of Open Access Journals (Sweden)

    Paul Govan

    2016-11-01

    Full Text Available eAnalytics is a web application built on top of R that provides dynamic data analytics to energy industry stakeholders. The application allows users to dynamically manipulate chart data and style through the Shiny package’s reactive framework. eAnalytics currently supports a number of features including interactive datatables, dynamic charting capabilities, and the ability to save, download, or export information for further use. Going forward, the goal for this project is that it will serve as a research hub for discovering new relationships in the data. The application is illustrated with a simple tutorial of the user interface design.

  16. Embracing Big Data in Complex Educational Systems: The Learning Analytics Imperative and the Policy Challenge

    Science.gov (United States)

    Macfadyen, Leah P.; Dawson, Shane; Pardo, Abelardo; Gaševic, Dragan

    2014-01-01

    In the new era of big educational data, learning analytics (LA) offer the possibility of implementing real-time assessment and feedback systems and processes at scale that are focused on improvement of learning, development of self-regulated learning skills, and student success. However, to realize this promise, the necessary shifts in the…

  17. The Analytical Hierarchy Process

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn

    2007-01-01

    The technical note gathers the theory behind the Analytic Hierarchy Process (AHP) and presents its advantages and disadvantages in practical use.

  18. Stem cell therapy in spinal cord injury: Hollow promise or promising science?

    Directory of Open Access Journals (Sweden)

    Aimee Goel

    2016-01-01

    Full Text Available Spinal cord injury (SCI) remains one of the most physically, psychologically and socially debilitating conditions worldwide. While rehabilitation measures may help limit disability to some extent, there is no effective primary treatment yet available. The efficacy of stem cells as a primary therapeutic option in spinal cord injury is currently an area under much scrutiny and debate. Several laboratory and some primary clinical studies into the use of bone marrow mesenchymal stem cells or embryonic stem cell-derived oligodendrocyte precursor cells have shown some promising results in terms of remyelination and regeneration of damaged spinal nerve tracts. More recently, laboratory and early clinical experiments into the use of olfactory ensheathing cells, a type of glial cell derived from the olfactory bulb and mucosa, have provided some phenomenal preliminary evidence as to their neuroregenerative and neural bridging capacity. This report compares and evaluates some current research into selected forms of embryonic and mesenchymal stem cell therapy as well as olfactory ensheathing cell therapy in SCI, and also highlights some legal and ethical issues surrounding their use. While early results show promise, more rigorous large-scale clinical trials are needed to shed light on the safety, efficacy and long-term viability of stem cell and cellular transplant techniques in SCI.

  19. Analytics that Inform the University: Using Data You Already Have

    Science.gov (United States)

    Dziuban, Charles; Moskal, Patsy; Cavanagh, Thomas; Watts, Andre

    2012-01-01

    The authors describe the University of Central Florida's top-down/bottom-up action analytics approach to using data to inform decision-making. The top-down approach utilizes information about programs, modalities, and college implementation of Web initiatives. The bottom-up approach continuously monitors…

  20. Google analytics integrations

    CERN Document Server

    Waisberg, Daniel

    2015-01-01

    A roadmap for turning Google Analytics into a centralized marketing analysis platform With Google Analytics Integrations, expert author Daniel Waisberg shows you how to gain a more meaningful, complete view of customers that can drive growth opportunities. This in-depth guide shows not only how to use Google Analytics, but also how to turn this powerful data collection and analysis tool into a central marketing analysis platform for your company. Taking a hands-on approach, this resource explores the integration and analysis of a host of common data sources, including Google AdWords, AdSens

  1. Artificial Intelligence in Surgery: Promises and Perils.

    Science.gov (United States)

    Hashimoto, Daniel A; Rosman, Guy; Rus, Daniela; Meireles, Ozanan R

    2018-07-01

    The aim of this review was to summarize major topics in artificial intelligence (AI), including their applications and limitations in surgery. This paper reviews the key capabilities of AI to help surgeons understand and critically evaluate new AI applications and to contribute to new developments. AI is composed of various subfields that each provide potential solutions to clinical problems. Each of the core subfields of AI reviewed in this piece has also been used in other industries such as the autonomous car, social networks, and deep learning computers. A review of AI papers across computer science, statistics, and medical sources was conducted to identify key concepts and techniques within AI that are driving innovation across industries, including surgery. Limitations and challenges of working with AI were also reviewed. Four main subfields of AI were defined: (1) machine learning, (2) artificial neural networks, (3) natural language processing, and (4) computer vision. Their current and future applications to surgical practice were introduced, including big data analytics and clinical decision support systems. The implications of AI for surgeons and the role of surgeons in advancing the technology to optimize clinical effectiveness were discussed. Surgeons are well positioned to help integrate AI into modern practice. Surgeons should partner with data scientists to capture data across phases of care and to provide clinical context, for AI has the potential to revolutionize the way surgery is taught and practiced with the promise of a future optimized for the highest quality patient care.

  2. The use of decision analytic techniques in energy policy decisions

    International Nuclear Information System (INIS)

    Haemaelaeinen, R.P.; Seppaelaeinen, T.O.

    1986-08-01

    The report reviews decision analytic techniques and their applications to energy policy decision making. Decision analysis consists of techniques for structuring the essential elements of a decision problem and mathematical methods for ranking the alternatives from a set of simple judgments. Because modeling subjective judgments is characteristic of decision analysis, the models can incorporate qualitative factors and values, which escape traditional energy modeling. Decision analysis has been applied to choices among energy supply alternatives, siting energy facilities, selecting nuclear waste repositories, selecting research and development projects, risk analysis and prioritizing alternative energy futures. Many applications are done in universities and research institutions, but during the 1970s the use of decision analysis spread to both the public and the private sector. The settings where decision analysis has been applied range from aiding a single decision maker to clarifying opposing points of view. Decision analytic methods have also been linked with energy models. The most valuable result of decision analysis is the clarification of the problem at hand. Political decisions cannot be made solely on the basis of models, but models can be used to gain insight into the decision situation. Models inevitably simplify reality, so they must be regarded only as aids to judgment. So far there has been only one decision analysis of energy policy issues in Finland with actual political decision makers as participants. The experiences of this project and numerous foreign applications do, however, suggest that the decision analytic approach is useful in energy policy questions. The report presents a number of Finnish energy policy decisions where decision analysis might prove useful. However, the applicability of the methods depends crucially on the actual circumstances at hand.
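The core of many of the techniques surveyed is a simple multi-attribute value model: each alternative is scored against weighted criteria, and the weighted sum ranks the alternatives. A minimal sketch in Python, with entirely hypothetical weights, alternatives and scores chosen for illustration (not taken from the report):

```python
# Hypothetical weighted-sum value model for three energy supply alternatives,
# scored on a 0-1 scale against three criteria.
weights = {"cost": 0.5, "emissions": 0.3, "security": 0.2}
scores = {
    "coal":    {"cost": 0.9, "emissions": 0.1, "security": 0.7},
    "nuclear": {"cost": 0.6, "emissions": 0.9, "security": 0.6},
    "wind":    {"cost": 0.4, "emissions": 1.0, "security": 0.5},
}

# Overall value of each alternative = sum of weight * criterion score.
value = {alt: sum(weights[c] * s[c] for c in weights) for alt, s in scores.items()}
best = max(value, key=value.get)

for alt in sorted(value, key=value.get, reverse=True):
    print("%-7s %.2f" % (alt, value[alt]))
```

As the abstract notes, the value of such a model lies less in the final ranking than in forcing the criteria, weights and judgments to be made explicit.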

  3. Broadening the Scope and Increasing the Usefulness of Learning Analytics:

    Science.gov (United States)

    Ellis, Cath

    2013-01-01

    Learning analytics is a relatively new field of inquiry and its precise meaning is both contested and fluid (Johnson, Smith, Willis, Levine & Haywood, 2011; LAK, n.d.). Ferguson (2012) suggests that the best working definition is that offered by the first Learning Analytics and Knowledge (LAK) conference: "the measurement, collection,…

  4. Advances in downstream processing of biologics - Spectroscopy: An emerging process analytical technology.

    Science.gov (United States)

    Rüdt, Matthias; Briskot, Till; Hubbuch, Jürgen

    2017-03-24

    Process analytical technologies (PAT) for the manufacturing of biologics have drawn increased interest in the last decade. Besides being encouraged by the Food and Drug Administration's (FDA's) PAT initiative, PAT promises to improve process understanding, reduce overall production costs and help to implement continuous manufacturing. This article focuses on spectroscopic tools for PAT in downstream processing (DSP). Recent advances and future perspectives will be reviewed. In order to exploit the full potential of gathered data, chemometric tools are widely used for the evaluation of complex spectroscopic information. Thus, an introduction into the field will be given. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  5. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry

    DEFF Research Database (Denmark)

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from the genuine duplicate measurements of two...

  6. Transcutaneous Measurement of Blood Analyte Concentration Using Raman Spectroscopy

    Science.gov (United States)

    Barman, Ishan; Singh, Gajendra P.; Dasari, Ramachandra R.; Feld, Michael S.

    2008-11-01

    Diabetes mellitus is a chronic disorder, affecting nearly 200 million people worldwide. Acute complications, such as hypoglycemia, cardiovascular disease and retinal damage, may occur if the disease is not adequately controlled. As diabetes has no known cure, tight control of glucose levels is critical for the prevention of such complications. Given the necessity for regular monitoring of blood glucose, development of non-invasive glucose detection devices is essential to improve the quality of life in diabetic patients. The commercially available glucose sensors measure the interstitial fluid glucose by electrochemical detection. However, these sensors have severe limitations, primarily related to their invasive nature and lack of stability. This necessitates the development of a truly non-invasive glucose detection technique. NIR Raman spectroscopy, which combines the substantial penetration depth of NIR light with the excellent chemical specificity of Raman spectroscopy, provides an excellent tool to meet the challenges involved. Additionally, it enables simultaneous determination of multiple blood analytes. Our laboratory has pioneered the use of Raman spectroscopy for the detection of blood analytes in biological media. The preliminary success of our non-invasive glucose measurements both in vitro (such as in serum and blood) and in vivo has provided the foundation for the development of feasible clinical systems. However, successful application of this technology still faces a few hurdles, highlighted by the problems of tissue luminescence and the selection of an appropriate reference concentration. In this article we explore possible avenues to overcome these challenges so that the prospective prediction accuracy of blood analytes can be brought to clinically acceptable levels.

  7. Second International Workshop on Teaching Analytics

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Reimann, Peter; Halb, Wolfgang

    2013-01-01

    Teaching Analytics is conceived as a subfield of learning analytics that focuses on the design, development, evaluation, and education of visual analytics methods and tools for teachers in primary, secondary, and tertiary educational settings. The Second International Workshop on Teaching Analytics (IWTA) 2013 seeks to bring together researchers and practitioners in the fields of education, learning sciences, learning analytics, and visual analytics to investigate the design, development, use, evaluation, and impact of visual analytical methods and tools for teachers’ dynamic diagnostic decision...

  8. Lyophilization: a useful approach to the automation of analytical processes?

    OpenAIRE

    de Castro, M. D. Luque; Izquierdo, A.

    1990-01-01

    An overview of the state-of-the-art in the use of lyophilization for the pretreatment of samples and standards prior to their storage and/or preconcentration is presented. The different analytical applications of this process are dealt with according to the type of material (reagent, standard, samples) and matrix involved.

  9. Challenges of Using Learning Analytics Techniques to Support Mobile Learning

    Science.gov (United States)

    Arrigo, Marco; Fulantelli, Giovanni; Taibi, Davide

    2015-01-01

    Evaluation of Mobile Learning remains an open research issue, especially as regards the activities that take place outside the classroom. In this context, Learning Analytics can provide answers, and offer the appropriate tools to enhance Mobile Learning experiences. In this poster we introduce a task-interaction framework, using learning analytics…

  10. Analytic nuclear scattering theories

    International Nuclear Information System (INIS)

    Di Marzio, F.; University of Melbourne, Parkville, VIC

    1999-01-01

    A wide range of nuclear reactions is examined in an analytical version of the usual distorted wave Born approximation. This new approach provides either semi-analytic or fully analytic descriptions of the nuclear scattering processes. The resulting computational simplifications, when used within the limits of validity, allow very detailed tests of both nuclear interaction models and large-basis models of nuclear structure to be performed.

  11. Using Analytic Hierarchy Process in Textbook Evaluation

    Science.gov (United States)

    Kato, Shigeo

    2014-01-01

    This study demonstrates the application of the analytic hierarchy process (AHP) in English language teaching materials evaluation, focusing in particular on its potential for systematically integrating different components of evaluation criteria in a variety of teaching contexts. AHP is a measurement procedure wherein pairwise comparisons are made…
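The pairwise-comparison machinery at the heart of AHP can be sketched in a few lines: the priority weights are the normalized principal eigenvector of the comparison matrix, and Saaty's consistency ratio checks whether the judgments are acceptably coherent. The matrix below is a hypothetical example for three evaluation criteria, not data from the study:

```python
import numpy as np

# Hypothetical pairwise comparison matrix: entry [i, j] says how much more
# important criterion i is than criterion j (Saaty's 1-9 scale, reciprocal).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# Priority weights = normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency: CI = (lambda_max - n) / (n - 1); CR = CI / RI,
# with RI the random index (0.58 for n = 3). CR < 0.1 is conventionally OK.
n = A.shape[0]
lambda_max = eigvals[k].real
CI = (lambda_max - n) / (n - 1)
CR = CI / 0.58

print("weights:", np.round(w, 3))
print("consistency ratio: %.3f" % CR)
```

The same eigenvector step is then repeated for the alternatives under each criterion, and the results combined hierarchically.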

  12. Quantification of process induced disorder in milled samples using different analytical techniques

    DEFF Research Database (Denmark)

    Zimper, Ulrike; Aaltonen, Jaakko; McGoverin, Cushla M.

    2012-01-01

    The aim of this study was to compare three different analytical methods to detect and quantify the amount of crystalline disorder/amorphousness in two milled model drugs. X-ray powder diffraction (XRPD), differential scanning calorimetry (DSC) and Raman spectroscopy were used as the analytical methods, and indomethacin and simvastatin were chosen as the model compounds. These compounds partly converted from crystalline to disordered forms by milling. Partial least squares regression (PLS) was used to create calibration models for the XRPD and Raman data, which were subsequently used to quantify the milling-induced crystalline disorder/amorphousness under different process conditions. In the DSC measurements the change in heat capacity at the glass transition was used for quantification. Differently prepared amorphous indomethacin standards (prepared by either melt quench cooling or cryo milling) were compared...
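A PLS calibration of the kind described can be sketched with a minimal NIPALS PLS1 implementation. The "spectra" below are synthetic linear mixtures of two made-up component bands standing in for the XRPD or Raman data, and the response is the amorphous fraction; all values are hypothetical, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration set: each "spectrum" is a linear mixture of a
# crystalline and an amorphous component band, plus a little noise.
wl = np.linspace(0.0, 1.0, 200)
crystalline = np.exp(-((wl - 0.3) / 0.05) ** 2)
amorphous = np.exp(-((wl - 0.6) / 0.15) ** 2)
y = np.linspace(0.0, 1.0, 15)                      # amorphous fraction 0..1
X = np.outer(1 - y, crystalline) + np.outer(y, amorphous)
X += rng.normal(scale=0.005, size=X.shape)

def pls1_fit(X, y, n_comp=2):
    """Minimal NIPALS PLS1 on mean-centered data; returns regression vector."""
    Xc, yc = X - X.mean(0), y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)          # weight vector
        t = Xc @ w                      # scores
        tt = t @ t
        p = Xc.T @ t / tt               # X loadings
        q = yc @ t / tt                 # y loading
        Xc = Xc - np.outer(t, p)        # deflate
        yc = yc - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)
    return B, X.mean(0), y.mean()

B, x_mean, y_mean = pls1_fit(X, y)
y_hat = (X - x_mean) @ B + y_mean
rmse = np.sqrt(np.mean((y_hat - y) ** 2))
print("calibration RMSE: %.4f" % rmse)
```

In practice the model would be validated on independent samples and the number of latent variables chosen by cross-validation.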

  13. Composable Analytic Systems for next-generation intelligence analysis

    Science.gov (United States)

    DiBona, Phil; Llinas, James; Barry, Kevin

    2015-05-01

    Lockheed Martin Advanced Technology Laboratories (LM ATL) is collaborating with Professor James Llinas, Ph.D., of the Center for Multisource Information Fusion at the University at Buffalo (State of NY), researching concepts for a mixed-initiative associate system for intelligence analysts to facilitate reduced analysis and decision times while proactively discovering and presenting relevant information based on the analyst's needs, current tasks and cognitive state. Today's exploitation and analysis systems have largely been designed for a specific sensor, data type, and operational context, leading to difficulty in directly supporting the analyst's evolving tasking and work product development preferences across complex Operational Environments. Our interactions with analysts illuminate the need to impact the information fusion, exploitation, and analysis capabilities in a variety of ways, including understanding data options, algorithm composition, hypothesis validation, and work product development. Composable Analytic Systems, an analyst-driven approach that increases the flexibility and capability to effectively utilize Multi-INT fusion and analytics tailored to the analyst's mission needs, holds promise to address current and future intelligence analysis needs as US forces engage threats in contested and denied environments.

  14. USE OF BIG DATA ANALYTICS FOR CUSTOMER RELATIONSHIP MANAGEMENT: POINT OF PARITY OR SOURCE OF COMPETITIVE ADVANTAGE?

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas; Zablah, Alex R.; Straub, Detmar W.

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (CA use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: 1. What are the key antecedents of big data customer analytics use? 2. How, and to what extent, does big data...

  15. An approach to estimate spatial distribution of analyte within cells using spectrally-resolved fluorescence microscopy

    Science.gov (United States)

    Sharma, Dharmendar Kumar; Irfanullah, Mir; Basu, Santanu Kumar; Madhu, Sheri; De, Suman; Jadhav, Sameer; Ravikanth, Mangalampalli; Chowdhury, Arindam

    2017-03-01

    While fluorescence microscopy has become an essential tool amongst chemists and biologists for the detection of various analytes within cellular environments, non-uniform spatial distribution of sensors within cells often restricts extraction of reliable information on the relative abundance of analytes in different subcellular regions. As an alternative to existing sensing methodologies such as ratiometric or FRET imaging, where the relative proportion of analyte with respect to the sensor can be obtained within cells, we propose a methodology using spectrally-resolved fluorescence microscopy, via which both the relative abundance of sensor as well as its relative proportion with respect to the analyte can be simultaneously extracted for local subcellular regions. This method is exemplified using a BODIPY sensor, capable of detecting mercury ions within cellular environments, characterized by a spectral blue-shift and concurrent enhancement of emission intensity. Spectral emission envelopes collected from sub-microscopic regions allowed us to compare the shift in transition energies as well as integrated emission intensities within various intracellular regions. Construction of a 2D scatter plot using spectral shifts and emission intensities, which depend on the relative amount of analyte with respect to sensor and the approximate local amounts of the probe, respectively, enabled qualitative extraction of the relative abundance of analyte in various local regions within a single cell as well as amongst different cells. Although the comparisons remain semi-quantitative, this approach involving analysis of multiple spectral parameters opens up an alternative way to extract spatial distribution of analyte in heterogeneous systems. The proposed method would be especially relevant for fluorescent probes that undergo relatively nominal shift in transition energies compared to their emission bandwidths, which often restricts their usage for quantitative ratiometric imaging in

  16. Micromechanical String Resonators: Analytical Tool for Thermal Characterization of Polymers

    DEFF Research Database (Denmark)

    Bose, Sanjukta; Schmid, Silvan; Larsen, Tom

    2014-01-01

    Resonant microstrings show promise as a new analytical tool for thermal characterization of polymers with only a few nanograms of sample. The detection of the glass transition temperature (Tg) of an amorphous poly(d,l-lactide) (PDLLA) and a semicrystalline poly(l-lactide) (PLLA) is investigated. The polymers are spray coated on one side of the resonating microstrings. The resonance frequency and quality factor (Q) are measured simultaneously as a function of temperature. A change in the resonance frequency reflects a change in static tensile stress, which yields information about the Young’s modulus of the polymer, and a change in Q reflects the change in damping of the polymer-coated string. The frequency response of the microstring is validated with an analytical model. From the frequency-independent tensile stress change, static Tg values of 40.6 and 57.6 °C were measured for PDLLA and PLLA, respectively...
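The link between tensile stress and resonance frequency that underlies this detection principle is the standard string formula f_n = (n/2L)·sqrt(σ/ρ): when the polymer coating softens at Tg, the stress relaxes and the resonance frequency drops. A sketch with hypothetical microstring parameters (not the paper's devices):

```python
import math

def string_resonance(L, stress, rho, n=1):
    """Resonance frequency of a doubly clamped string under tensile stress:
    f_n = (n / (2 L)) * sqrt(sigma / rho)."""
    return n / (2.0 * L) * math.sqrt(stress / rho)

# Hypothetical silicon-nitride-like microstring: 500 um long, density
# 3000 kg/m^3, 200 MPa tensile stress below Tg.
L, rho = 500e-6, 3000.0
f_glassy = string_resonance(L, 200e6, rho)
f_soft = string_resonance(L, 180e6, rho)   # stress partially relaxed above Tg

print("f below Tg: %.0f kHz" % (f_glassy / 1e3))
print("f above Tg: %.0f kHz" % (f_soft / 1e3))
```

Tracking this frequency (and Q) against temperature is what lets the kink at Tg be read off the measurement.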

  18. On application of analytical transformation system using a computer for Feynman integral calculation

    International Nuclear Information System (INIS)

    Gerdt, V.P.

    1978-01-01

    Various systems of analytic transformations for the calculation of Feynman integrals using computers are discussed. The hyperspheric technique, which is used to calculate Feynman integrals, enables angular integration to be performed for a set of diagrams, thus reducing the multiplicity of the integral. All calculations based on this method are made with the ASHMEDAL program. Feynman integrals are calculated in Euclidean space using integration by parts and some differential identities. Analytic calculation of Feynman integrals is performed by the MACSYMA system. The dispersion method of integral calculation is implemented in the SCHOONSCHIP system, and calculations based on features of the Nielsen function are made using the efficient SINAC and RSIN programs. A table of basic Feynman integral parameters calculated using the above techniques is given.

  18. Analytical method used for intermediate products in continuous distillation of furfural

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Z.L.; Jia, M.; Wang, L.J.; Deng, Y.X.

    1981-01-01

    During the distillation of furfural, analysis of the main components in the crude furfural condensate and intermediate products is very important. Since furfural and methylfurfural are homologous, and both furfural and acetone contain a carbonyl group, the components in the sample must be separated before analysis. An improved analytical method has been studied, whose accuracy and precision meet the requirements of industrial standards. The analytical procedure is as follows: the furfural content is determined with the barbituric acid gravimetric method; the methanol content is determined with the dichromate method after precipitating furfural and acetone and distilling the liquid for analysis; and the methylfurfural content is determined with the bromide-bromate method, which can be used only for samples containing a higher content of methylfurfural. For samples with low content, the gas-liquid chromatographic method can be used. 7 references.

  19. Limitless Analytic Elements

    Science.gov (United States)

    Strack, O. D. L.

    2018-02-01

    We present equations for new limitless analytic line elements. These elements possess a virtually unlimited number of degrees of freedom. We apply these new limitless analytic elements to head-specified boundaries and to problems with inhomogeneities in hydraulic conductivity. Applications of these new analytic elements to practical problems involving head-specified boundaries require the solution of a very large number of equations. To make the new elements useful in practice, an efficient iterative scheme is required. We present an improved version of the scheme presented by Bandilla et al. (2007), based on the application of Cauchy integrals. The limitless analytic elements are useful when modeling strings of elements, rivers for example, where local conditions are difficult to model, e.g., when a well is close to a river. The solution of such problems is facilitated by increasing the order of the elements to obtain a good solution. This makes it unnecessary to resort to dividing the element in question into many smaller elements to obtain a satisfactory solution.
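The flavor of analytic-element superposition can be illustrated with the classical image-well building block for exactly the situation mentioned, a well close to a river: mirroring the well across a straight constant-head boundary makes the head change vanish on the river line. This is a generic textbook sketch with hypothetical values, not the limitless elements of the paper:

```python
import math

# Hypothetical aquifer and well: discharge Q [m^3/d], transmissivity T [m^2/d],
# well 200 m from a straight river lying along the x-axis (y = 0).
Q, T = 500.0, 100.0
zw = complex(0.0, 200.0)   # well location in the complex plane

def head_change(z):
    """Head change from a pumping well plus its recharge image across y = 0.

    Superposition of two Thiem solutions; negative values are drawdown."""
    return Q / (2.0 * math.pi * T) * (
        math.log(abs(z - zw)) - math.log(abs(z - zw.conjugate()))
    )

# On the river the two log terms cancel exactly, so the head is unchanged;
# near the well the head change is negative (drawdown).
print(head_change(complex(123.0, 0.0)))
print(head_change(complex(0.0, 190.0)))
```

Higher-order analytic elements generalize this idea by adding many more degrees of freedom per element, which is what drives the large equation systems and the iterative scheme discussed in the abstract.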

  20. Optimizing multi-pinhole SPECT geometries using an analytical model

    International Nuclear Information System (INIS)

    Rentmeester, M C M; Have, F van der; Beekman, F J

    2007-01-01

    State-of-the-art multi-pinhole SPECT devices allow for sub-mm resolution imaging of radio-molecule distributions in small laboratory animals. The optimization of multi-pinhole and detector geometries using simulations based on ray-tracing or Monte Carlo algorithms is time-consuming, particularly because many system parameters need to be varied. As an efficient alternative we develop a continuous analytical model of a pinhole SPECT system with a stationary detector set-up, which we apply to focused imaging of a mouse. The model assumes that the multi-pinhole collimator and the detector both have the shape of a spherical layer, and uses analytical expressions for effective pinhole diameters, sensitivity and spatial resolution. For fixed fields-of-view, a pinhole-diameter adapting feedback loop allows for the comparison of the system resolution of different systems at equal system sensitivity, and vice versa. The model predicts that (i) for optimal resolution or sensitivity the collimator layer with pinholes should be placed as closely as possible around the animal given a fixed detector layer, (ii) with high-resolution detectors a resolution improvement up to 31% can be achieved compared to optimized systems, (iii) high-resolution detectors can be placed close to the collimator without significant resolution losses, (iv) interestingly, systems with a physical pinhole diameter of 0 mm can have an excellent resolution when high-resolution detectors are used
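The analytical expressions referred to are, in their standard textbook form, the pinhole sensitivity and system resolution formulas; the paper's own model may differ in detail, and the geometry below is hypothetical:

```python
import math

def pinhole_sensitivity(d_e, h, theta_deg=90.0):
    """Point-source sensitivity of a pinhole (standard form):
    g = d_e^2 * sin^3(theta) / (16 h^2), with effective diameter d_e,
    source-to-pinhole distance h and incidence angle theta."""
    s = math.sin(math.radians(theta_deg))
    return d_e**2 * s**3 / (16.0 * h**2)

def pinhole_system_resolution(d_e, h, l, R_i):
    """System resolution: geometric term d_e * (1 + 1/M) combined in
    quadrature with the intrinsic detector resolution R_i demagnified by
    the pinhole magnification M = l / h (l = pinhole-to-detector distance)."""
    M = l / h
    return math.sqrt((d_e * (1.0 + 1.0 / M))**2 + (R_i / M)**2)

# Hypothetical mouse-imaging geometry (all lengths in mm).
d_e, h, l = 0.6, 25.0, 100.0
print("sensitivity: %.2e" % pinhole_sensitivity(d_e, h))
print("resolution: %.2f mm" % pinhole_system_resolution(d_e, h, l, R_i=3.0))
```

Formulas like these make the feedback loop in the abstract concrete: adjust d_e until two candidate geometries have equal sensitivity, then compare their resolutions (or vice versa).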

  1. Magnetic anomaly depth and structural index estimation using different height analytic signals data

    Science.gov (United States)

    Zhou, Shuai; Huang, Danian; Su, Chao

    2016-09-01

    This paper proposes a new semi-automatic inversion method for magnetic anomaly data interpretation that uses the combination of analytic signals of the anomaly at different heights to determine the depth and the structural index N of the sources. The new method utilizes analytic signals of the original anomaly at different heights to effectively suppress the noise contained in the anomaly. Compared with other high-order derivative calculation methods based on analytic signals, our method only computes first-order derivatives of the anomaly, which can be used to obtain more stable and accurate results. Tests on synthetic noise-free and noise-corrupted magnetic data indicate that the new method can estimate the depth and N efficiently. The technique is applied to a real measured magnetic anomaly in Southern Illinois caused by a known dike, and the result is in agreement with the drilling information and inversion results within acceptable calculation error.
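The basic single-height ingredient can be sketched numerically: compute the analytic signal amplitude |A| = sqrt((dT/dx)^2 + (dT/dz)^2) of a profile, obtaining the vertical derivative in the wavenumber domain, and read the depth from the width of |A| (for a thin dike, N = 1, the half-width of |A| at half-maximum equals the source depth). This is a generic illustration with a synthetic anomaly, not the paper's multi-height method:

```python
import numpy as np

# Synthetic profile: Lorentzian-shaped anomaly of a thin dike-like source,
# T(x) = k * h / (x^2 + h^2), at hypothetical depth h (length units arbitrary).
h_true, k, dx = 50.0, 1000.0, 1.0
x = np.arange(-2048, 2048) * dx
T = k * h_true / (x**2 + h_true**2)

# Horizontal derivative by finite differences.
Tx = np.gradient(T, dx)

# Vertical derivative via the wavenumber domain: F[dT/dz] = |k_w| * F[T].
kw = 2.0 * np.pi * np.fft.fftfreq(x.size, d=dx)
Tz = np.real(np.fft.ifft(np.abs(kw) * np.fft.fft(T)))

# Analytic signal amplitude; for this source |A| = k / (x^2 + h^2).
A = np.sqrt(Tx**2 + Tz**2)

# Depth from the half-width of |A| at half its maximum (thin dike, N = 1).
i0 = np.argmax(A)
half = A[i0] / 2.0
right = i0 + np.argmax(A[i0:] <= half)
h_est = abs(x[right] - x[i0])
print("true depth: %.1f, estimated: %.1f" % (h_true, h_est))
```

Combining such amplitudes computed at several upward-continued heights, as the paper does, trades a little resolution for much better noise behavior.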

  2. Post hoc subgroups in clinical trials: Anathema or analytics?

    Science.gov (United States)

    Weisberg, Herbert I; Pontes, Victor P

    2015-08-01

    There is currently much interest in generating more individualized estimates of treatment effects. However, traditional statistical methods are not well suited to this task. Post hoc subgroup analyses of clinical trials are fraught with methodological problems. We suggest that the alternative research paradigm of predictive analytics, widely used in many business contexts, can be adapted to help. We compare the statistical and analytics perspectives and suggest that predictive modeling should often replace subgroup analysis. We then introduce a new approach, cadit modeling, that can be useful to identify and test individualized causal effects. The cadit technique is particularly useful in the context of selecting from among a large number of potential predictors. We describe a new variable-selection algorithm that has been applied in conjunction with cadit. The cadit approach is illustrated through a reanalysis of data from the Randomized Aldactone Evaluation Study trial, which studied the efficacy of spironolactone in heart-failure patients. The trial was successful, but a serious adverse effect (hyperkalemia) was subsequently discovered. Our reanalysis suggests that it may be possible to predict the degree of hyperkalemia based on a logistic model and to identify a subgroup in which the effect is negligible. Cadit modeling is a promising alternative to subgroup analyses. Cadit regression is relatively straightforward to implement, generates results that are easy to present and explain, and can mesh straightforwardly with many variable-selection algorithms. © The Author(s) 2015.

  3. Examining the Use of a Visual Analytics System for Sensemaking Tasks: Case Studies with Domain Experts.

    Science.gov (United States)

    Kang, Youn-Ah; Stasko, J

    2012-12-01

    While the formal evaluation of systems in visual analytics is still relatively uncommon, particularly rare are case studies of prolonged system use by domain analysts working with their own data. Conducting case studies can be challenging, but it can be a particularly effective way to examine whether visual analytics systems are truly helping expert users to accomplish their goals. We studied the use of a visual analytics system for sensemaking tasks on documents by six analysts from a variety of domains. We describe their application of the system along with the benefits, issues, and problems that we uncovered. Findings from the studies identify features that visual analytics systems should emphasize as well as missing capabilities that should be addressed. These findings inform design implications for future systems.

  4. Leveraging data rich environments using marketing analytics

    NARCIS (Netherlands)

    Holtrop, Niels

    2017-01-01

    With the onset of what is popularly known as “big data”, increased attention is being paid to creating value from these data rich environments. Within the field of marketing, the analysis of customer and market data supported by models is known as marketing analytics. The goal of these analyses is

  5. Topological data analysis: A promising big data exploration tool in biology, analytical chemistry and physical chemistry.

    Science.gov (United States)

    Offroy, Marc; Duponchel, Ludovic

    2016-03-03

    An important feature of experimental science is that data of various kinds are being produced at an unprecedented rate, mainly due to the development of new instrumental concepts and experimental methodologies. It is also clear that the nature of acquired data is changing significantly. Indeed, in every area of science, data take the form of ever bigger tables, in which all but a few of the columns (i.e. variables) turn out to be irrelevant to the questions of interest, and we do not necessarily know in advance which coordinates are the interesting ones. Big data in the biology, analytical chemistry or physical chemistry laboratory is a future that may be closer than any of us suppose. It is in this sense that new tools have to be developed to explore and exploit such data sets. Topological data analysis (TDA) is one of these. It was developed recently by topologists who discovered that topological concepts could be useful for data analysis. The main objective of this paper is to explain why topology is well suited for the analysis of big data sets in many areas, and can even be more efficient than conventional data analysis methods. Raman analysis of single bacteria provides a good opportunity to demonstrate the potential of TDA for the exploration of various spectroscopic data sets under different experimental conditions (high noise level, with/without spectral preprocessing, wavelength shift, different spectral resolutions, missing data). Copyright © 2016 Elsevier B.V. All rights reserved.
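A minimal, runnable taste of one TDA ingredient (far short of the full machinery the paper discusses) is 0-dimensional persistent homology: tracking when connected components of the data merge as a distance threshold grows. The merge ("death") scales are exactly the edge weights of a minimum spanning tree, so Kruskal's algorithm with a union-find structure computes them directly:

```python
from itertools import combinations

def zero_dim_persistence(points):
    """Death scales of 0-dimensional features (connected components) as the
    distance threshold grows: the minimum-spanning-tree edge weights,
    found with Kruskal's algorithm over sorted pairwise distances."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    edges = sorted(
        (abs(points[i] - points[j]), i, j)
        for i, j in combinations(range(len(points)), 2)
    )
    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:               # two components merge at scale d
            parent[ri] = rj
            deaths.append(d)
    return deaths

# Two well-separated clusters on the line: the long-lived gap shows up
# as one large death scale, robust to how the clusters are sampled.
print(zero_dim_persistence([0.0, 1.0, 2.0, 10.0]))  # → [1.0, 1.0, 8.0]
```

The same component-merging idea underlies the clustering step inside Mapper-style TDA pipelines; points here are 1-D for brevity, but only the distance function would change for spectra.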

  6. A Meta-Analytic Review of School-Based Prevention for Cannabis Use

    Science.gov (United States)

    Porath-Waller, Amy J.; Beasley, Erin; Beirness, Douglas J.

    2010-01-01

    This investigation used meta-analytic techniques to evaluate the effectiveness of school-based prevention programming in reducing cannabis use among youth aged 12 to 19. It summarized the results from 15 studies published in peer-reviewed journals since 1999 and identified features that influenced program effectiveness. The results from the set of…

  7. A Population-Level Data Analytics Portal for Self-Administered Lifestyle and Mental Health Screening.

    Science.gov (United States)

    Zhang, Xindi; Warren, Jim; Corter, Arden; Goodyear-Smith, Felicity

    2016-01-01

    This paper describes development of a prototype data analytics portal for analysis of accumulated screening results from eCHAT (electronic Case-finding and Help Assessment Tool). eCHAT allows individuals to conduct a self-administered lifestyle and mental health screening assessment, with usage to date chiefly in the context of primary care waiting rooms. The intention is for wide roll-out to primary care clinics, including secondary school based clinics, resulting in the accumulation of population-level data. Data from a field trial of eCHAT with sexual health questions tailored to youth were used to support design of a data analytics portal for population-level data. The design process included user personas and scenarios, screen prototyping and a simulator for generating large-scale data sets. The prototype demonstrates the promise of wide-scale self-administered screening data to support a range of users including practice managers, clinical directors and health policy analysts.

  8. On the Performance of Three In-Memory Data Systems for On Line Analytical Processing

    Directory of Open Access Journals (Sweden)

    Ionut HRUBARU

    2017-01-01

    Full Text Available In-memory database systems are among the most recent and most promising Big Data technologies, being developed and released either as brand-new distributed systems or as extensions of older monolithic (centralized) database systems. As the name suggests, in-memory systems cache all the data in special memory structures. Many are part of the NewSQL strand and aim to bridge the gap between OLTP and OLAP in so-called Hybrid Transactional/Analytical Processing (HTAP) systems. This paper aims to test the performance of such systems on TPC-H analytical workloads. Performance is analyzed in terms of data loading, memory footprint and execution time of the TPC-H query set for three in-memory data systems: Oracle, SQL Server and MemSQL. Tests are subsequently deployed on classical on-disk architectures and the results compared to the in-memory solutions. As in-memory is an enterprise-edition feature, associated costs are also considered.

  9. Perfect imaging of three object points with only two analytic lens surfaces in two dimensions

    Science.gov (United States)

    Duerr, Fabian; Benítez, Pablo; Miñano, Juan Carlos; Meuret, Youri; Thienpont, Hugo

    2012-06-01

    In this work, a new two-dimensional analytic optics design method is presented that enables the coupling of three ray sets with two lens profiles. This method is particularly promising for optical systems designed for a wide field of view and with clearly separated optical surfaces. However, this coupling can only be achieved if different ray sets use different portions of the second lens profile. Based on a very basic example of a single thick lens, the Simultaneous Multiple Surfaces design method in two dimensions (SMS2D) helps to provide a better understanding of the practical implications of increased lens thickness and a wider field of view on the design process. Fermat's principle is used to deduce a set of functional differential equations fully describing the entire optical system. The transformation of these functional differential equations into an algebraic linear system of equations allows the successive calculation of the Taylor series coefficients up to an arbitrary order. The evaluation of the solution space reveals the wide range of possible lens configurations covered by this analytic design method. Ray tracing analysis of the calculated 20th-order Taylor polynomials demonstrates excellent performance and the versatility of this new analytical optics design concept.
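The design condition deduced from Fermat's principle can be stated compactly (notation here is a sketch, not the authors'): for each of the three object points $O_k$ to be imaged sharply onto its image point $I_k$, every ray of set $k$ must accumulate the same optical path length,

```latex
\int_{O_k}^{I_k} n \,\mathrm{d}s = \ell_k \quad \text{(constant for each } k = 1,2,3\text{)} .
```

Imposing this for three ray sets on two unknown lens profiles is what yields the functional differential equations whose Taylor coefficients are then solved order by order.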

  10. Approximate Analytic Solutions for the Two-Phase Stefan Problem Using the Adomian Decomposition Method

    Directory of Open Access Journals (Sweden)

    Xiao-Ying Qin

    2014-01-01

    Full Text Available An Adomian decomposition method (ADM) is applied to solve a two-phase Stefan problem that describes the pure metal solidification process. In contrast to traditional analytical methods, ADM avoids complex mathematical derivations and does not require coordinate transformation for elimination of the unknown moving boundary. Based on polynomial approximations for some known and unknown boundary functions, approximate analytic solutions for the model with undetermined coefficients are obtained using ADM. Substitution of these expressions into other equations and boundary conditions of the model generates some function identities with the undetermined coefficients. By determining these coefficients, approximate analytic solutions for the model are obtained. A concrete example of the solution shows that this method can easily be implemented in MATLAB and has a fast convergence rate. This is an efficient method for finding approximate analytic solutions for the Stefan and the inverse Stefan problems.
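The flavor of the decomposition can be shown on a toy linear problem, far simpler than the two-phase Stefan problem (and needing no Adomian polynomials, since there is no nonlinear term): for u' = u with u(0) = 1, the zeroth component is the initial condition and each subsequent component is the integral of the previous one, so the components are t^n / n! and the partial sums converge to exp(t):

```python
import math

def adm_exponential(t, n_terms=10):
    """Adomian decomposition for u' = u, u(0) = 1.

    u0 = 1 from the initial condition; each next component is the
    integral from 0 to t of the previous one, i.e. u_n = t**n / n!.
    The partial sum of the series converges rapidly to exp(t)."""
    u_n = 1.0          # u0
    total = u_n
    for n in range(1, n_terms):
        u_n = u_n * t / n   # integrating t**(n-1)/(n-1)! from 0 to t
        total += u_n
    return total

print(adm_exponential(1.0))  # partial sum, close to e = 2.71828...
```

The fast convergence mentioned in the abstract is visible here: ten terms already match exp(1) to about six decimal places.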

  11. Modeling of Coaxial Slot Waveguides Using Analytical and Numerical Approaches: Revisited

    Directory of Open Access Journals (Sweden)

    Kok Yeow You

    2012-01-01

    Full Text Available We review analytical and numerical methods for coaxial slot waveguides. The theories, background, and physical principles related to frequency-domain electromagnetic equations for coaxial waveguides are reassessed. Comparisons of the accuracies of various types of admittance and impedance equations and numerical simulations are made, and the fringing field at the aperture sensor, represented by a lumped-capacitance circuit, is evaluated. The accuracy and limitations of the analytical equations are explained in detail, and the reasons for the replacement of analytical methods by numerical methods are outlined.

  12. The BTWorld use case for big data analytics : Description, MapReduce logical workflow, and empirical evaluation

    NARCIS (Netherlands)

    Hegeman, T.; Ghit, B.; Capota, M.; Hidders, A.J.H.; Epema, D.H.J.; Iosup, A.

    2013-01-01

    The commoditization of big data analytics, that is, the deployment, tuning, and future development of big data processing platforms such as MapReduce, relies on a thorough understanding of relevant use cases and workloads. In this work we propose BTWorld, a use case for time-based big data analytics
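The logical workflow named in the title (map each record to key-value pairs, shuffle by key, reduce per key) can be sketched in a few lines of plain Python. The tracker-count workload below is invented for illustration and is not the BTWorld dataset:

```python
from collections import defaultdict
from itertools import chain

def map_reduce(records, mapper, reducer):
    """Minimal in-memory model of the MapReduce logical workflow:
    map each record to (key, value) pairs, shuffle/group by key,
    then reduce each key's value list to a single result."""
    groups = defaultdict(list)
    for key, value in chain.from_iterable(mapper(r) for r in records):
        groups[key].append(value)                 # shuffle / group by key
    return {key: reducer(key, values) for key, values in groups.items()}

# Toy workload: count observations per tracker (hypothetical names).
logs = ["trackerA", "trackerB", "trackerA", "trackerA"]
counts = map_reduce(logs,
                    mapper=lambda rec: [(rec, 1)],
                    reducer=lambda key, vals: sum(vals))
print(counts)  # → {'trackerA': 3, 'trackerB': 1}
```

A real MapReduce platform distributes exactly these three phases across machines; the in-memory model is useful only for reasoning about the workflow itself.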

  13. From Theory Use to Theory Building in Learning Analytics: A Commentary on "Learning Analytics to Support Teachers during Synchronous CSCL"

    Science.gov (United States)

    Chen, Bodong

    2015-01-01

    In this commentary on Van Leeuwen (2015, this issue), I explore the relation between theory and practice in learning analytics. Specifically, I caution against adhering to one specific theoretical doctrine while ignoring others, suggest deeper applications of cognitive load theory to understanding teaching with analytics tools, and comment on…

  14. Analytical review of modern herbal medicines used in musculoskeletal system diseases

    Directory of Open Access Journals (Sweden)

    Анна Ігорівна Крюкова

    2015-10-01

    Full Text Available Effective and safe treatment of musculoskeletal system diseases is one of the main branches of medicine in general and of rheumatology in particular. The relevance of this problem stems mainly from the high incidence of these diseases in the population and from the temporary and permanent work disability they cause in patients. The chronic course of rheumatologic diseases necessitates selecting an optimal treatment regimen that provides effective therapy and helps to prevent the potential side effects associated with long-term use of remedies. Aim of research. The aim of our research was to perform an analytical review of the modern herbal products registered in Ukraine and used to treat the musculoskeletal system. The drugs were analyzed by the following parameters: producing country, manufacturer, dosage form, and origin of the remedy (natural or synthetic). Methods. Conventional analytical studies of electronic and paper sources were used to address the given problem. Results. The analytical review of the modern herbal remedies registered in Ukraine and used for musculoskeletal system treatment found that 20 trade names of drugs, more than 90% of which are homeopathic, are present on the pharmaceutical market. Concerning dosage forms, pills (38.5%), injection solutions and oral drops (23.1% and 11.5%, respectively) hold the biggest market share. Conclusion. It was found that imported drugs are widely available (80%) in the analyzed market segment, while local remedies hold a rather minor market share (about 20%). Among the medicines of this group presented on the Ukrainian market, imported homeopathic remedies hold the biggest share. Phytotherapeutic drugs hold a minor market share and have a limited composition of natural active ingredients, represented by extracts of Harpagophytum procumbens, Apium graveolens, Salix alba, and Zingiber officinale.

  15. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention

    Directory of Open Access Journals (Sweden)

    Samar Al-Hajj

    2017-09-01

    Full Text Available Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology—group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders’ observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of ‘common ground’ among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders’ verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ‘common ground’ among diverse stakeholders about health data and their implications.

  16. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.

    Science.gov (United States)

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-09-12

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  17. Analytic treatment of nonlinear evolution equations using first ...

    Indian Academy of Sciences (India)

    — journal of physics, July 2012, pp. 3–17. Eskisehir Osmangazi University, Art-Science Faculty, Department of Mathematics. ... Equation (2.2) is integrated, with the integration constants taken to be zero.

  18. No Impact of the Analytical Method Used for Determining Cystatin C on Estimating Glomerular Filtration Rate in Children.

    Science.gov (United States)

    Alberer, Martin; Hoefele, Julia; Benz, Marcus R; Bökenkamp, Arend; Weber, Lutz T

    2017-01-01

    Measurement of inulin clearance is considered the gold standard for determining kidney function in children, but this method is time-consuming and expensive. The glomerular filtration rate (GFR) is, on the other hand, easier to calculate using various creatinine- and/or cystatin C (Cys C)-based formulas. However, different and non-interchangeable analytical methods exist for the determination of serum creatinine (Scr) and Cys C. Given that different analytical methods were used to validate the existing GFR formulas, clinicians should be aware of the type used in their local laboratory. In this study, we compared GFR results calculated on the basis of different GFR formulas, using Scr and Cys C values determined either by the analytical method originally employed for validation or by an alternative analytical method, to evaluate any possible effects on performance. Cys C values determined by means of an immunoturbidimetric assay were used for calculating the GFR with equations in which this analytical method had originally been used for validation. These same values were then also used in other GFR formulas that had originally been validated using a nephelometric immunoassay for determining Cys C. The effect of using either the compatible or the possibly incompatible analytical method for determining Cys C in the calculation of GFR was assessed in comparison with the GFR measured by creatinine clearance (CrCl). Unexpectedly, using GFR equations that employed Cys C values derived from a possibly incompatible analytical method did not result in a significant difference in the classification of patients as having normal or reduced GFR compared to the classification obtained on the basis of CrCl. Sensitivity and specificity were adequate. On the other hand, formulas using Cys C values derived from a compatible analytical method partly showed insufficient

  19. Post-analytical stability of 23 common chemistry and immunochemistry analytes in incurred samples

    DEFF Research Database (Denmark)

    Nielsen, Betina Klint; Frederiksen, Tina; Friis-Hansen, Lennart

    2017-01-01

    BACKGROUND: Storage of blood samples after centrifugation, decapping and initial sampling allows ordering of additional blood tests. The pre-analytical stability of biochemistry and immunochemistry analytes has been studied in detail, but little is known about the post-analytical stability...... in incurred samples. METHODS: We examined the stability of 23 routine analytes on the Dimension Vista® (Siemens Healthineers, Denmark): 42-60 routine samples in lithium-heparin gel tubes (Vacutainer, BD, USA) were centrifuged at 3000×g for 10min. Immediately after centrifugation, initial concentrations...... of analytes were measured in duplicate (t=0). The tubes were stored decapped at room temperature and re-analyzed after 2, 4, 6, 8 and 10h in singletons. The concentrations from reanalysis were normalized to the initial concentration (t=0). Internal acceptance criteria for bias and total error were used...

  20. PROMISE: parallel-imaging and compressed-sensing reconstruction of multicontrast imaging using SharablE information.

    Science.gov (United States)

    Gong, Enhao; Huang, Feng; Ying, Kui; Wu, Wenchuan; Wang, Shi; Yuan, Chun

    2015-02-01

    A typical clinical MR examination includes multiple scans to acquire images with different contrasts for complementary diagnostic information. This multicontrast scheme requires long scanning times. The combination of partially parallel imaging and compressed sensing (CS-PPI) has been used to reconstruct accelerated scans; however, several problems in existing methods remain unsolved. The target of this work is to improve existing CS-PPI methods for multicontrast imaging, especially for two-dimensional imaging. If the same field of view is scanned in multicontrast imaging, there is a significant amount of sharable information. It is proposed in this study to use such sharable information among multicontrast images to enhance CS-PPI in a sequential way. Coil sensitivity information and structure-based adaptive regularization, extracted from previously reconstructed images, were applied to enhance the following reconstructions. The proposed method is called Parallel-imaging and compressed-sensing Reconstruction Of Multicontrast Imaging using SharablE information (PROMISE). Using L1-SPIRiT as a CS-PPI example, results on multicontrast brain and carotid scans demonstrated that a lower error level and better detail preservation can be achieved by exploiting the sharable information. Moreover, the advantage of PROMISE persists in the presence of interscan motion. Using the sharable information among multicontrast images can thus enhance CS-PPI with tolerance to motion. © 2014 Wiley Periodicals, Inc.

  1. Pre-analytical and analytical factors influencing Alzheimer's disease cerebrospinal fluid biomarker variability.

    Science.gov (United States)

    Fourier, Anthony; Portelius, Erik; Zetterberg, Henrik; Blennow, Kaj; Quadrio, Isabelle; Perret-Liaudet, Armand

    2015-09-20

    A panel of cerebrospinal fluid (CSF) biomarkers, including total Tau (t-Tau), Tau phosphorylated at residue 181 (p-Tau) and β-amyloid peptides (Aβ42 and Aβ40), is frequently used as an aid in Alzheimer's disease (AD) diagnosis for young patients with cognitive impairment, for predicting prodromal AD in mild cognitive impairment (MCI) subjects, for AD discrimination in atypical clinical phenotypes, and for inclusion/exclusion and stratification of patients in clinical trials. Due to variability in absolute levels between laboratories, there is no consensus on a medical cut-off value for the CSF AD signature. Thus, for full implementation of this core AD biomarker panel in clinical routine, this issue has to be solved. The variability can be explained by both pre-analytical and analytical factors. For example, the plastic tubes used for CSF collection and storage, the lack of reference material, and the variability of the analytical protocols have been identified as important sources of variability. The aim of this review is to highlight these pre-analytical and analytical factors and to describe the efforts made to counteract them in order to establish cut-off values for the core CSF AD biomarkers. This review gives the current state of recommendations. Copyright © 2015. Published by Elsevier B.V.

  2. Determination of Lineaments of the Sea of Marmara using Normalized Derivatives and Analytic Signals

    International Nuclear Information System (INIS)

    Oruc, B.

    2007-01-01

    The normalized derivatives and analytic signals calculated from a magnetic anomaly map provide useful results for structural interpretation. The effectiveness of the methods in resolving lineaments has been tested on the edges of a thin-plate model. For the field data, a magnetic anomaly map observed in the middle section of the Sea of Marmara has been used. Approximate solutions have been obtained for the lineaments of the area related to the North Anatolian Fault from the characteristic images of the normalized derivatives and horizontal-derivative analytic signals.
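A minimal sketch of the horizontal-derivative idea behind these edge-detection methods (a synthetic profile, not the Marmara data): the derivative amplitude of a smoothed-step anomaly peaks directly over the contact, which is how lineament edges are picked from the characteristic images:

```python
import numpy as np

# Synthetic profile: a smoothed step anomaly centred at x = 0, loosely
# mimicking the magnetic signature of a vertical contact (illustrative).
x = np.linspace(-10.0, 10.0, 401)      # distance along profile, km
T = 100.0 * np.tanh(x / 2.0)           # anomaly, nT

# Horizontal-derivative amplitude; its maximum marks the edge location.
dT_dx = np.gradient(T, x)
edge = x[np.argmax(np.abs(dT_dx))]
print(edge)  # peaks over the contact, near x = 0
```

The full analytic-signal methods combine this horizontal derivative with a vertical derivative (via a Hilbert-type transform) and normalize the result, but the edge-locating principle is the same.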

  3. Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.

    Science.gov (United States)

    Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís

    2016-01-08

    In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Analytics: What We're Hearing

    Science.gov (United States)

    Oblinger, Diana

    2012-01-01

    Over the last few months, EDUCAUSE has been focusing on analytics. As people hear from experts, meet with association members, and watch the marketplace evolve, a number of common themes are emerging. Conversations have shifted from "What is analytics?" to "How do we get started, and how do we use analytics well?" What people are hearing from…

  5. Behavioural effects of advanced cruise control use : a meta-analytic approach.

    NARCIS (Netherlands)

    Dragutinovic, N.; Brookhuis, K.A.; Hagenzieker, M.P.; Marchau, V.A.W.J.

    2006-01-01

    In this study, a meta-analytic approach was used to analyse effects of Advanced Cruise Control (ACC) on driving behaviour reported in seven driving simulator studies. The effects of ACC on three consistent outcome measures, namely, driving speed, headway and driver workload have been analysed. The

  6. The use of different analytical techniques as a backup to mineral resources assessment

    International Nuclear Information System (INIS)

    Carvalho Tofani, P. de; Ferreira, M.P.; Gomes, H.; Avelar, M.M.

    1982-01-01

    Empresas Nucleares Brasileiras S.A. (NUCLEBRAS) has implemented and improved, since its foundation in 1974, several laboratories at the Centro de Desenvolvimento da Tecnologia Nuclear (CDTN) in Belo Horizonte (MG, Brazil) in order to develop capabilities in the field of analytical chemistry. Skilled personnel, using a large spectrum of equipment and procedures, can already determine, quickly and accurately, almost any chemical element in any matrix. About 340,000 analytical determinations have been performed during the last seven years, concerning mostly chemical elements of great importance in the mineral technology programs. This considerable amount of results has been used especially as a backup to assess Brazilian uranium resources. (Author) [pt

  7. Using learning analytics to evaluate a video-based lecture series.

    Science.gov (United States)

    Lau, K H Vincent; Farooque, Pue; Leydon, Gary; Schwartz, Michael L; Sadler, R Mark; Moeller, Jeremy J

    2018-01-01

    The video-based lecture (VBL), an important component of the flipped classroom (FC) and massive open online course (MOOC) approaches to medical education, has primarily been evaluated through direct learner feedback. Evaluation may be enhanced through learner analytics (LA), the analysis of quantitative audience usage data generated by video-sharing platforms. We applied LA to an experimental series of ten VBLs on electroencephalography (EEG) interpretation, uploaded to YouTube in the model of a publicly accessible MOOC. Trends in view count, total percentage of video viewed, and audience retention (AR; the percentage of viewers watching at a time point compared to the initial total) were examined. The pattern of average AR decline was characterized using regression analysis, revealing a uniform linear decline in viewership for each video, with no evidence of an optimal VBL length. Segments with transient increases in AR corresponded to those focused on core concepts, indicative of content requiring more detailed evaluation. We propose a model for applying LA at four levels: global, series, video, and feedback. LA may be a useful tool in evaluating a VBL series. Our proposed model combines analytics data and learner self-report for comprehensive evaluation.
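The linear decline in audience retention can be characterized with closed-form ordinary least squares, as a sketch; the retention figures below are invented for illustration, not the study's data:

```python
# Minutes into the video vs. audience retention (% of initial viewers);
# hypothetical values showing a roughly linear decline.
minutes   = [0, 2, 4, 6, 8, 10]
retention = [100, 91, 83, 74, 66, 57]

# Closed-form simple linear regression: slope = cov(x, y) / var(x).
n = len(minutes)
mean_x = sum(minutes) / n
mean_y = sum(retention) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(minutes, retention))
         / sum((x - mean_x) ** 2 for x in minutes))
intercept = mean_y - slope * mean_x
print(round(slope, 2), round(intercept, 2))  # → -4.27 99.86
```

The fitted slope (percentage points of viewership lost per minute) is the single number the study's "uniform linear decline" finding summarizes per video.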

  8. Signals: Applying Academic Analytics

    Science.gov (United States)

    Arnold, Kimberly E.

    2010-01-01

    Academic analytics helps address the public's desire for institutional accountability with regard to student success, given the widespread concern over the cost of higher education and the difficult economic and budgetary conditions prevailing worldwide. Purdue University's Signals project applies the principles of analytics widely used in…

  9. Analytical chemistry instrumentation

    International Nuclear Information System (INIS)

    Laing, W.R.

    1986-01-01

    In nine sections, 48 chapters cover 1) analytical chemistry and the environment 2) environmental radiochemistry 3) automated instrumentation 4) advances in analytical mass spectrometry 5) fourier transform spectroscopy 6) analytical chemistry of plutonium 7) nuclear analytical chemistry 8) chemometrics and 9) nuclear fuel technology

  10. Analytics for managers with Excel

    CERN Document Server

    Bell, Peter C

    2013-01-01

    Analytics is one of a number of terms used to describe a data-driven, more scientific approach to management. Ability in analytics is an essential management skill: knowledge of data and analytics helps the manager to analyze decision situations, prevent problem situations from arising, and identify new opportunities, and often enables many millions of dollars to be added to the bottom line for the organization. The objective of this book is to introduce analytics from the perspective of the general manager of a corporation. Rather than examine the details or attempt an encyclopaedic revie

  11. Analytical solutions for Dirac and Klein-Gordon equations using Backlund transformations

    Energy Technology Data Exchange (ETDEWEB)

    Zabadal, Jorge R.; Borges, Volnei, E-mail: jorge.zabadal@ufrgs.br, E-mail: borges@ufrgs.br [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Dept. de Engenharia Mecanica; Ribeiro, Vinicius G., E-mail: vinicius_ribeiro@uniritter.edu.br [Centro Universitario Ritter dos Reis (UNIRITTER), Porto Alegre, RS (Brazil); Santos, Marcio, E-mail: marciophd@gmail.com [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Centro de Estudos Interdisciplinares

    2015-07-01

    This work presents a new analytical method for solving Klein-Gordon type equations via Backlund transformations. The method consists in mapping the Klein-Gordon model into a first order system of partial differential equations, which contains a generalized velocity field instead of the Dirac matrices. This system is a tensor model for quantum field theory whose solution space is wider than that of the Dirac model in its original form. Thus, after finding analytical expressions for the wave functions, the Maxwell field can be readily obtained from the Dirac equations, furnishing a self-consistent field solution for the Maxwell-Dirac system. Analytical and numerical results are reported. (author)

  12. THz spectroscopy: An emerging technology for pharmaceutical development and pharmaceutical Process Analytical Technology (PAT) applications

    Science.gov (United States)

    Wu, Huiquan; Khan, Mansoor

    2012-08-01

    As an emerging technology, THz spectroscopy has gained increasing attention in the pharmaceutical area during the last decade. This attention is due to the facts that (1) it provides a promising alternative approach for in-depth understanding of both intermolecular interactions among pharmaceutical molecules and pharmaceutical product quality attributes; (2) it provides a promising alternative approach for enhanced process understanding of certain pharmaceutical manufacturing processes; and (3) it supports the FDA pharmaceutical quality initiatives, most notably the Process Analytical Technology (PAT) initiative. In this work, the current status and progress made so far in using THz spectroscopy for pharmaceutical development and pharmaceutical PAT applications are reviewed. In the spirit of demonstrating the utility of a first-principles modeling approach for addressing the model validation challenge and reducing unnecessary model validation "burden" in THz pharmaceutical PAT applications, two scientific case studies based on published THz spectroscopy measurement results are created and discussed. Furthermore, other technical challenges and opportunities associated with adopting THz spectroscopy as a pharmaceutical PAT tool are highlighted.

  13. Pre-analytical and analytical aspects affecting clinical reliability of plasma glucose results.

    Science.gov (United States)

    Pasqualetti, Sara; Braga, Federica; Panteghini, Mauro

    2017-07-01

    The measurement of plasma glucose (PG) plays a central role in recognizing disturbances in carbohydrate metabolism, with established decision limits that are globally accepted. This requires that PG results are reliable and unequivocally valid no matter where they are obtained. To control the pre-analytical variability of PG and prevent in vitro glycolysis, the use of citrate as rapidly effective glycolysis inhibitor has been proposed. However, the commercial availability of several tubes with studies showing different performance has created confusion among users. Moreover, and more importantly, studies have shown that tubes promptly inhibiting glycolysis give PG results that are significantly higher than tubes containing sodium fluoride only, used in the majority of studies generating the current PG cut-points, with a different clinical classification of subjects. From the analytical point of view, to be equivalent among different measuring systems, PG results should be traceable to a recognized higher-order reference via the implementation of an unbroken metrological hierarchy. In doing this, it is important that manufacturers of measuring systems consider the uncertainty accumulated through the different steps of the selected traceability chain. In particular, PG results should fulfil analytical performance specifications defined to fit the intended clinical application. Since PG has tight homeostatic control, its biological variability may be used to define these limits. Alternatively, given the central diagnostic role of the analyte, an outcome model showing the impact of analytical performance of test on clinical classifications of subjects can be used. Using these specifications, performance assessment studies employing commutable control materials with values assigned by reference procedure have shown that the quality of PG measurements is often far from desirable and that problems are exacerbated using point-of-care devices. 
Copyright © 2017 The Canadian
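
    Biological variation offers a concrete route to such analytical performance specifications. A minimal sketch, assuming the widely used "desirable" specification formulas derived from biological variation and approximate CV values for glucose (both the formulas' use here and the numbers are illustrative assumptions, not taken from this record):

```python
from math import sqrt

def desirable_specs(cv_within, cv_between):
    """Desirable-tier limits from biological variation (percent CVs):
    imprecision <= 0.5 * CV_I, bias <= 0.25 * sqrt(CV_I^2 + CV_G^2)."""
    max_imprecision = 0.5 * cv_within
    max_bias = 0.25 * sqrt(cv_within**2 + cv_between**2)
    return max_imprecision, max_bias

# Approximate within-subject (CV_I) and between-subject (CV_G)
# biological variation for plasma glucose, in percent (assumed values).
CV_I, CV_G = 5.6, 7.5
imp, bias = desirable_specs(CV_I, CV_G)
print(f"imprecision <= {imp:.1f}%, bias <= {bias:.1f}%")
```

    A measuring system whose observed imprecision and bias exceed these limits would not fit the intended clinical application under this model.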

  14. Deriving Earth Science Data Analytics Requirements

    Science.gov (United States)

    Kempler, Steven J.

    2015-01-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data sets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of requirements for data analytics tools and techniques that would support specific ESDA type goals. Representative existing data analytics tools and techniques relevant to ESDA will also be addressed.

  15. Consistency of FMEA used in the validation of analytical procedures.

    Science.gov (United States)

    Oldenhof, M T; van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Vredenbregt, M J; Weda, M; Barends, D M

    2011-02-20

    In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection-Mass Spectrometry (HPLC-DAD-MS) analytical procedure used in the quality control of medicines. Each team was free to define its own ranking scales for the severity (S), occurrence (O), and detection (D) of failure modes. We calculated Risk Priority Numbers (RPNs) and identified failure modes above the 90th percentile of RPN values as needing urgent corrective action, and failure modes falling between the 75th and 90th percentiles as needing necessary corrective action. Team 1 and Team 2 identified five and six failure modes needing urgent corrective action respectively, with two being commonly identified. Of the failure modes needing necessary corrective action, about a third were commonly identified by both teams. These results show inconsistency in the outcome of the FMEA. To improve consistency, we recommend that an FMEA is always carried out under the supervision of an experienced FMEA facilitator and that the FMEA team has at least two members competent in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating that this inconsistency is not always a drawback. Copyright © 2010 Elsevier B.V. All rights reserved.
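
    The percentile-based tiering described above can be sketched as follows. The failure modes, scores, and the linear-interpolation percentile are invented for illustration; they are not the study's data or exact procedure:

```python
def percentile(sorted_values, p):
    """Linear-interpolation percentile of an ascending list, p in [0, 1]."""
    k = p * (len(sorted_values) - 1)
    f = int(k)
    c = min(f + 1, len(sorted_values) - 1)
    return sorted_values[f] + (sorted_values[c] - sorted_values[f]) * (k - f)

def rpn_tiers(modes):
    """modes: dict name -> (S, O, D). RPN = S * O * D.
    Returns (urgent, necessary): modes above the 90th percentile,
    and modes between the 75th and 90th percentiles, respectively."""
    rpns = {name: s * o * d for name, (s, o, d) in modes.items()}
    ordered = sorted(rpns.values())
    p90 = percentile(ordered, 0.90)
    p75 = percentile(ordered, 0.75)
    urgent = [n for n, r in rpns.items() if r > p90]
    necessary = [n for n, r in rpns.items() if p75 < r <= p90]
    return urgent, necessary
```

    Because the thresholds are relative percentiles of each team's own RPN distribution, two teams with different ranking scales can still produce comparable tier counts, which is exactly what the study exploits.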

  16. Determining passive cooling limits in CPV using an analytical thermal model

    Science.gov (United States)

    Gualdi, Federico; Arenas, Osvaldo; Vossier, Alexis; Dollet, Alain; Aimez, Vincent; Arès, Richard

    2013-09-01

    We propose an original analytical thermal model aiming to predict the practical limits of passive cooling systems for high-concentration photovoltaic modules. The analytical model is described and validated by comparison with a commercial 3D finite element model. The limiting performance of flat-plate cooling systems in natural convection is then derived and discussed.
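
    The kind of closed-form estimate such a model builds on can be illustrated with a textbook natural-convection correlation for a flat plate. The correlation constant and air properties below are standard approximations assumed purely for illustration; they are not the authors' model:

```python
def plate_power(area_m2, dT, L_char):
    """Rough power (W) dissipated by a hot horizontal plate in natural
    convection, for temperature rise dT (K) and characteristic length
    L_char (m). Air properties approximated near 40 C (assumed values)."""
    g, beta, nu, alpha, k = 9.81, 3.2e-3, 1.6e-5, 2.2e-5, 0.026
    Ra = g * beta * dT * L_char**3 / (nu * alpha)   # Rayleigh number
    Nu = 0.54 * Ra**0.25        # laminar correlation, hot face up
    h = Nu * k / L_char         # convection coefficient, W/(m^2 K)
    return h * area_m2 * dT
```

    For a 30 cm x 30 cm plate at 40 K above ambient this yields on the order of tens of watts, which is why passively cooled CPV receivers need large spreader areas relative to the cell.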

  17. Detection of sensor failures in nuclear plants using analytic redundancy

    International Nuclear Information System (INIS)

    Kitamura, M.

    1980-01-01

    A method for on-line, nonperturbative detection and identification of sensor failures in nuclear power plants was studied to determine its feasibility. This method is called analytic redundancy, or functional redundancy. Sensor failure has traditionally been detected by comparing multiple signals from redundant sensors, such as in two-out-of-three logic. In analytic redundancy, with the help of an assumed model of the physical system, the signals from a set of sensors are processed to reproduce the signals from all system sensors.
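
    A minimal sketch of the idea: one sensor's reading is compared with an estimate reconstructed from other sensors through a physical model, and a large residual flags a failure. The energy-balance model, parameter values, and threshold below are invented for illustration, not taken from the study:

```python
def estimate_outlet_T(inlet_T, power_MW, flow_kg_s, cp=5.0e-3):
    """Model-based estimate of outlet temperature (C) from an energy
    balance: T_out = T_in + P / (m_dot * cp). cp in MJ/(kg K), assumed."""
    return inlet_T + power_MW / (flow_kg_s * cp)

def detect_failure(measured, model_estimate, threshold=3.0):
    """Flag a sensor whose residual exceeds the (assumed) threshold."""
    residual = abs(measured - model_estimate)
    return residual > threshold
```

    Unlike two-out-of-three voting, this needs no duplicate hardware: the redundant "sensor" is analytic, supplied by the model.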

  18. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals

    DEFF Research Database (Denmark)

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G

    2018-01-01

    for the combination of analytical bias and imprecision and Method 2 is based on the Microsoft Excel formula NORMINV including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision...... are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results Method 2 gives the correct results with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion The Microsoft Excel formula NORMINV is useful for the estimation...
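
    The Gaussian calculation performed in Excel with NORMINV can be sketched directly: the fraction of reference individuals falling outside common reference limits (set at mean ± 1.96 SD) when a normalized analytical bias b and imprecision s, both in units of the biological standard deviation, are added. This is a generic illustration of the model, not the paper's exact worksheet:

```python
from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def fraction_outside(b, s, z=1.96):
    """Fraction of a reference population outside +/- z limits when a
    normalized bias b and analytical imprecision s are superimposed."""
    total_sd = sqrt(1.0 + s * s)
    return phi((-z - b) / total_sd) + 1.0 - phi((z - b) / total_sd)
```

    With no analytical error the fraction is the nominal 5% (2.5% in each tail); any bias or added imprecision increases it, which is what the combined specifications constrain.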

  19. An analytical method for calculating stresses and strains of ATF cladding based on thick walled theory

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong Hyun; Kim, Hak Sung [Hanyang University, Seoul (Korea, Republic of); Kim, Hyo Chan; Yang, Yong Sik; In, Wang kee [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    In this paper, an analytical method based on thick-walled theory has been studied to calculate the stress and strain of ATF cladding. To prescribe the boundary conditions of the analytical method, two algorithms were employed, the subroutines 'Cladf' and 'Couple' of FRACAS. To evaluate the developed method, an equivalent model using the finite element method was established and the stress components of the method were compared with those of the equivalent FE model. One promising ATF concept is the coated cladding, which offers advantages such as a high melting point, a high neutron economy, and a low tritium permeation rate. To evaluate the mechanical behavior and performance of the coated cladding, a dedicated model is needed to simulate ATF behavior in the reactor. In particular, a model for the simulation of stress and strain in the coated cladding should be developed, because the previous model, FRACAS, treats the cladding as one body. The FRACAS module employs an analytical method based on thin-walled theory. According to thin-walled theory, the radial stress is defined as zero, but this assumption is not suitable for ATF cladding because the radial stress is not negligible in that case. Recently, a structural model for multi-layered ceramic cylinders based on thick-walled theory was developed. Also, FE-based numerical simulations such as BISON have been developed to evaluate fuel performance. An analytical method that calculates the stress components of ATF cladding was developed in this study. Thick-walled theory was used to derive equations for calculating stress and strain. To solve these equations, boundary and loading conditions were obtained by the subroutines 'Cladf' and 'Couple' and applied to the analytical method. To evaluate the developed method, an equivalent FE model was established and its results were compared to those of the analytical model. Based on the
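
    The classical thick-walled (Lamé) solution that distinguishes this approach from a thin-walled treatment can be sketched as follows; the radii and pressures in the example are illustrative, not the paper's values:

```python
def lame_stresses(r, a, b, p_in, p_out):
    """Lame thick-walled cylinder: radial and hoop stress (Pa) at radius r,
    for inner/outer radii a, b and internal/external pressures p_in, p_out.
    Compressive stresses are negative."""
    A = (p_in * a**2 - p_out * b**2) / (b**2 - a**2)
    B = (p_in - p_out) * a**2 * b**2 / (b**2 - a**2)
    sigma_r = A - B / r**2       # radial stress; -p_in at r=a, -p_out at r=b
    sigma_theta = A + B / r**2   # hoop stress
    return sigma_r, sigma_theta
```

    Note that the radial stress varies from -p_in at the inner surface to -p_out at the outer surface instead of being identically zero, which is exactly the term the thin-walled FRACAS treatment discards.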

  20. Understanding Customer Product Choices: A Case Study Using the Analytical Hierarchy Process

    Science.gov (United States)

    Robert L. Smith; Robert J. Bush; Daniel L. Schmoldt

    1996-01-01

    The Analytical Hierarchy Process (AHP) was used to characterize the bridge material selection decisions of highway officials across the United States. Understanding product choices by utilizing the AHP allowed us to develop strategies for increasing the use of timber in bridge construction. State Department of Transportation engineers, private consulting engineers, and...

  1. Croatian Analytical Terminology

    Directory of Open Access Journals (Sweden)

    Kastelan-Macan; M.

    2008-04-01

    oljić, I. Eškinja, M. Kaštelan-Macan, I. Piljac, Š. Cerjan-Stefanović and others translated the Chromatographic nomenclature (IUPAC Compendium of Analytical Nomenclature). The related area is covered by the books of V. Grdinić and F. Plavšić. During the project Croatian nomenclature of analytical chemistry, there shall be an analysis of dictionaries, textbooks, handbooks, professional and scientific monographs and articles, official governmental and economic publications, regulations and instructions. The Compendium of Analytical Nomenclature is expected to have been translated and the translation mostly adjusted to the Croatian language standard. EUROLAB and EURACHEM documents related to quality assurance in analytical laboratories, especially in research and development, have not yet been included in the Compendium, and due to the globalization of the information and service market, such documents need to be adjusted to the Croatian language standard in collaboration with consultants from the Institute for Croatian Language and Linguistics. The terms shall be sorted according to the analytical process, from sampling to final information. It is expected that the project's results shall be adopted by the Croatian scientific and professional community, so as to raise awareness of the necessity of using Croatian terms in everyday professional communication and particularly in scientific and educational work. The Croatian language is rich enough for all analytical terms to be translated appropriately. This shall complete the work our predecessors began several times. We face the great challenge of contributing to the creation of Croatian scientific terminology and believe we shall succeed.

  2. Advanced, Analytic, Automated (AAA) Measurement of Engagement During Learning.

    Science.gov (United States)

    D'Mello, Sidney; Dieterle, Ed; Duckworth, Angela

    2017-01-01

    It is generally acknowledged that engagement plays a critical role in learning. Unfortunately, the study of engagement has been stymied by a lack of valid and efficient measures. We introduce the advanced, analytic, and automated (AAA) approach to measure engagement at fine-grained temporal resolutions. The AAA measurement approach is grounded in embodied theories of cognition and affect, which advocate a close coupling between thought and action. It uses machine-learned computational models to automatically infer mental states associated with engagement (e.g., interest, flow) from machine-readable behavioral and physiological signals (e.g., facial expressions, eye tracking, click-stream data) and from aspects of the environmental context. We present 15 case studies that illustrate the potential of the AAA approach for measuring engagement in digital learning environments. We discuss strengths and weaknesses of the AAA approach, concluding that it has significant promise to catalyze engagement research.

  3. Multicriteria decision analysis in ranking of analytical procedures for aldrin determination in water.

    Science.gov (United States)

    Tobiszewski, Marek; Orłowski, Aleksander

    2015-03-27

    The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. A type of MCDA, Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets all the requirements of analytical chemists as decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives to the MCDA analysis. Nine different criteria describing the alternatives were chosen from different groups: metrological, economical and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid-phase microextraction and liquid-phase microextraction based procedures high, while liquid-liquid extraction, solid-phase extraction and stir bar sorptive extraction based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choice is in accordance with green chemistry principles. The PROMETHEE ranking results were compared with more widely accepted green analytical chemistry tools, NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. Conversely, the results of the Eco-Scale assessment were well correlated, as both methodologies involved similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
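
    A minimal PROMETHEE-style net-flow ranking can be sketched with the "usual" preference function (every criterion maximized); the alternatives, scores and weights below are invented, not the paper's 25 procedures or nine criteria:

```python
def promethee_net_flows(scores, weights):
    """scores: one list of criterion scores per alternative (higher = better);
    weights: one weight per criterion. Returns net outranking flows."""
    n = len(scores)
    flows = [0.0] * n
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            # "usual" preference function: full weight whenever i beats j
            pi_ij = sum(w for s_i, s_j, w in zip(scores[i], scores[j], weights)
                        if s_i > s_j)
            pi_ji = sum(w for s_i, s_j, w in zip(scores[i], scores[j], weights)
                        if s_j > s_i)
            flows[i] += (pi_ij - pi_ji) / (n - 1)
    return flows
```

    Alternatives are then ranked by decreasing net flow; a full PROMETHEE implementation would additionally allow per-criterion preference functions with indifference and preference thresholds.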

  4. Navigating the Benford Labyrinth: A big-data analytic protocol illustrated using the academic library context

    Directory of Open Access Journals (Sweden)

    Michael Halperin

    2016-03-01

    Objective: Big Data Analytics is a panoply of techniques the principal intention of which is to ferret out dimensions or factors from certain data streamed or available over the WWW. We offer a subset or "second" stage protocol of Big Data Analytics (BDA) that uses these dimensional datasets as benchmarks for profiling related data. We call this Specific Context Benchmarking (SCB). Method: In effecting this benchmarking objective, we have elected to use a Digital Frequency Profiling (DFP) technique based upon the work of Newcomb and Benford, who developed a profiling benchmark based upon the Log10 function. We illustrate the various stages of the SCB protocol using the data produced by the Academic Research Libraries to enhance insights regarding the details of the operational benchmarking context and so offer generalizations needed to encourage adoption of SCB across other functional domains. Results: An illustration of the SCB protocol is offered using the recently developed Benford Practical Profile as the Conformity Benchmarking Measure. ShareWare: We have developed a Decision Support System called SpecificContextAnalytics (SCA:DSS) to create the various information sets presented in this paper. The SCA:DSS, programmed in Excel VBA, is available from the corresponding author as a free download without restriction to its use. Conclusions: We note that SCB effected using the DFPs is an enhancement, not a replacement, for the usual statistical and analytic techniques and fits very well in the BDA milieu.
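
    The Newcomb-Benford Log10 benchmark at the heart of digital frequency profiling can be sketched directly:

```python
from math import log10

def benford_expected():
    """Expected first-digit frequencies: P(d) = log10(1 + 1/d)."""
    return {d: log10(1 + 1 / d) for d in range(1, 10)}

def first_digit(x):
    """Leading significant digit of a nonzero number."""
    x = abs(x)
    while x >= 10:
        x /= 10
    while x < 1:
        x *= 10
    return int(x)

def observed_profile(values):
    """Observed first-digit frequencies of the nonzero values."""
    counts = {d: 0 for d in range(1, 10)}
    vals = [v for v in values if v]
    for v in vals:
        counts[first_digit(v)] += 1
    return {d: c / len(vals) for d, c in counts.items()}
```

    Profiling then amounts to comparing `observed_profile` of a data stream against `benford_expected` (or against a context-specific benchmark such as the Benford Practical Profile named above) with a conformity measure.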

  5. Analytical, antioxidant and hepatoprotective studies on extracts of oxalis corniculata linn

    International Nuclear Information System (INIS)

    Hussain, I.S.; Islam, M.; Khan, M.T.

    2014-01-01

    Despite a number of traditional medicinal uses and pharmacological properties, the usefulness of Oxalis corniculata Linn. (Family: Oxalidaceae) is questionable due to its high oxalate content, which can form insoluble salts with physiological calcium. Therefore, the present study aimed to reduce oxalates in extracts and to investigate such extracts chemically and biologically. The extraction was carried out using different solvents and methods, and analytical studies of the extracts indicated that oxalate contents decrease on drying the material. Furthermore, extraction of both fresh and dried materials using 1% aqueous calcium chloride and ferric chloride solutions resulted in lower oxalate contents. Methanolic extracts of stems and leaves with lower oxalates were obtained using sequential extraction; these showed good in vitro antioxidant activity, by DPPH and beta-carotene linoleate models, and in vivo hepatoprotective activity in isoniazid- and rifampicin-induced oxidative stressed rats (P < 0.05). It was concluded that methanolic extracts of leaves and stems of Oxalis corniculata had lower oxalates and showed promising antioxidant and hepatoprotective activities. (author)

  6. Analytical Evaluation of Beam Deformation Problem Using Approximate Methods

    DEFF Research Database (Denmark)

    Barari, Amin; Kimiaeifar, A.; Domairry, G.

    2010-01-01

    The beam deformation equation has very wide applications in structural engineering. As a differential equation, it has its own problems concerning existence, uniqueness and methods of solution. Often, the original forms of governing differential equations used in engineering problems are simplified, and this process produces noise in the obtained answers. This paper deals with the solution of the second-order differential equation governing beam deformation using four approximate analytical methods, namely the Perturbation, Homotopy Perturbation Method (HPM), Homotopy Analysis Method (HAM) and Variational Iteration Method (VIM). The comparisons of the results reveal that these methods are very effective, convenient and quite accurate for systems of non-linear differential equations.
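
    As a concrete instance of this class of problems, the exact Euler-Bernoulli deflection of a cantilever under a uniform load is the kind of closed-form benchmark against which such approximate methods are typically checked. The load case and parameter values are illustrative, not taken from the paper:

```python
def cantilever_deflection(x, q, L, E, I):
    """Exact Euler-Bernoulli deflection w(x) of a cantilever of length L
    under uniform load q (fixed at x = 0, free at x = L):
    w(x) = q x^2 (6 L^2 - 4 L x + x^2) / (24 E I).
    Tip deflection reduces to the familiar q L^4 / (8 E I)."""
    return q * x**2 * (6 * L**2 - 4 * L * x + x**2) / (24 * E * I)
```

    An approximate series solution from HPM, HAM, or VIM would be compared point-by-point against this closed form to quantify the "noise" the simplifications introduce.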

  7. An Analytic Glossary to Social Inquiry Using Institutional and Political Activist Ethnography

    Directory of Open Access Journals (Sweden)

    Laura Bisaillon PhD

    2012-12-01

    This analytic glossary, composed of 52 terms, is a practical reference and working tool for persons preparing to conduct theoretically informed qualitative social science research drawing from institutional and political activist ethnography. Researchers using these approaches examine social problems and move beyond interpretation by explicating how these problems are organized and what social and ruling relations coordinate them. Political activist ethnography emerges from, and extends, institutional ethnography by producing knowledge explicitly for activism and social movement organizing ends. The assemblage of vocabulary and ideas in this word list is new, and builds on existing methodological resources. This glossary offers an extensive, analytic, and challenging inventory of language that brings together terms from these ethnographic approaches with shared ancestry. This compilation is designed to serve as an accessible "one-stop-shop" resource for persons using or contemplating using institutional and political activist ethnography in their research and/or activist projects.

  8. The Journal of Learning Analytics: Supporting and Promoting Learning Analytics Research

    OpenAIRE

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the Journal of Learning Analytics is identified. Analytics is the most significant new initiative of SoLAR.

  9. The "Journal of Learning Analytics": Supporting and Promoting Learning Analytics Research

    Science.gov (United States)

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the "Journal of Learning Analytics" is identified. Analytics is the most significant new initiative of SoLAR.

  10. Valley detection using a graphene gradual pn junction with spin–orbit coupling: An analytical conductance calculation

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Mou, E-mail: yang.mou@hotmail.com [Guangdong Provincial Key Laboratory of Quantum Engineering and Quantum Materials, School of Physics and Telecommunication Engineering, South China Normal University, Guangzhou 510006 (China); Wang, Rui-Qiang [Guangdong Provincial Key Laboratory of Quantum Engineering and Quantum Materials, School of Physics and Telecommunication Engineering, South China Normal University, Guangzhou 510006 (China); Bai, Yan-Kui [College of Physical Science and Information Engineering and Hebei Advance Thin Films Laboratory, Hebei Normal University, Shijiazhuang, Hebei 050024 (China)

    2015-09-04

    The graphene pn junction is the building block for a variety of graphene nanostructures. An analytical formula for the conductance of gradual graphene pn junctions in the whole bipolar region has been absent up to now. In this paper, we analytically calculate the pn conductance with the spin–orbit coupling and stagger potential taken into account. Our analytical expression indicates that the energy gap causes the conductance to drop by a constant value, with respect to the gapless case, in a certain parameter region, and shows that the curve of the conductance versus the stagger potential consists of two Gaussian peaks, one peak contributed by each valley. The latter feature allows one to detect the valley polarization without using double-interface resonant devices. - Highlights: • Analytical conductance formula for the gradual graphene pn junction with spin–orbit coupling in the whole bipolar region. • Exploring the valley-dependent transport of gradual graphene pn junctions analytically. • Conductance peak without resonance.

  11. Improving Healthcare Using Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Revanth Sonnati

    2017-03-01

    In everyday terms we call the current era the Modern Era, which in the field of information technology can also be named the era of Big Data. Our daily lives in today's world are rapidly advancing, never quenching one's thirst. The fields of science, engineering and technology are producing data at an exponential rate, leading to exabytes of data every day. Big data helps us to explore and re-invent many areas, not limited to education, health and law. The primary purpose of this paper is to provide an in-depth analysis in the area of healthcare using big data and analytics. The main purpose is to emphasize not only the storage of big data, which helps us look back at history, but also its analysis, in order to improve medication and services. Although many big data implementations happen to be in-house developments, this proposed implementation aims at a broader extent using Hadoop, which happens to be just the tip of the iceberg. The focus of this paper is not limited to the improvement and analysis of the data; it also focuses on the strengths and drawbacks compared to the conventional techniques available.

  12. Recent analytical applications of magnetic nanoparticles

    Directory of Open Access Journals (Sweden)

    Mohammad Faraji

    2016-07-01

    Analytical chemistry has experienced, as have other areas of science, a big change due to the needs and opportunities provided by analytical nanoscience and nanotechnology. Nanotechnology is increasingly proving to be a powerful ally of analytical chemistry in achieving its objectives and simplifying analytical processes. Moreover, the information needs arising from the growing nanotechnological activity are opening an exciting new field of action for analytical chemists. Magnetic nanoparticles have been used in various fields owing to their unique properties, including a large specific surface area and simple separation with magnetic fields. For analytical applications, they have been used mainly in sample preparation techniques (magnetic solid phase extraction with different advanced functional groups: layered double hydroxide, β-cyclodextrin, carbon nanotube, graphene, polymer, octadecylsilane) and their automation, as well as in microextraction techniques, enantioseparation and chemosensors. This review summarizes the basic principles and achievements of magnetic nanoparticles in sample preparation techniques, enantioseparation and chemosensors. Also, some selected articles recently published (2010-2016) have been reviewed and discussed.

  13. Structural Analysis of Composite Laminates using Analytical and Numerical Techniques

    Directory of Open Access Journals (Sweden)

    Sanghi Divya

    2016-01-01

    A laminated composite material consists of different layers of matrix and fibres. Its properties can vary considerably with each layer's or ply's orientation, material properties and the number of layers itself. The present paper focuses on a novel approach of incorporating an analytical method to arrive at a preliminary ply layup order of a composite laminate, which acts as feeder data for the further detailed analysis done with FEA tools. The equations used in our MATLAB code are based on analytical study and supply results that are remarkably close to the final optimized layup found through extensive FEA analysis, with a high degree of probability. This reduces significant computing time and saves considerable FEA processing to obtain efficient results quickly. The result output by our method also provides the user with the conditions that predict the successive failure sequence of the composite plies, a result option which is not even available in popular FEM tools. The predicted results were further verified by testing the laminates in the laboratory, and the results were found to be in good agreement.

  14. A pilot analytic study of a research-level, lower-cost human papillomavirus 16, 18, and 45 test.

    Science.gov (United States)

    Yang, Hannah P; Walmer, David K; Merisier, Delson; Gage, Julia C; Bell, Laura; Rangwala, Sameera; Shrestha, Niwashin; Kobayashi, Lori; Eder, Paul S; Castle, Philip E

    2011-09-01

    The analytic performance of a low-cost, research-stage DNA test for the most carcinogenic human papillomavirus (HPV) genotypes (HPV16, HPV18, and HPV45) in aggregate was evaluated among carcinogenic HPV-positive women; such a test might be used to decide who needs immediate colposcopy in low-resource settings (a "triage test"). We found that the HPV16/18/45 test agreed well with two DNA tests, a GP5+/6+ genotyping assay (Kappa = 0.77) and a quantitative PCR assay at a cutpoint of 5000 viral copies (Kappa = 0.87). DNA sequencing on a subset of 16 HPV16/18/45-positive and 16 HPV16/18/45-negative specimens verified the analytic specificity of the research test. It is concluded that the HPV16/18/45 assay is a promising triage test with a minimum detection of approximately 5000 viral copies, the clinically relevant threshold. Published by Elsevier B.V.
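
    The Kappa agreement statistic quoted above can be computed as follows for two binary (positive/negative) assay read-outs; the example data are invented, not the study's specimens:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two equal-length lists of 0/1 results:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n       # observed agreement
    pa1, pb1 = sum(a) / n, sum(b) / n                # positivity rates
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)           # chance agreement
    return (po - pe) / (1 - pe)
```

    Kappa of 1 means perfect agreement and 0 means agreement no better than chance, so the reported 0.77 and 0.87 indicate substantial to near-perfect concordance between the research assay and the comparators.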

  15. Analytic American Option Pricing and Applications

    NARCIS (Netherlands)

    Sbuelz, A.

    2003-01-01

    I use a convenient value breakdown in order to obtain analytic solutions for finite-maturity American option prices. Such a barrier-option-based breakdown yields an analytic lower bound for the American option price, which is as price-tight as the Barone-Adesi and Whaley (1987) analytic value proxy.
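
    As a generic illustration of an analytic lower bound (not Sbuelz's barrier-option breakdown itself): an American put is worth at least the greater of its immediate-exercise value and the Black-Scholes European put.

```python
from math import erf, exp, log, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_put(S, K, r, sigma, T):
    """Black-Scholes European put price."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return K * exp(-r * T) * phi(-d2) - S * phi(-d1)

def american_put_lower_bound(S, K, r, sigma, T):
    """Crude analytic floor: max(intrinsic value, European put, 0)."""
    return max(K - S, bs_put(S, K, r, sigma, T), 0.0)
```

    A barrier-option-based breakdown of the kind the paper constructs tightens this floor considerably, approaching the accuracy of the Barone-Adesi and Whaley approximation while remaining analytic.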

  16. 7 CFR 94.103 - Analytical methods.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 94.103 Section 94.103 Agriculture... POULTRY AND EGG PRODUCTS Voluntary Analyses of Egg Products § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg...

  17. Using the Analytic Hierarchy Process for Decision-Making in Ecosystem Management

    Science.gov (United States)

    Daniel L. Schmoldt; David L. Peterson

    1997-01-01

    Land management activities on public lands combine multiple objectives in order to create a plan of action over a finite time horizon. Because management activities are constrained by time and money, it is critical to make the best use of available agency resources. The Analytic Hierarchy Process (AHP) offers a structure for multi-objective decision-making so that...
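
    The core AHP computation, deriving priority weights from a pairwise comparison matrix, can be sketched with the geometric-mean approximation to the principal eigenvector; the matrix below is invented for illustration:

```python
from math import prod

def ahp_weights(M):
    """Priority weights from a reciprocal pairwise comparison matrix M
    (M[i][j] is how strongly criterion i dominates j on Saaty's 1-9 scale),
    via the row geometric-mean approximation to the principal eigenvector."""
    n = len(M)
    gm = [prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]
```

    A full AHP implementation would also compute the consistency ratio from the principal eigenvalue to check that the judgments are not too contradictory before using the weights.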

  18. 7 CFR 98.4 - Analytical methods.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...

  19. Incident detection and isolation in drilling using analytical redundancy relations

    DEFF Research Database (Denmark)

    Willersrud, Anders; Blanke, Mogens; Imsland, Lars

    2015-01-01

    must be avoided. This paper employs model-based diagnosis using analytical redundancy relations to obtain residuals which are affected differently by the different incidents. Residuals are found to be non-Gaussian - they follow a multivariate t-distribution - hence, a dedicated generalized likelihood...... measurements available. In the latter case, isolation capability is shown to be reduced to group-wise isolation, but the method would still detect all serious events with the prescribed false alarm probability...

  20. Developments in analytical instrumentation

    Science.gov (United States)

    Petrie, G.

    The situation regarding photogrammetric instrumentation has changed quite dramatically over the last 2 or 3 years with the withdrawal of most analogue stereo-plotting machines from the market place and their replacement by analytically based instrumentation. While there have been few new developments in the field of comparators, there has been an explosive development in the area of small, relatively inexpensive analytical stereo-plotters based on the use of microcomputers. In particular, a number of new instruments have been introduced by manufacturers who mostly have not been associated previously with photogrammetry. Several innovative concepts have been introduced in these small but capable instruments, many of which are aimed at specialised applications, e.g. in close-range photogrammetry (using small-format cameras); for thematic mapping (by organisations engaged in environmental monitoring or resources exploitation); for map revision, etc. Another innovative and possibly significant development has been the production of conversion kits to convert suitable analogue stereo-plotting machines such as the Topocart, PG-2 and B-8 into fully fledged analytical plotters. The larger and more sophisticated analytical stereo-plotters are mostly being produced by the traditional mainstream photogrammetric systems suppliers with several new instruments and developments being introduced at the top end of the market. These include the use of enlarged photo stages to handle images up to 25 × 50 cm format; the complete integration of graphics workstations into the analytical plotter design; the introduction of graphics superimposition and stereo-superimposition; the addition of correlators for the automatic measurement of height, etc. The software associated with this new analytical instrumentation is now undergoing extensive re-development with the need to supply photogrammetric data as input to the more sophisticated G.I.S. 
systems now being installed by clients, instead

  1. USING THE ANALYTICAL HIERARCHY PROCESS TO SUPPORT SUSTAINABLE USE OF GEO-RESOURCES IN METROPOLITAN AREAS

    Institute of Scientific and Technical Information of China (English)

    Oswald MARINONI; Andreas HOPPE

    2006-01-01

Sand and gravel are important raw materials which are needed for many civil engineering projects. Due to economic reasons, sand and gravel pits are frequently located in the periphery of metropolitan areas, which are often subject to competing land-use interests. As a contribution to resolving land-use conflicts, the Analytic Hierarchy Process (AHP) is applied within a Geographic Information System (GIS) environment. Two AHP preference matrix scenario constellations are evaluated and their results are used to create a land-use conflict map.

  2. 7 CFR 94.4 - Analytical methods.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 94.4 Section 94.4 Agriculture... POULTRY AND EGG PRODUCTS Mandatory Analyses of Egg Products § 94.4 Analytical methods. The majority of analytical methods used by the USDA laboratories to perform mandatory analyses for egg products are listed as...

  3. Social Data Analytics Using Tensors and Sparse Techniques

    Science.gov (United States)

    Zhang, Miao

    2014-01-01

The development of internet and mobile technologies is driving an earthshaking social media revolution, bringing the internet world a huge amount of social media content, such as images, videos, and comments. This massive media content and the complicated social structures around it require analytic expertise to transform the flood of information into…

  4. Experimental design and multiple response optimization. Using the desirability function in analytical methods development.

    Science.gov (United States)

    Candioti, Luciana Vera; De Zan, María M; Cámara, María S; Goicoechea, Héctor C

    2014-06-01

A review about the application of response surface methodology (RSM) when several responses have to be simultaneously optimized in the field of analytical methods development is presented. Several critical issues like response transformation, multiple response optimization and modeling with least squares and artificial neural networks are discussed. Most recent analytical applications are presented in the context of analytical methods development, especially in multiple response optimization procedures using the desirability function. Copyright © 2014 Elsevier B.V. All rights reserved.
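A bare-bones sketch of the desirability approach discussed in this review: individual larger-is-better desirabilities in the Derringer style are combined through a geometric mean into one overall score. The response names, target ranges, and values below are invented for illustration, not taken from the review.

```python
import math

def d_maximize(y, lo, hi, s=1.0):
    """Derringer-style desirability for a larger-is-better response:
    0 below lo, 1 above hi, a power ramp in between."""
    if y <= lo:
        return 0.0
    if y >= hi:
        return 1.0
    return ((y - lo) / (hi - lo)) ** s

def overall_desirability(ds):
    """Geometric mean of individual desirabilities; any zero vetoes the run."""
    if min(ds) == 0.0:
        return 0.0
    return math.prod(ds) ** (1.0 / len(ds))

# Hypothetical chromatographic responses: resolution and peak area.
d1 = d_maximize(1.8, lo=1.0, hi=2.0)       # resolution
d2 = d_maximize(9000, lo=5000, hi=10000)   # peak area
D = overall_desirability([d1, d2])
```

In an RSM workflow, D would be evaluated over the fitted response surfaces and the factor settings maximizing it would be selected.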

  5. Relativistic quantum mechanic calculation of photoionization cross-section of hydrogenic and non-hydrogenic states using analytical potentials

    International Nuclear Information System (INIS)

    Rodriguez, R.; Gil, J.M.; Rubiano, J.G.; Florido, R.; Martel, P.; Minguez, E.

    2005-01-01

    Photoionization process is a subject of special importance in many areas of physics. Numerical methods must be used in order to obtain photoionization cross-sections for non-hydrogenic levels. The atomic data required to calculate them is huge so self-consistent calculations increase computing time considerably. Analytical potentials are a useful alternative because they avoid the iterative procedures typical in self-consistent models. In this work, we present a relativistic quantum calculation of photoionization cross-sections for isolated ions based on an analytical potential to obtain the required atomic data, which is valid both for hydrogenic and non-hydrogenic ions. Comparisons between our results and others obtained using either widely used analytical expressions for the cross-sections or more sophisticated calculations are done

  6. An analytical solution for two-dimensional vacuum preloading combined with electro-osmosis consolidation using EKG electrodes

    Science.gov (United States)

    Qiu, Chenchen; Li, Yande

    2017-01-01

China is a country with vast territory, but economic development and population growth have reduced its usable land resources in recent years. Therefore, reclamation by pumping and filling is carried out in the eastern coastal regions of China in order to meet the needs of urbanization. However, large areas of reclaimed land need rapid drainage consolidation treatment. Based on past research on improving the treatment efficiency of soft clay using vacuum preloading combined with electro-osmosis, a two-dimensional drainage plane model was proposed according to the Terzaghi and Esrig consolidation theory. However, an analytical solution for the two-dimensional plane model had not previously been derived, and existing analytical solutions cannot provide a thorough theoretical analysis of practical engineering problems or give relevant guidance. Considering the smearing effect and the rectangular arrangement pattern, an analytical solution is derived to describe the behavior of pore-water and the consolidation process when using EKG (electro-kinetic geosynthetics) materials, whose functions include drainage, electrical conduction and corrosion resistance. Comparison with test results is carried out to verify the analytical solution. It is found that the measured value is larger than the applied vacuum degree because of the stacking effect of the vacuum preloading and electro-osmosis. The trends of the mean measured and mean analytical values are comparable. Therefore, the consolidation model can accurately assess the change in pore-water pressure and the consolidation process during vacuum preloading combined with electro-osmosis. PMID:28771496
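For context, the classic one-dimensional Terzaghi consolidation series that models of this kind build on can be evaluated directly. The sketch below computes the average degree of consolidation U as a function of the dimensionless time factor Tv; it deliberately omits the electro-osmotic coupling and EKG drainage geometry that are the paper's actual subject.

```python
import math

def avg_degree_of_consolidation(Tv, n_terms=50):
    """Average degree of consolidation from Terzaghi's 1-D theory:
    U = 1 - sum_m 2/M^2 * exp(-M^2 * Tv), with M = (2m + 1) * pi / 2."""
    U = 1.0
    for m in range(n_terms):
        M = (2 * m + 1) * math.pi / 2
        U -= (2.0 / M**2) * math.exp(-M**2 * Tv)
    return U
```

The series reproduces the textbook landmark values, e.g. roughly 50 % consolidation at Tv ≈ 0.197 and 90 % at Tv ≈ 0.848.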

  7. An analytical solution for two-dimensional vacuum preloading combined with electro-osmosis consolidation using EKG electrodes.

    Directory of Open Access Journals (Sweden)

    Yang Shen

Full Text Available China is a country with vast territory, but economic development and population growth have reduced its usable land resources in recent years. Therefore, reclamation by pumping and filling is carried out in the eastern coastal regions of China in order to meet the needs of urbanization. However, large areas of reclaimed land need rapid drainage consolidation treatment. Based on past research on improving the treatment efficiency of soft clay using vacuum preloading combined with electro-osmosis, a two-dimensional drainage plane model was proposed according to the Terzaghi and Esrig consolidation theory. However, an analytical solution for the two-dimensional plane model had not previously been derived, and existing analytical solutions cannot provide a thorough theoretical analysis of practical engineering problems or give relevant guidance. Considering the smearing effect and the rectangular arrangement pattern, an analytical solution is derived to describe the behavior of pore-water and the consolidation process when using EKG (electro-kinetic geosynthetics) materials, whose functions include drainage, electrical conduction and corrosion resistance. Comparison with test results is carried out to verify the analytical solution. It is found that the measured value is larger than the applied vacuum degree because of the stacking effect of the vacuum preloading and electro-osmosis. The trends of the mean measured and mean analytical values are comparable. Therefore, the consolidation model can accurately assess the change in pore-water pressure and the consolidation process during vacuum preloading combined with electro-osmosis.

  8. Big data analytics for the virtual network topology reconfiguration use case

    OpenAIRE

    Gifre Renom, Lluís; Morales Alcaide, Fernando; Velasco Esteban, Luis Domingo; Ruiz Ramírez, Marc

    2016-01-01

    ABNO's OAM Handler is extended with big data analytics capabilities to anticipate traffic changes in volume and direction. Predicted traffic is used to trigger virtual network topology re-optimization. When the virtual topology needs to be reconfigured, predicted and current traffic matrices are used to find the optimal topology. A heuristic algorithm to adapt current virtual topology to meet both actual demands and expected traffic matrix is proposed. Experimental assessment is carried ou...

  9. Realizing the promises of marine biotechnology

    NARCIS (Netherlands)

    Luiten, EEM; Akkerman, [No Value; Koulman, A; Kamermans, P; Reith, H; Barbosa, MJ; Sipkema, D; Wijffels, RH

    High-quality research in the field of marine biotechnology is one of the key-factors for successful innovation in exploiting the vast diversity of marine life. However, fascinating scientific research with promising results and claims on promising potential applications (e.g. for pharmaceuticals,

  10. Realizing the promises of marine biotechnology

    NARCIS (Netherlands)

    Luiten, E.E.M.; Akkerman, I.; Koulman, A.; Kamermans, P.; Reith, H.; Barbosa, M.J.; Sipkema, D.; Wijffels, R.H.

    2003-01-01

    High-quality research in the field of marine biotechnology is one of the key-factors for successful innovation in exploiting the vast diversity of marine life. However, fascinating scientific research with promising results and claims on promising potential applications (e.g. for pharmaceuticals,

  11. Identification of "At Risk" Students Using Learning Analytics: The Ethical Dilemmas of Intervention Strategies in a Higher Education Institution

    Science.gov (United States)

    Lawson, Celeste; Beer, Colin; Rossi, Dolene; Moore, Teresa; Fleming, Julie

    2016-01-01

    Learning analytics is an emerging field in which sophisticated analytic tools are used to inform and improve learning and teaching. Researchers within a regional university in Australia identified an association between interaction and student success in online courses and subsequently developed a learning analytics system aimed at informing…

  12. A method for determining the analytical form of a radionuclide depth distribution using multiple gamma spectrometry measurements

    Energy Technology Data Exchange (ETDEWEB)

    Dewey, Steven Clifford, E-mail: sdewey001@gmail.com [United States Air Force School of Aerospace Medicine, Occupational Environmental Health Division, Health Physics Branch, Radiation Analysis Laboratories, 2350 Gillingham Drive, Brooks City-Base, TX 78235 (United States); Whetstone, Zachary David, E-mail: zacwhets@umich.edu [Radiological Health Engineering Laboratory, Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2355 Bonisteel Boulevard, 1906 Cooley Building, Ann Arbor, MI 48109-2104 (United States); Kearfott, Kimberlee Jane, E-mail: kearfott@umich.edu [Radiological Health Engineering Laboratory, Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2355 Bonisteel Boulevard, 1906 Cooley Building, Ann Arbor, MI 48109-2104 (United States)

    2011-06-15

    When characterizing environmental radioactivity, whether in the soil or within concrete building structures undergoing remediation or decommissioning, it is highly desirable to know the radionuclide depth distribution. This is typically modeled using continuous analytical expressions, whose forms are believed to best represent the true source distributions. In situ gamma ray spectroscopic measurements are combined with these models to fully describe the source. Currently, the choice of analytical expressions is based upon prior experimental core sampling results at similar locations, any known site history, or radionuclide transport models. This paper presents a method, employing multiple in situ measurements at a single site, for determining the analytical form that best represents the true depth distribution present. The measurements can be made using a variety of geometries, each of which has a different sensitivity variation with source spatial distribution. Using non-linear least squares numerical optimization methods, the results can be fit to a collection of analytical models and the parameters of each model determined. The analytical expression that results in the fit with the lowest residual is selected as the most accurate representation. A cursory examination is made of the effects of measurement errors on the method. - Highlights: > A new method for determining radionuclide distribution as a function of depth is presented. > Multiple measurements are used, with enough measurements to determine the unknowns in analytical functions that might describe the distribution. > The measurements must be as independent as possible, which is achieved through special collimation of the detector. > Although the effects of measurements errors may be significant on the results, an improvement over other methods is anticipated.
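The model-selection idea in this record — fit several candidate analytical depth profiles to the measurements and keep the one with the lowest residual — can be sketched with a brute-force grid search. The exponential and linear profiles, parameter grids, and synthetic noiseless data below are illustrative assumptions; the paper itself uses non-linear least squares against real multi-geometry in situ measurements.

```python
import math

def exp_model(z, a0, L):
    """Exponential depth profile: a0 * exp(-z / L)."""
    return a0 * math.exp(-z / L)

def lin_model(z, a0, L):
    """Linearly decreasing profile, zero beyond depth L."""
    return a0 * max(0.0, 1.0 - z / L)

def fit_by_grid(model, depths, obs, a_grid, L_grid):
    """Brute-force least squares over a parameter grid; returns (sse, params)."""
    best_sse, best_params = float("inf"), None
    for a0 in a_grid:
        for L in L_grid:
            sse = sum((model(z, a0, L) - y) ** 2 for z, y in zip(depths, obs))
            if sse < best_sse:
                best_sse, best_params = sse, (a0, L)
    return best_sse, best_params

depths = [0.0, 2.0, 4.0, 6.0, 8.0]              # cm, hypothetical
obs = [exp_model(z, 1.0, 3.0) for z in depths]  # synthetic, noiseless data

a_grid = [0.8 + 0.05 * i for i in range(9)]     # 0.80 .. 1.20
L_grid = [1.0 + 0.5 * i for i in range(9)]      # 1.0 .. 5.0
sse_exp, _ = fit_by_grid(exp_model, depths, obs, a_grid, L_grid)
sse_lin, _ = fit_by_grid(lin_model, depths, obs, a_grid, L_grid)
```

Since the synthetic data are exponential, the exponential model attains the lower residual and would be selected as the representative distribution.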

  13. Antioxidant phytochemicals in fresh produce: exploitation of genotype variation and advancements in analytical protocols

    Science.gov (United States)

    Manganaris, George A.; Goulas, Vlasios; Mellidou, Ifigeneia; Drogoudi, Pavlina

    2017-12-01

    Horticultural commodities (fruit and vegetables) are the major dietary source of several bioactive compounds of high nutraceutical value for humans, including polyphenols, carotenoids and vitamins. The aim of the current review was dual. Firstly, towards the eventual enhancement of horticultural crops with bio-functional compounds, the natural genetic variation in antioxidants found in different species and cultivar/genotypes is underlined. Notably, some landraces and/or traditional cultivars have been characterized by substantially higher phytochemical content, i.e. small tomato of Santorini island (cv. ‘Tomataki Santorinis’) possesses appreciably high amounts of ascorbic acid. The systematic screening of key bioactive compounds in a wide range of germplasm for the identification of promising genotypes and the restoration of key gene fractions from wild species and landraces may help in reducing the loss of agro-biodiversity, creating a healthier ‘gene pool’ as the basis of future adaptation. Towards this direction, large scale comparative studies in different cultivars/genotypes of a given species provide useful insights about the ones of higher nutritional value. Secondly, the advancements in the employment of analytical techniques to determine the antioxidant potential through a convenient, easy and fast way are outlined. Such analytical techniques include electron paramagnetic resonance (EPR) and infrared (IR) spectroscopy, electrochemical and chemometric methods, flow injection analysis (FIA), optical sensors and high resolution screening (HRS). Taking into consideration that fruits and vegetables are complex mixtures of water- and lipid-soluble antioxidants, the exploitation of chemometrics to develop “omics” platforms (i.e. metabolomics, foodomics) is a promising tool for researchers to decode and/or predict antioxidant activity of fresh produce. For industry, the use of cheap and rapid optical sensors and IR spectroscopy is recommended to

  14. Antioxidant Phytochemicals in Fresh Produce: Exploitation of Genotype Variation and Advancements in Analytical Protocols

    Directory of Open Access Journals (Sweden)

    George A. Manganaris

    2018-02-01

Full Text Available Horticultural commodities (fruit and vegetables) are the major dietary source of several bioactive compounds of high nutraceutical value for humans, including polyphenols, carotenoids and vitamins. The aim of the current review was dual. Firstly, toward the eventual enhancement of horticultural crops with bio-functional compounds, the natural genetic variation in antioxidants found in different species and cultivars/genotypes is underlined. Notably, some landraces and/or traditional cultivars have been characterized by substantially higher phytochemical content, i.e., small tomato of Santorini island (cv. “Tomataki Santorinis”) possesses appreciably high amounts of ascorbic acid (AsA). The systematic screening of key bioactive compounds in a wide range of germplasm for the identification of promising genotypes and the restoration of key gene fractions from wild species and landraces may help in reducing the loss of agro-biodiversity, creating a healthier “gene pool” as the basis of future adaptation. Toward this direction, large scale comparative studies in different cultivars/genotypes of a given species provide useful insights about the ones of higher nutritional value. Secondly, the advancements in the employment of analytical techniques to determine the antioxidant potential through a convenient, easy and fast way are outlined. Such analytical techniques include electron paramagnetic resonance (EPR) and infrared (IR) spectroscopy, electrochemical, and chemometric methods, flow injection analysis (FIA), optical sensors, and high resolution screening (HRS). Taking into consideration that fruits and vegetables are complex mixtures of water- and lipid-soluble antioxidants, the exploitation of chemometrics to develop “omics” platforms (i.e., metabolomics, foodomics) is a promising tool for researchers to decode and/or predict antioxidant activity of fresh produce. For industry, the use of optical sensors and IR spectroscopy is recommended to

  15. SU-E-T-479: Development and Validation of Analytical Models Predicting Secondary Neutron Radiation in Proton Therapy Applications

    International Nuclear Information System (INIS)

    Farah, J; Bonfrate, A; Donadille, L; Martinetti, F; Trompier, F; Clairand, I; De Olivera, A; Delacroix, S; Herault, J; Piau, S; Vabre, I

    2014-01-01

Purpose: Test and validation of analytical models predicting leakage neutron exposure in passively scattered proton therapy. Methods: Taking inspiration from the literature, this work attempts to build an analytical model predicting neutron ambient dose equivalents, H*(10), within the local 75 MeV ocular proton therapy facility. MC simulations were first used to model H*(10) in the beam axis plane while considering a closed final collimator and pristine Bragg peak delivery. Next, the MC-based analytical model was tested against simulation results and experimental measurements. The model was also extended in the vertical direction to enable a full 3D mapping of H*(10) inside the treatment room. Finally, the work focused on upgrading the literature model to clinically relevant configurations considering modulated beams, open collimators, patient-induced neutron fluctuations, etc. Results: The MC-based analytical model efficiently reproduced simulated H*(10) values with a maximum difference below 10%. In addition, it succeeded in predicting measured H*(10) values with differences <40%. The highest differences were registered at the closest and farthest positions from isocenter, where the analytical model failed to faithfully reproduce the high neutron fluence and energy variations. The differences remain, however, acceptable taking into account the high measurement/simulation uncertainties and the end use of this model, i.e. radiation protection. Moreover, the model was successfully (differences < 20% on simulations and < 45% on measurements) extended to predict neutrons in the vertical direction with respect to the beam line, as patients are in the upright seated position during ocular treatments. Accounting for the impact of beam modulation, collimation and the presence of a patient in the beam path is far more challenging, and conversion coefficients are currently being defined to predict stray neutrons in clinically representative treatment configurations. Conclusion

  16. Political Reputations and Campaign Promises

    OpenAIRE

    Aragones, Enriqueta; Palfrey, Thomas R.; Postlewaite, Andrew

    2006-01-01

    We analyze conditions under which candidates' reputations may affect voters' beliefs over what policy will be implemented by the winning candidate of an election. We develop a model of repeated elections with complete information in which candidates are purely ideological. We analyze an equilibrium in which voters' strategies involve a credible threat to punish candidates who renege on their campaign promises and in which all campaign promises are believed by voters and honored by candidates....

  17. A multicenter nationwide reference intervals study for common biochemical analytes in Turkey using Abbott analyzers.

    Science.gov (United States)

    Ozarda, Yesim; Ichihara, Kiyoshi; Aslan, Diler; Aybek, Hulya; Ari, Zeki; Taneli, Fatma; Coker, Canan; Akan, Pinar; Sisman, Ali Riza; Bahceci, Onur; Sezgin, Nurzen; Demir, Meltem; Yucel, Gultekin; Akbas, Halide; Ozdem, Sebahat; Polat, Gurbuz; Erbagci, Ayse Binnur; Orkmez, Mustafa; Mete, Nuriye; Evliyaoglu, Osman; Kiyici, Aysel; Vatansev, Husamettin; Ozturk, Bahadir; Yucel, Dogan; Kayaalp, Damla; Dogan, Kubra; Pinar, Asli; Gurbilek, Mehmet; Cetinkaya, Cigdem Damla; Akin, Okhan; Serdar, Muhittin; Kurt, Ismail; Erdinc, Selda; Kadicesme, Ozgur; Ilhan, Necip; Atali, Dilek Sadak; Bakan, Ebubekir; Polat, Harun; Noyan, Tevfik; Can, Murat; Bedir, Abdulkerim; Okuyucu, Ali; Deger, Orhan; Agac, Suret; Ademoglu, Evin; Kaya, Ayşem; Nogay, Turkan; Eren, Nezaket; Dirican, Melahat; Tuncer, GulOzlem; Aykus, Mehmet; Gunes, Yeliz; Ozmen, Sevda Unalli; Kawano, Reo; Tezcan, Sehavet; Demirpence, Ozlem; Degirmen, Elif

    2014-12-01

    A nationwide multicenter study was organized to establish reference intervals (RIs) in the Turkish population for 25 commonly tested biochemical analytes and to explore sources of variation in reference values, including regionality. Blood samples were collected nationwide in 28 laboratories from the seven regions (≥400 samples/region, 3066 in all). The sera were collectively analyzed in Uludag University in Bursa using Abbott reagents and analyzer. Reference materials were used for standardization of test results. After secondary exclusion using the latent abnormal values exclusion method, RIs were derived by a parametric method employing the modified Box-Cox formula and compared with the RIs by the non-parametric method. Three-level nested ANOVA was used to evaluate variations among sexes, ages and regions. Associations between test results and age, body mass index (BMI) and region were determined by multiple regression analysis (MRA). By ANOVA, differences of reference values among seven regions were significant in none of the 25 analytes. Significant sex-related and age-related differences were observed for 10 and seven analytes, respectively. MRA revealed BMI-related changes in results for uric acid, glucose, triglycerides, high-density lipoprotein (HDL)-cholesterol, alanine aminotransferase, and γ-glutamyltransferase. Their RIs were thus derived by applying stricter criteria excluding individuals with BMI >28 kg/m2. Ranges of RIs by non-parametric method were wider than those by parametric method especially for those analytes affected by BMI. With the lack of regional differences and the well-standardized status of test results, the RIs derived from this nationwide study can be used for the entire Turkish population.
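A toy version of the parametric derivation step described in this study: transform the reference values with a Box-Cox power, take mean ± 1.96 SD in the transformed scale, and back-transform the limits. Here the power lam is fixed in advance rather than estimated by the modified Box-Cox fitting the study uses, and the five sample values are fabricated for illustration.

```python
import math

def boxcox(y, lam):
    """Box-Cox power transform; log transform in the lam = 0 limit."""
    return (y ** lam - 1) / lam if lam != 0 else math.log(y)

def inv_boxcox(x, lam):
    """Inverse of the Box-Cox transform."""
    return (lam * x + 1) ** (1 / lam) if lam != 0 else math.exp(x)

def reference_interval(values, lam=0.0):
    """Parametric central 95 % reference interval: mean +/- 1.96 SD in the
    Box-Cox transformed scale, back-transformed to the original units."""
    t = [boxcox(v, lam) for v in values]
    n = len(t)
    mean = sum(t) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in t) / (n - 1))
    return (inv_boxcox(mean - 1.96 * sd, lam),
            inv_boxcox(mean + 1.96 * sd, lam))

lo, hi = reference_interval([1.0, 2.0, 4.0, 8.0, 16.0])
```

A real derivation would use hundreds of reference individuals after the latent abnormal values exclusion the study applies; the point here is only the transform/back-transform mechanics.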

  18. Consistent constitutive modeling of metallic target penetration using empirical, analytical, and numerical penetration models

    Directory of Open Access Journals (Sweden)

John (Jack) P. Riegel III

    2016-04-01

Full Text Available Historically, there has been little correlation between the material properties used in (1) empirical formulae, (2) analytical formulations, and (3) numerical models. The various regressions and models may each provide excellent agreement for the depth of penetration into semi-infinite targets. But the input parameters for the empirically based procedures may have little in common with either the analytical model or the numerical model. This paper builds on previous work by Riegel and Anderson (2014) to show how the Effective Flow Stress (EFS) strength model, based on empirical data, can be used as the average flow stress in the analytical Walker–Anderson Penetration model (WAPEN) (Anderson and Walker, 1991) and how the same value may be utilized as an effective von Mises yield strength in numerical hydrocode simulations to predict the depth of penetration for eroding projectiles at impact velocities in the mechanical response regime of the materials. The method has the benefit of allowing the three techniques (empirical, analytical, and numerical) to work in tandem. The empirical method can be used for many shot line calculations, but more advanced analytical or numerical models can be employed when necessary to address specific geometries such as edge effects or layering that are not treated by the simpler methods. Developing complete constitutive relationships for a material can be costly. If the only concern is depth of penetration, such a level of detail may not be required. The effective flow stress can be determined from a small set of depth of penetration experiments in many cases, especially for long penetrators such as the L/D = 10 ones considered here, making it a very practical approach. In the process of performing this effort, the authors considered numerical simulations by other researchers based on the same set of experimental data that the authors used for their empirical and analytical assessment. The goals were to establish a

  19. Solution of the isotopic depletion equation using decomposition method and analytical solution

    Energy Technology Data Exchange (ETDEWEB)

    Prata, Fabiano S.; Silva, Fernando C.; Martinez, Aquilino S., E-mail: fprata@con.ufrj.br, E-mail: fernando@con.ufrj.br, E-mail: aquilino@lmp.ufrj.br [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), RJ (Brazil). Programa de Engenharia Nuclear

    2011-07-01

In this paper an analytical calculation of the isotopic depletion equations is proposed, featuring a chain of major isotopes found in a typical PWR reactor. Part of this chain allows feedback reactions of (n,2n) type. The method is based on decoupling the equations describing feedback from the rest of the chain by using the decomposition method, with analytical solutions for the other isotopes present in the chain. The method was implemented in a PWR reactor simulation code that makes use of the nodal expansion method (NEM) to solve the neutron diffusion equation, describing the spatial distribution of neutron flux inside the reactor core. Because the isotopic depletion calculation module is the most computationally intensive process within nuclear reactor core simulation systems, it is justified to look for a method that is both efficient and fast, with the objective of evaluating a larger number of core configurations in a short amount of time. (author)
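The analytical building block behind depletion solvers of this kind is the Bateman solution. A minimal two-member chain, without the (n,2n) feedback or the decomposition treatment that is the paper's actual contribution, can be written in closed form:

```python
import math

def bateman_two(n1_0, lam1, lam2, t):
    """Closed-form solution of the two-member chain
    dN1/dt = -lam1*N1,  dN2/dt = lam1*N1 - lam2*N2,  with N2(0) = 0."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = (n1_0 * lam1 / (lam2 - lam1)
          * (math.exp(-lam1 * t) - math.exp(-lam2 * t)))
    return n1, n2

# Illustrative (made-up) decay constants, per unit time.
n1, n2 = bateman_two(1.0, 0.5, 0.1, 2.0)
```

In the nearly-stable-daughter limit (lam2 → 0) the solution conserves the total number of nuclei, a quick sanity check on the algebra.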

  20. Analytical quality assurance in laboratories using tracers for biological and environmental studies

    International Nuclear Information System (INIS)

    Melaj, Mariana; Martin, Olga; Lopez, Silvia; Rojas de Tramontini, Susana

    1999-01-01

This work describes the way we are organizing a quality assurance system to apply to analytical measurements of the 14N/15N isotope ratio in biological and soil material. The 14N/15N ratio is measured with an optical emission spectrometer (NOI6PC), which distinguishes the differences in wavelength of the electromagnetic radiation emitted by N-28, N-29 and N-30. The major problem is the 'cross contamination' of samples with different enrichments. The elements being considered to reach satisfactory analytical results are: 1) A proper working area; 2) Homogeneous samples that represent the whole sampled system; 3) The use of reference materials: in each digestion, a known reference sample must be added; 4) Adequate equipment operation; 5) Standard operating procedures; 6) Control charts and laboratory and equipment books: every operation using the equipment is registered in a book; 7) Training of the operators. (author)
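Point 6 (control charts) can be made concrete: a Shewhart individuals chart flags measurements of a reference material that drift outside mean ± 3 SD of an in-control baseline. The baseline 15N atom-% values below are invented for illustration, not taken from the laboratory described here.

```python
import math

def shewhart_limits(values, sigma_mult=3.0):
    """Center line and lower/upper control limits for an individuals chart."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in values) / (n - 1))
    return mean - sigma_mult * sd, mean, mean + sigma_mult * sd

def out_of_control(values, lcl, ucl):
    """Indices of points falling outside the control limits."""
    return [i for i, x in enumerate(values) if not lcl <= x <= ucl]

# Hypothetical repeat measurements of a 15N reference material (atom-%).
baseline = [0.366, 0.368, 0.367, 0.365, 0.369, 0.367, 0.366, 0.368]
lcl, cl, ucl = shewhart_limits(baseline)
flags = out_of_control([0.367, 0.372], lcl, ucl)  # second point drifts high
```

A flagged point would prompt the checks the abstract lists: cross-contamination, digestion problems, or equipment drift, all traceable through the laboratory and equipment books.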

  1. Solution of the isotopic depletion equation using decomposition method and analytical solution

    International Nuclear Information System (INIS)

    Prata, Fabiano S.; Silva, Fernando C.; Martinez, Aquilino S.

    2011-01-01

In this paper an analytical calculation of the isotopic depletion equations is proposed, featuring a chain of major isotopes found in a typical PWR reactor. Part of this chain allows feedback reactions of (n,2n) type. The method is based on decoupling the equations describing feedback from the rest of the chain by using the decomposition method, with analytical solutions for the other isotopes present in the chain. The method was implemented in a PWR reactor simulation code that makes use of the nodal expansion method (NEM) to solve the neutron diffusion equation, describing the spatial distribution of neutron flux inside the reactor core. Because the isotopic depletion calculation module is the most computationally intensive process within nuclear reactor core simulation systems, it is justified to look for a method that is both efficient and fast, with the objective of evaluating a larger number of core configurations in a short amount of time. (author)

  2. Analytic trigonometry

    CERN Document Server

    Bruce, William J; Maxwell, E A; Sneddon, I N

    1963-01-01

    Analytic Trigonometry details the fundamental concepts and underlying principle of analytic geometry. The title aims to address the shortcomings in the instruction of trigonometry by considering basic theories of learning and pedagogy. The text first covers the essential elements from elementary algebra, plane geometry, and analytic geometry. Next, the selection tackles the trigonometric functions of angles in general, basic identities, and solutions of equations. The text also deals with the trigonometric functions of real numbers. The fifth chapter details the inverse trigonometric functions

  3. Consistency of FMEA used in the validation of analytical procedures

    DEFF Research Database (Denmark)

    Oldenhof, M.T.; van Leeuwen, J.F.; Nauta, Maarten

    2011-01-01

    In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection... ...is always carried out under the supervision of an experienced FMEA facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating...

  4. Simulation of an Electromagnetic Acoustic Transducer Array by Using Analytical Method and FDTD

    Directory of Open Access Journals (Sweden)

    Yuedong Xie

    2016-01-01

    Full Text Available Previously, we developed a method based on FEM and FDTD for the study of an Electromagnetic Acoustic Transducer Array (EMAT). This paper presents a new analytical solution to the eddy current problem for the meander coil used in an EMAT, which is adapted from the classic Deeds and Dodd solution originally intended for circular coils. The analytical solution resulting from this novel adaptation exploits the large-radius extrapolation and shows several advantages over the finite element method (FEM), especially in the higher frequency regime. The calculated Lorentz force density from the analytical EM solver is then coupled to the ultrasonic simulations, which exploit the finite-difference time-domain (FDTD) method to describe the propagation of ultrasound waves, in particular Rayleigh waves. A radiation pattern obtained by applying the Hilbert transform to time-domain waveforms is proposed to characterise the sensor in terms of its beam directivity and field distribution along the steering angle, which can produce performance parameters for an EMAT array, facilitating the optimum design of such sensors.

  5. Applied research on air pollution using nuclear-related analytical techniques

    International Nuclear Information System (INIS)

    1994-01-01

    A co-ordinated research programme (CRP) on applied research on air pollution using nuclear-related techniques is a global CRP which will run from 1992-1996, and will build upon the experience gained by the Agency from the laboratory support that it has been providing for several years to BAPMoN - the Background Air Pollution Monitoring Network programme organized under the auspices of the World Meteorological Organization. The purpose of this CRP is to promote the use of nuclear analytical techniques in air pollution studies, e.g. NAA, XRF, and PIXE, for the analysis of toxic and other trace elements in suspended particulate matter (including air filter samples), rainwater and fog-water samples, and in biological indicators of air pollution (e.g. lichens and mosses). The main purposes of the core programme are i) to support the use of nuclear and nuclear-related analytical techniques for practically-oriented research and monitoring studies on air pollution, ii) to identify major sources of air pollution affecting each of the participating countries, with particular reference to toxic heavy metals, and iii) to obtain comparative data on pollution levels in areas of high pollution (e.g. a city centre or a populated area downwind of a large pollution source) and low pollution (e.g. rural areas). This document reports the discussions held during the first Research Co-ordination Meeting (RCM) for the CRP, which took place at the IAEA Headquarters in Vienna. Refs, figs and tabs

  6. Use of reference materials for validating analytical methods. Applied to the determination of As, Co, Na, Hg, Se and Fe using neutron activation analysis

    International Nuclear Information System (INIS)

    Munoz, L; Andonie, O; Kohnenkamp, I

    2000-01-01

    The main purpose of an analytical laboratory is to provide reliable information on the nature and composition of the materials submitted for analysis. This purpose can only be attained if analytical methodologies that have the attributes of accuracy, precision, specificity and sensitivity, among others, are used. The process by which these attributes are evaluated is called validation of the analytical method. The Chilean Nuclear Energy Commission's Neutron Activation Analysis Laboratory is applying a quality assurance program to ensure the quality of its analytical results, which also aims to attain accreditation for some of its measurements. Validation of the analytical methodologies used is an essential part of applying this program. There are many forms of validation, from comparison with reference techniques to participation in inter-comparison rounds. Certified reference materials were used in this work in order to validate the application of neutron activation analysis in determining As, Co, Na, Hg, Se and Fe in shellfish samples. The use of reference materials was chosen because it is a simple option that easily detects sources of systematic error. Neutron activation analysis is an instrumental analytical method that needs no chemical treatment and is based on processes that take place in the nuclei of atoms, making matrix effects unimportant, so that different biological reference materials can be used. The following certified reference materials were used for validating the method: BCR - human hair 397, NRCC - dogfish muscle DORM-2, NRCC - dogfish liver DOLT-2, NIST - oyster tissue 1566, NIES - mussel 6 and BCR - tuna fish 464. The reference materials were analyzed using the procedure developed for the shellfish samples and the above-mentioned elements were determined. With the results obtained, the parameters of accuracy, precision, detection limit, quantification limit and uncertainty associated with the method were determined for each

  7. Edge detection of magnetic anomalies using analytic signal of tilt angle (ASTA)

    Science.gov (United States)

    Alamdar, K.; Ansari, A. H.; Ghorbani, A.

    2009-04-01

    The magnetic method is a commonly used geophysical technique to identify and image potential subsurface targets. Interpretation of magnetic anomalies is a complex process due to the superposition of multiple magnetic sources, the presence of geologic and cultural noise, and acquisition and positioning errors. Both the vertical and horizontal derivatives of potential field data are useful: the horizontal derivative enhances edges, whereas the vertical derivative narrows the width of an anomaly and so locates source bodies more accurately. The vertical and horizontal derivatives of the magnetic field can be combined into the analytic signal, which is independent of the body magnetization direction and whose maximum value lies directly over the edges of the body. The tilt angle is a phase-based filter, defined as the angle between the vertical derivative and the total horizontal derivative. Tilt angle values range from +90 to -90 degrees, and the zero value lies over the body edge. One disadvantage of this filter is that for deep sources the detected edge is blurred. To overcome this problem many authors have introduced new filters, such as the total horizontal derivative of the tilt angle or the vertical derivative of the tilt angle, but because these filters use high-order derivatives the results may be too noisy. Combining the analytic signal and the tilt angle produces a new filter, termed ASTA, whose maximum value lies directly over the body edge; it delineates the body edge more easily than the tilt angle and without its complexity. In this work the new filter has been demonstrated on magnetic data from an area in the Sar-Cheshmeh region in Iran. This area, located at 55 degrees longitude and 32 degrees latitude, is a copper potential region whose main formations are andesite and trachyandesite. Magnetic surveying was employed to separate the boundaries of the andesite and trachyandesite from the adjacent area. In this regard a variety of filters, such as the analytic signal, tilt angle and ASTA filter, have been applied…
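The two building blocks of the ASTA filter, the tilt angle and the analytic signal amplitude, are simple to compute once the field derivatives are available on a grid. The sketch below is a minimal illustration (the function names are ours, and the exact ASTA combination follows the paper, not this code), assuming `dx`, `dy`, `dz` are arrays of the horizontal and vertical derivatives:

```python
import numpy as np

def tilt_angle(dx, dy, dz):
    """Tilt angle: angle between the vertical derivative dz and the total
    horizontal derivative sqrt(dx^2 + dy^2); ranges from -90 to +90 degrees,
    with zero over the body edge."""
    th = np.hypot(dx, dy)           # total horizontal derivative (>= 0)
    return np.arctan2(dz, th)       # radians in [-pi/2, +pi/2]

def analytic_signal_amplitude(dx, dy, dz):
    """Amplitude of the 3-D analytic signal; independent of the body
    magnetization direction, peaking over body edges."""
    return np.sqrt(dx**2 + dy**2 + dz**2)
```

Because `th` is non-negative, `arctan2` confines the tilt angle to the ±90 degree range quoted in the abstract.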

  8. Improvements in Off Design Aeroengine Performance Prediction Using Analytic Compressor Map Interpolation

    Science.gov (United States)

    Mist'e, Gianluigi Alberto; Benini, Ernesto

    2012-06-01

    Compressor map interpolation is usually performed through the introduction of auxiliary coordinates (β). In this paper, a new analytical bivariate β function definition to be used in compressor map interpolation is studied. The function has user-defined parameters that must be adjusted to properly fit to a single map. The analytical nature of β allows for rapid calculations of the interpolation error estimation, which can be used as a quantitative measure of interpolation accuracy and also as a valid tool to compare traditional β function interpolation with new approaches (artificial neural networks, genetic algorithms, etc.). The quality of the method is analyzed by comparing the error output to the one of a well-known state-of-the-art methodology. This comparison is carried out for two different types of compressor and, in both cases, the error output using the method presented in this paper is found to be consistently lower. Moreover, an optimization routine able to locally minimize the interpolation error by shape variation of the β function is implemented. Further optimization introducing other important criteria is discussed.

  9. Freedom: A Promise of Possibility.

    Science.gov (United States)

    Bunkers, Sandra Schmidt

    2015-10-01

    The idea of freedom as a promise of possibility is explored in this column. The core concepts from a research study on considering tomorrow (Bunkers, 1998) coupled with humanbecoming community change processes (Parse, 2003) are used to illuminate this notion. The importance of intentionality in human freedom is discussed from both a human science and a natural science perspective. © The Author(s) 2015.

  10. Locally analytic vectors in representations of locally

    CERN Document Server

    Emerton, Matthew J

    2017-01-01

    The goal of this memoir is to provide the foundations for the locally analytic representation theory that is required in three of the author's other papers on this topic. In the course of writing those papers the author found it useful to adopt a particular point of view on locally analytic representation theory: namely, regarding a locally analytic representation as being the inductive limit of its subspaces of analytic vectors (of various "radii of analyticity"). The author uses the analysis of these subspaces as one of the basic tools in his study of such representations. Thus in this memoir he presents a development of locally analytic representation theory built around this point of view. The author has made a deliberate effort to keep the exposition reasonably self-contained and hopes that this will be of some benefit to the reader.

  11. User-Centered Evaluation of Visual Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Scholtz, Jean C.

    2017-10-01

    Visual analytics systems are becoming very popular. More domains now use interactive visualizations to analyze the ever-increasing amount and heterogeneity of data. More novel visualizations are being developed for more tasks and users. We need to ensure that these systems can be evaluated to determine that they are both useful and usable. A user-centered evaluation for visual analytics needs to be developed for these systems. While many of the typical human-computer interaction (HCI) evaluation methodologies can be applied as is, others will need modification. Additionally, new functionality in visual analytics systems needs new evaluation methodologies. There is a difference between usability evaluations and user-centered evaluations. Usability looks at the efficiency, effectiveness, and user satisfaction of users carrying out tasks with software applications. User-centered evaluation looks more specifically at the utility provided to the users by the software. This is reflected in the evaluations done and in the metrics used. In the visual analytics domain this is very challenging as users are most likely experts in a particular domain, the tasks they do are often not well defined, the software they use needs to support large amounts of different kinds of data, and often the tasks last for months. These difficulties are discussed more in the section on User-centered Evaluation. Our goal is to provide a discussion of user-centered evaluation practices for visual analytics, including existing practices that can be carried out and new methodologies and metrics that need to be developed and agreed upon by the visual analytics community. The material provided here should be of use for both researchers and practitioners in the field of visual analytics. 
Researchers and practitioners in HCI who are interested in visual analytics will also find this information useful, along with a discussion of the changes that need to be made to current HCI practices to make them more suitable to…

  12. Hazardous Waste Landfill Siting using GIS Technique and Analytical Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Ozeair Abessi

    2010-07-01

    Full Text Available Disposal of the large amounts of hazardous waste generated in power plants has always received the attention of communities and authorities. In this paper, using a site screening method and the Analytical Hierarchy Process (AHP), a sophisticated approach for siting a hazardous waste landfill in large areas is presented. This approach demonstrates how evaluation criteria such as physical, socio-economical, technical and environmental criteria, and their regulatory sub-criteria, can be introduced into an overlay technique to screen a limited number of appropriate zones in the area. Then, in order to find the optimal site among the primary screened sites, a Multiple Criteria Decision Making (MCDM) method for the hierarchy computations of the process is recommended. The introduced method enables an accurate siting procedure for environmental planning of landfills in an area. In this study the approach was utilized for disposal of hazardous wastes of the Shahid Rajaee thermal power plant, located in Qazvin province in the west-central part of Iran. As a result, 10 suitable zones were first screened in the area; then, using the analytical hierarchy process, a site near the power plant was chosen as the optimal site for landfilling of the hazardous wastes in Qazvin province.
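The hierarchy computations at the core of AHP reduce to extracting priority weights from a pairwise comparison matrix and checking its consistency. A minimal sketch follows, with an entirely hypothetical comparison matrix for three siting criteria (the paper's actual criteria, judgments and hierarchy are not reproduced here):

```python
import numpy as np

# Hypothetical pairwise comparisons of three siting criteria
# (e.g. environmental vs socio-economic vs technical); values illustrative.
A = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   3.0],
              [1/5.0, 1/3.0, 1.0]])

def ahp_weights(A):
    """Priority weights from the principal eigenvector, plus Saaty's
    consistency ratio (CR < 0.1 is conventionally acceptable)."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w = w / w.sum()                       # normalized priority vector
    n = A.shape[0]
    ci = (vals.real[k] - n) / (n - 1)     # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # random index for n criteria
    return w, ci / ri

w, cr = ahp_weights(A)
```

Sub-criteria weights are obtained the same way at each level of the hierarchy and multiplied down to score the candidate sites.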

  13. An active learning representative subset selection method using net analyte signal

    Science.gov (United States)

    He, Zhonghai; Ma, Zhenhe; Luan, Jingmin; Cai, Xi

    2018-05-01

    To guarantee accurate predictions, representative samples are needed when building a calibration model for spectroscopic measurements. However, in general, it is not known whether a sample is representative prior to measuring its concentration, which is both time-consuming and expensive. In this paper, a method to determine whether a sample should be selected into a calibration set is presented. The selection is based on the difference of Euclidean norm of net analyte signal (NAS) vector between the candidate and existing samples. First, the concentrations and spectra of a group of samples are used to compute the projection matrix, NAS vector, and scalar values. Next, the NAS vectors of candidate samples are computed by multiplying projection matrix with spectra of samples. Scalar value of NAS is obtained by norm computation. The distance between the candidate set and the selected set is computed, and samples with the largest distance are added to selected set sequentially. Last, the concentration of the analyte is measured such that the sample can be used as a calibration sample. Using a validation test, it is shown that the presented method is more efficient than random selection. As a result, the amount of time and money spent on reference measurements is greatly reduced.
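The selection loop described above can be sketched compactly. This is not the authors' implementation: the scalar NAS shortcut via the regression vector and the greedy max-min step are our illustrative assumptions about one reasonable reading of the procedure.

```python
import numpy as np

def nas_scores(X, b):
    """Scalar net-analyte-signal values: projection of each spectrum
    (rows of X) onto the direction of a regression vector b -- a common
    shortcut for the NAS norm (illustrative assumption)."""
    return X @ b / np.linalg.norm(b)

def select_representative(scores, selected_idx, n_to_add):
    """Greedy max-min selection: repeatedly add the candidate whose NAS
    value is farthest from its closest already-selected neighbour, so the
    calibration set spans the NAS range before any reference analysis."""
    selected = list(selected_idx)
    candidates = [i for i in range(len(scores)) if i not in selected]
    for _ in range(n_to_add):
        dists = [min(abs(scores[i] - scores[j]) for j in selected)
                 for i in candidates]
        best = candidates[int(np.argmax(dists))]
        selected.append(best)
        candidates.remove(best)
    return selected
```

Only the samples returned by the selection step then need a reference concentration measurement, which is where the time and cost saving comes from.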

  14. Using Learning Analytics to Predict (and Improve) Student Success: A Faculty Perspective

    Science.gov (United States)

    Dietz-Uhler, Beth; Hurn, Janet E.

    2013-01-01

    Learning analytics is receiving increased attention, in part because it offers to assist educational institutions in increasing student retention, improving student success, and easing the burden of accountability. Although these large-scale issues are worthy of consideration, faculty might also be interested in how they can use learning analytics…

  15. Analytical Plan for Roman Glasses

    Energy Technology Data Exchange (ETDEWEB)

    Strachan, Denis M.; Buck, Edgar C.; Mueller, Karl T.; Schwantes, Jon M.; Olszta, Matthew J.; Thevuthasan, Suntharampillai; Heeren, Ronald M.

    2011-01-01

    Roman glasses that have been in the sea or underground for about 1800 years can serve as the independent “experiment” that is needed for validation of codes and models that are used in performance assessment. Two sets of Roman-era glasses have been obtained for this purpose. One set comes from the sunken vessel the Iulia Felix; the second from recently excavated glasses from a Roman villa in Aquileia, Italy. The specimens contain glass artifacts and attached sediment or soil. In the case of the Iulia Felix glasses quite a lot of analytical work has been completed at the University of Padova, but from an archaeological perspective. The glasses from Aquileia have not been so carefully analyzed, but they are similar to other Roman glasses. Both glass and sediment or soil need to be analyzed and are the subject of this analytical plan. The glasses need to be analyzed with the goal of validating the model used to describe glass dissolution. The sediment and soil need to be analyzed to determine the profile of elements released from the glass. This latter need represents a significant analytical challenge because of the trace quantities that need to be analyzed. Both pieces of information will yield important information useful in the validation of the glass dissolution model and the chemical transport code(s) used to determine the migration of elements once released from the glass. In this plan, we outline the analytical techniques that should be useful in obtaining the needed information and suggest a useful starting point for this analytical effort.

  16. Transport methods: general. 1. The Analytical Monte Carlo Method for Radiation Transport Calculations

    International Nuclear Information System (INIS)

    Martin, William R.; Brown, Forrest B.

    2001-01-01

    We present an alternative Monte Carlo method for solving the coupled equations of radiation transport and material energy. This method is based on incorporating the analytical solution to the material energy equation directly into the Monte Carlo simulation for the radiation intensity. This method, which we call the Analytical Monte Carlo (AMC) method, differs from the well known Implicit Monte Carlo (IMC) method of Fleck and Cummings because there is no discretization of the material energy equation since it is solved as a by-product of the Monte Carlo simulation of the transport equation. Our method also differs from the method recently proposed by Ahrens and Larsen since they use Monte Carlo to solve both equations, while we are solving only the radiation transport equation with Monte Carlo, albeit with effective sources and cross sections to represent the emission sources. Our method bears some similarity to a method developed and implemented by Carter and Forest nearly three decades ago, but there are substantive differences. We have implemented our method in a simple zero-dimensional Monte Carlo code to test the feasibility of the method, and the preliminary results are very promising, justifying further extension to more realistic geometries. (authors)

  17. Evaluation of gamma dose effect on PIN photodiode using analytical model

    Science.gov (United States)

    Jafari, H.; Feghhi, S. A. H.; Boorboor, S.

    2018-03-01

    The PIN silicon photodiodes are widely used in applications that may be found in radiation environments, such as space missions, medical imaging and non-destructive testing. Radiation-induced damage in these devices degrades the photodiode parameters. In this work, we have used a new approach to evaluate gamma dose effects on a commercial PIN photodiode (BPX65) based on an analytical model. In this approach, the NIEL parameter has been calculated for gamma rays from a 60Co source by GEANT4. The radiation damage mechanisms have been considered by solving numerically the Poisson and continuity equations with the appropriate boundary conditions, parameters and physical models. Defects caused by radiation in silicon have been formulated in terms of the damage coefficient for the minority carriers' lifetime. The gamma-induced degradation parameters of the silicon PIN photodiode have been analyzed in detail, and the results were compared with experimental measurements as well as with the results of the ATLAS semiconductor simulator to verify and parameterize the analytical model calculations. The results showed reasonable agreement for the BPX65 silicon photodiode irradiated by a 60Co gamma source at total doses up to 5 kGy under different reverse voltages.
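The damage-coefficient formulation for the minority-carrier lifetime mentioned above is conventionally written 1/τ = 1/τ₀ + K·D. As a minimal sketch (the function name and all numerical values are illustrative placeholders, not parameters from the paper):

```python
def degraded_lifetime(tau0, k_damage, dose):
    """Minority-carrier lifetime after irradiation, using the standard
    damage-coefficient relation 1/tau = 1/tau0 + K * dose.
    tau0, k_damage and dose values are illustrative placeholders."""
    return 1.0 / (1.0 / tau0 + k_damage * dose)
```

Feeding the degraded lifetime back into the continuity equations is what links the dose to the photodiode's electrical degradation in such a model.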

  18. The legal and ethical concerns that arise from using complex predictive analytics in health care.

    Science.gov (United States)

    Cohen, I Glenn; Amarasingham, Ruben; Shah, Anand; Xie, Bin; Lo, Bernard

    2014-07-01

    Predictive analytics, or the use of electronic algorithms to forecast future events in real time, makes it possible to harness the power of big data to improve the health of patients and lower the cost of health care. However, this opportunity raises policy, ethical, and legal challenges. In this article we analyze the major challenges to implementing predictive analytics in health care settings and make broad recommendations for overcoming challenges raised in the four phases of the life cycle of a predictive analytics model: acquiring data to build the model, building and validating it, testing it in real-world settings, and disseminating and using it more broadly. For instance, we recommend that model developers implement governance structures that include patients and other stakeholders starting in the earliest phases of development. In addition, developers should be allowed to use already collected patient data without explicit consent, provided that they comply with federal regulations regarding research on human subjects and the privacy of health information. Project HOPE—The People-to-People Health Foundation, Inc.

  19. An analytic thomism?

    Directory of Open Access Journals (Sweden)

    Daniel Alejandro Pérez Chamorro.

    2012-12-01

    Full Text Available For 50 years the philosophers of the Anglo-Saxon analytic tradition (E. Anscombe, P. Geach, A. Kenny, P. Foot have tried to follow the school of Thomas Aquinas, which they use as a source to surpass Cartesian epistemology and to develop virtue ethics. Recently, J. Haldane has inaugurated a program of “analytical thomism” whose main result to date has been his “theory of identity mind/world”. Nevertheless, none of Thomas’ admirers has yet found the means of assimilating his metaphysics of being.

  20. Using Big Data Analytics to Advance Precision Radiation Oncology.

    Science.gov (United States)

    McNutt, Todd R; Benedict, Stanley H; Low, Daniel A; Moore, Kevin; Shpitser, Ilya; Jiang, Wei; Lakshminarayanan, Pranav; Cheng, Zhi; Han, Peijin; Hui, Xuan; Nakatsugawa, Minoru; Lee, Junghoon; Moore, Joseph A; Robertson, Scott P; Shah, Veeraj; Taylor, Russ; Quon, Harry; Wong, John; DeWeese, Theodore

    2018-06-01

    Big clinical data analytics as a primary component of precision medicine is discussed, identifying where these emerging tools fit in the spectrum of genomics and radiomics research. A learning health system (LHS) is conceptualized that uses clinically acquired data with machine learning to advance the initiatives of precision medicine. The LHS is comprehensive and can be used for clinical decision support, discovery, and hypothesis derivation. These developing uses can positively impact the ultimate management and therapeutic course for patients. The conceptual model for each use of clinical data, however, is different, and an overview of the implications is discussed. With advancements in technologies and culture to improve the efficiency, accuracy, and breadth of measurements of the patient condition, the concept of an LHS may be realized in precision radiation therapy. Copyright © 2018 Elsevier Inc. All rights reserved.

  1. Is a Nuclear Deal with Iran Possible? An Analytical Framework for the Iran Nuclear Negotiations

    OpenAIRE

    Sebenius, James Kimble; Singh, Michael K.

    2012-01-01

    Varied diplomatic approaches by multiple negotiators over several years have failed to conclude a nuclear deal with Iran. Mutual hostility, misperception, and flawed diplomacy may be responsible. Yet, more fundamentally, no mutually acceptable deal may exist. To assess this possibility, a "negotiation analytic" framework conceptually disentangles two issues: 1) whether a feasible deal exists and 2) how to design the most promising process to achieve one. Focusing on whether a "zone of possibl...

  2. Writing analytic element programs in Python.

    Science.gov (United States)

    Bakker, Mark; Kelson, Victor A

    2009-01-01

    The analytic element method is a mesh-free approach for modeling ground water flow at both the local and the regional scale. With the advent of the Python object-oriented programming language, it has become relatively easy to write analytic element programs. In this article, an introduction is given of the basic principles of the analytic element method and of the Python programming language. A simple, yet flexible, object-oriented design is presented for analytic element codes using multiple inheritance. New types of analytic elements may be added without the need for any changes in the existing part of the code. The presented code may be used to model flow to wells (with either a specified discharge or drawdown) and streams (with a specified head). The code may be extended by any hydrogeologist with a healthy appetite for writing computer code to solve more complicated ground water flow problems. Copyright © 2009 The Author(s). Journal Compilation © 2009 National Ground Water Association.
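The object-oriented superposition idea described above can be illustrated in a few lines. This is not the code from the article (which uses multiple inheritance and richer elements); it is a minimal sketch with assumed class names, using the logarithmic discharge potential of a well and a uniform-flow element:

```python
import numpy as np

class Element:
    """Base class: each analytic element contributes a potential."""
    def potential(self, x, y):
        raise NotImplementedError

class UniformFlow(Element):
    """Uniform background flow in the x-direction."""
    def __init__(self, gradient):
        self.gradient = gradient
    def potential(self, x, y):
        return -self.gradient * x

class Well(Element):
    """Well with discharge Q at (xw, yw); Thiem-type log potential."""
    def __init__(self, xw, yw, Q):
        self.xw, self.yw, self.Q = xw, yw, Q
    def potential(self, x, y):
        r = np.hypot(x - self.xw, y - self.yw)
        return self.Q / (2 * np.pi) * np.log(r)

class Model:
    """Mesh-free superposition: total potential is the sum over elements."""
    def __init__(self, elements):
        self.elements = elements
    def potential(self, x, y):
        return sum(e.potential(x, y) for e in self.elements)

model = Model([UniformFlow(0.001), Well(0.0, 0.0, 100.0)])
```

New element types plug in by subclassing `Element`, which mirrors the article's point that elements can be added without changing the existing code.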

  3. Using Photocatalytic Oxidation and Analytic Techniques to Remediate Lab Wastewater Containing Methanol

    Science.gov (United States)

    Xiong, Qing; Luo, Mingliang; Bao, Xiaoming; Deng, Yurong; Qin, Song; Pu, Xuemei

    2018-01-01

    This experiment is dedicated to second-year and above undergraduates who are in their experimental session of the analytical chemistry course. Grouped students are required to use a TiO[subscript 2] photocatalytic oxidation process to treat the methanol-containing wastewater that resulted from their previous HPLC experiments. Students learn to…

  4. Using Google Tag Manager and Google Analytics to track DSpace metadata fields as custom dimensions

    Directory of Open Access Journals (Sweden)

    Suzanna Conrad

    2015-01-01

    Full Text Available DSpace can be problematic for those interested in tracking download and pageview statistics granularly. Some libraries have implemented code to track events on websites and some have experimented with using Google Tag Manager to automate event tagging in DSpace. While these approaches make it possible to track download statistics, granular details such as authors, content types, titles, advisors, and other fields for which metadata exist are generally not tracked in DSpace or Google Analytics without coding. Moreover, it can be time consuming to track and assess pageview data and relate that data back to particular metadata fields. This article will detail the learning process of incorporating custom dimensions for tracking these detailed fields including trial and error attempts to use the data import function manually in Google Analytics, to automate the data import using Google APIs, and finally to automate the collection of dimension data in Google Tag Manager by mimicking SEO practices for capturing meta tags. This specific case study refers to using Google Tag Manager and Google Analytics with DSpace; however, this method may also be applied to other types of websites or systems.

  5. Hair elemental analysis for forensic science using nuclear and related analytical methods

    Czech Academy of Sciences Publication Activity Database

    Kučera, Jan; Kameník, Jan; Havránek, Vladimír

    2018-01-01

    Roč. 7, č. 3 (2018), s. 65-74 ISSN 2468-1709 R&D Projects: GA ČR(CZ) GBP108/12/G108; GA MŠk LM2015056 Institutional support: RVO:61389005 Keywords : hair * forensic analysis * neutron activation analysis * particle induced X-ray emission Subject RIV: CB - Analytical Chemistry, Separation OBOR OECD: Analytical chemistry

  6. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by a Butyrylcholinesterase Activity Assay.

    Science.gov (United States)

    Pohanka, Miroslav

    2015-06-11

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated against Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.). It can be concluded that the assay should be of practical applicability because of the results' relevance.
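The RGB step amounts to averaging each color channel over the photographed spot and relating the relevant channel to activity via a calibration. A minimal sketch follows (a synthetic array stands in for the photo; in practice the array could come from the camera image, e.g. `np.asarray(PIL.Image.open(f))` — an assumed workflow, not the paper's code):

```python
import numpy as np

def mean_rgb(image):
    """Mean value of each channel of an H x W x 3 RGB image array."""
    img = np.asarray(image, dtype=float)
    return img.reshape(-1, 3).mean(axis=0)   # (R, G, B)

# Synthetic stand-in for a photographed indigo-blue spot:
# strong blue channel, weak red, no green (values illustrative).
spot = np.zeros((4, 4, 3))
spot[..., 2] = 200.0
spot[..., 0] = 30.0
r, g, b = mean_rgb(spot)
```

A calibration curve fitted on standards of known BChE activity would then map the chosen channel value to an activity result.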

  7. The Usefulness of Analytical Procedures - An Empirical Approach in the Auditing Sector in Portugal

    Directory of Open Access Journals (Sweden)

    Carlos Pinho

    2014-08-01

    Full Text Available The conceptual conflict between efficiency and efficacy in financial auditing arises from the fact that resources are scarce, both in terms of the time available to carry out the audit and the quality and timeliness of the information available to the external auditor. Audits tend to be more efficient the lower the combination of inherent risk and control risk is assessed to be, allowing the auditor to carry out less extensive and less timely auditing tests, meaning that in some cases analytical audit procedures are a good tool to support the opinions formed by the auditor. This research, by means of an empirical study of financial auditing in Portugal, aims to evaluate the extent to which analytical procedures are used during a financial audit engagement in Portugal, throughout the different phases of the audit. The conclusions point to the fact that, in general terms and regardless of the size of the audit company and the way in which professionals work, Portuguese auditors use analytical procedures more frequently during the planning phase than during the phases of evidence gathering and opinion formation.

  8. Evaluation of challenges of wood imports to Iran using Fuzzy Delphi Analytical Hierarchy Process

    Directory of Open Access Journals (Sweden)

    amin arian

    2017-08-01

    Full Text Available Abstract: Considering the increasing consumption of wood and wood products in Iran, the limited domestic sources of wood, and the shortage of raw wood material, imports of raw wood are one solution for the wood procurement of Iran's developing wood industries. However, wood imports to Iran have always faced many challenges. The aim of this research is to determine and evaluate the challenges facing wood imports to Iran. The research method used in this study is descriptive-analytic. The analytic method used to evaluate the challenges is the Fuzzy Delphi Analytical Hierarchy Process (FDAHP). First, the findings of previous research in the field and the literature were studied; then, through interviews with industry experts, the challenges facing wood imports to Iran were extracted, classified into 5 groups and 35 factors, and evaluated. The results show that at the first level (groups), the regulation, economic, political, infrastructure and management groups have the most importance, respectively. At the second level (challenges), plant protection regulations have the most importance; after that, exchange rate tolerance, oil income, banking support and GDP have the most importance, respectively.

  9. Analytic American Option Pricing and Applications

    OpenAIRE

    Sbuelz, A.

    2003-01-01

    I use a convenient value breakdown in order to obtain analytic solutions for finite-maturity American option prices. Such a barrier-option-based breakdown yields an analytic lower bound for the American option price, which is as price-tight as the Barone-Adesi and Whaley (1987) analytic value proxy for short and medium maturities, and exhibits good convergence to the Merton (1973) perpetual option price for large maturities.
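    The large-maturity benchmark cited above, Merton's (1973) perpetual American put on a non-dividend-paying asset, has a well-known closed form and can be sketched directly (the input values below are illustrative, not from the paper):

```python
# Merton (1973) perpetual American put (no dividends):
# gamma = 2r / sigma^2, early-exercise boundary S* = gamma*K / (gamma + 1),
# and for S > S* the price is (K - S*) * (S / S*) ** (-gamma).

def perpetual_american_put(S, K, r, sigma):
    gamma = 2.0 * r / sigma ** 2
    s_star = gamma * K / (gamma + 1.0)   # early-exercise boundary
    if S <= s_star:                      # immediate exercise is optimal
        return K - S
    return (K - s_star) * (S / s_star) ** (-gamma)

price = perpetual_american_put(S=100.0, K=100.0, r=0.05, sigma=0.2)
```

    The finite-maturity analytic lower bounds discussed in the record converge to this value as maturity grows.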

  10. Local and global trust based on the concept of promises

    NARCIS (Netherlands)

    Bergstra, J.; Burgess, M.

    2009-01-01

    We use the notion of a promise to define local trust between agents possessing autonomous decision-making. An agent is trustworthy if it is expected that it will keep a promise. This definition satisfies most commonplace meanings of trust. Reputation is then an estimation of this expectation value.

  11. Medical Data Analytics Is Not a Simple Task.

    Science.gov (United States)

    Babič, František; Vadovský, Michal; Paralič, Ján

    2018-01-01

    Data analytics represents a new chance for medical diagnosis and treatment to become more effective and successful. This expectation is not as easy to achieve as it may look at first glance. Medical experts, doctors and general practitioners have their own vocabulary; they use specific terms and a specific style of speaking. On the other hand, data analysts have to understand the task and select the right algorithms. The applicability of the results depends on the effectiveness of the interaction between these two worlds. This paper presents our experiences with various medical data samples in the form of a SWOT analysis. We identified the most important input attributes for the target diagnosis, or extracted decision rules and analysed their interestingness with the cooperating doctors, looking for promising new cut-off values or investigating possible important relations hidden in the data samples. In general, this type of knowledge can be used for clinical decision support, but it has to be evaluated on different samples, under different conditions and, ideally, in long-term studies. Sometimes the interaction needed much more time than we expected at the beginning, but our experiences are mostly positive.

  12. Curriculum Innovation for Marketing Analytics

    Science.gov (United States)

    Wilson, Elizabeth J.; McCabe, Catherine; Smith, Robert S.

    2018-01-01

    College graduates need better preparation for and experience in data analytics for higher-quality problem solving. Using the curriculum innovation framework of Borin, Metcalf, and Tietje (2007) and case study research methods, we offer rich insights about one higher education institution's work to address the marketing analytics skills gap.…

  13. Analytical procedure in aseismic design of eccentric structure using response spectrum

    International Nuclear Information System (INIS)

    Takemori, T.; Kuwabara, Y.; Suwabe, A.; Mitsunobu, S.

    1977-01-01

    In this paper, the responses are evaluated by the following two methods, using typical torsional analytical models in which the masses, rigidities, eccentricities between their centers, and several actual earthquake waves are taken as parameters: (1) the root mean square of responses using the response spectra derived from the earthquake waves, and (2) time history analysis using the earthquake waves. The earthquake waves used were chosen to present different frequency contents and magnitudes of the response spectra. The typical results derived from the study are as follows: (a) the response accelerations of the mass center in the input earthquake direction by method (1) coincide comparatively well with those by method (2); (b) the response accelerations perpendicular to the input earthquake direction by method (1) are 2 to 3 times those by method (2); (c) the amplifications of the response accelerations at arbitrary points distributed on the spread mass, relative to those of the center of the lumped mass, by method (1) are remarkably large compared with those by method (2) in both directions. These problems with the response spectrum analysis of the above-mentioned eccentric structure are discussed, and an improved analytical method, applying the amplification coefficients of responses derived from this parametric time history analysis, is proposed for actual seismic design using the given design ground response spectrum with the root-mean-square technique.

  14. An Accurate Approximate-Analytical Technique for Solving Time-Fractional Partial Differential Equations

    Directory of Open Access Journals (Sweden)

    M. Bishehniasar

    2017-01-01

    Full Text Available The demand of many scientific areas for the use of fractional partial differential equations (FPDEs) to describe their real-world systems has been broadly recognized. The solutions may portray the dynamical behavior of various particles, such as chemicals and cells. The desire to obtain approximate solutions of these equations aims to overcome the mathematical complexity of modeling the relevant phenomena in nature. This research proposes a promising approximate-analytical scheme that is an accurate technique for solving a variety of noninteger partial differential equations (PDEs). The proposed strategy is based on approximating the derivative of fractional order and reducing the problem to a corresponding partial differential equation (PDE). Afterwards, the approximating PDE is solved by using a separation-of-variables technique. The method can be simply applied to nonhomogeneous problems and is able to reduce the computational cost while achieving an approximate-analytical solution in excellent agreement with the exact solution of the original problem. In addition, to demonstrate its efficiency, the method is compared with two finite difference methods, the nonstandard finite difference (NSFD) method and the standard finite difference (SFD) technique, which are popular in the literature for solving engineering problems.
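    The separation-of-variables step can be illustrated on the classical integer-order special case, the heat equation u_t = u_xx on [0, 1] with homogeneous Dirichlet boundaries, which is the kind of PDE the fractional problem is reduced to. This is a generic textbook sketch, not the paper's scheme, and the initial condition is chosen only for illustration:

```python
import math

# Separation-of-variables series for u_t = u_xx on [0, 1], u(0,t)=u(1,t)=0:
#   u(x, t) = sum_n b_n sin(n pi x) exp(-(n pi)^2 t),
#   b_n = 2 * integral_0^1 f(x) sin(n pi x) dx.

def fourier_sine_coeff(f, n, samples=2000):
    """b_n computed with the trapezoidal rule."""
    h = 1.0 / samples
    total = 0.0
    for i in range(samples + 1):
        x = i * h
        w = 0.5 if i in (0, samples) else 1.0
        total += w * f(x) * math.sin(n * math.pi * x)
    return 2.0 * h * total

def heat_solution(f, x, t, terms=25):
    """Truncated separation-of-variables series evaluated at (x, t)."""
    return sum(
        fourier_sine_coeff(f, n) * math.sin(n * math.pi * x)
        * math.exp(-(n * math.pi) ** 2 * t)
        for n in range(1, terms + 1)
    )

# With f(x) = sin(pi x), the exact solution is sin(pi x) * exp(-pi^2 t).
u = heat_solution(lambda x: math.sin(math.pi * x), x=0.5, t=0.01)
```

    For this initial condition only the n = 1 mode is nonzero, so the truncated series reproduces the exact solution to quadrature accuracy.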

  15. Service Users perspectives in PROMISE and research.

    Science.gov (United States)

    Rae, Sarah

    2017-09-01

    Since its inception in 2013, PROMISE (PROactive Management of Integrated Services and Environments) has been supporting service users and staff at the Cambridgeshire and Peterborough NHS Foundation Trust (CPFT) on a journey to reduce reliance on force. The author's own personal experiences led to the founding of PROMISE and illustrate how individual experiences can enable a patient to lead change. Coproduction is actively embedded in PROMISE. Patients have been meaningfully involved because they are innovators and problem solvers who bring an alternative viewpoint by the very nature of their condition. A patient is more than just a person who needs to be 'fixed'; they are individuals with untapped skills and added insight. There have been 2 separate Patient Advisory Groups (PAGs) since the project was first established. The first Patient Advisory Group was recruited to work with the PROMISE researchers on a study which used a participatory qualitative approach. Drawing on their lived experience and different perspectives, the PAG was instrumental in shaping the qualitative study, including the research questions. Their active involvement helped to ensure that the study was sensitively designed, methodologically robust and ethically sound. The 2nd PAG was formed in 2016 to give the project an overall steer. Patients in this group contributed to the work on the 'No' Audit and reviewed several CPFT policies, such as the Seclusion and Segregation policy, which has had an impact on frontline practice. They also made a significant contribution to the study design for a funding application that was submitted by the PROMISE team to the National Institute for Health Research (NIHR). Both PAGs were supported by funding from the East of England Collaboration for Leadership in Applied Health Research and Care (CLAHRC EoE) and were influential in different ways. An evaluation of the 2nd PAG, conducted in June 2017, showed very high satisfaction levels. The free text

  16. Evaluating supplier quality performance using fuzzy analytical hierarchy process

    Science.gov (United States)

    Ahmad, Nazihah; Kasim, Maznah Mat; Rajoo, Shanmugam Sundram Kalimuthu

    2014-12-01

    Evaluating supplier quality performance is vital to ensuring continuous supply chain improvement, reducing operational costs and risks, and meeting customers' expectations. This paper aims to illustrate an application of the Fuzzy Analytical Hierarchy Process to prioritize the evaluation criteria in the context of automotive manufacturing in Malaysia. Five main criteria were identified: quality, cost, delivery, customer service and technology support. These criteria were arranged into a hierarchical structure and evaluated by an expert. The relative importance of each criterion was determined using linguistic variables represented as triangular fuzzy numbers. The Center of Gravity defuzzification method was used to convert the fuzzy evaluations into their corresponding crisp values. Such fuzzy evaluation can be used as a systematic tool to handle the uncertainty in evaluating suppliers' performance that is usually associated with subjective human judgments.
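    The defuzzification step described above can be sketched as follows. The centre of gravity of a triangular fuzzy number (l, m, u) is (l + m + u) / 3; the linguistic scale and the example ratings below are common illustrative choices, not necessarily those used in the paper:

```python
# Triangular fuzzy numbers (l, m, u) for linguistic importance ratings,
# defuzzified with the centre-of-gravity (centroid) method.
# The scale and ratings are illustrative placeholders.

SCALE = {
    "equal":       (1, 1, 3),
    "moderate":    (1, 3, 5),
    "strong":      (3, 5, 7),
    "very strong": (5, 7, 9),
    "extreme":     (7, 9, 9),
}

def centroid(tfn):
    """Centre of gravity of a triangular fuzzy number (l, m, u)."""
    l, m, u = tfn
    return (l + m + u) / 3.0

def crisp_weights(ratings):
    """Defuzzify each criterion's rating and normalise to sum to 1."""
    crisp = {c: centroid(SCALE[r]) for c, r in ratings.items()}
    total = sum(crisp.values())
    return {c: v / total for c, v in crisp.items()}

weights = crisp_weights({
    "quality": "very strong",
    "cost": "strong",
    "delivery": "strong",
    "customer service": "moderate",
    "technology support": "moderate",
})
```

    The resulting crisp weights can then feed the usual AHP aggregation over the criteria hierarchy.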

  17. Analytical model for real time, noninvasive estimation of blood glucose level.

    Science.gov (United States)

    Adhyapak, Anoop; Sidley, Matthew; Venkataraman, Jayanti

    2014-01-01

    The paper presents an analytical model to estimate blood glucose level from measurements made noninvasively and in real time by an antenna strapped to a patient's wrist. The RIT ETA Lab research group has shown promising evidence that an antenna's resonant frequency can track, in real time, changes in glucose concentration. Based on an in-vitro study of blood samples from diabetic patients, the paper presents a modified Cole-Cole model that incorporates a factor representing the change in glucose level. A calibration technique based on the input impedance is discussed, and the results show good agreement with glucose meter readings. An alternate calibration methodology has been developed, based on the shift in the antenna's resonant frequency, using an equivalent circuit model containing a shunt capacitor to represent the shift in resonant frequency with changing glucose levels. Work in progress includes optimizing the technique with a larger sample of patients.
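    The underlying Cole-Cole relaxation model can be sketched as below. The glucose-dependent scaling factor and all parameter values are hypothetical placeholders for illustration; the paper's modified model and fitted parameters for blood are not reproduced here:

```python
# Cole-Cole complex permittivity with a hypothetical glucose-dependent
# scaling of the dispersion magnitude. Parameter values are placeholders.

def cole_cole(omega, eps_inf, delta_eps, tau, alpha, glucose_factor=1.0):
    """eps(w) = eps_inf + g * delta_eps / (1 + (j w tau)^(1 - alpha))."""
    return eps_inf + glucose_factor * delta_eps / (
        1.0 + (1j * omega * tau) ** (1.0 - alpha)
    )

# Placeholder single-dispersion parameters.
eps_inf, delta_eps, tau, alpha = 4.0, 50.0, 8.0e-12, 0.1

low = cole_cole(1.0, eps_inf, delta_eps, tau, alpha)      # near-static limit
high = cole_cole(1.0e14, eps_inf, delta_eps, tau, alpha)  # high-frequency limit
```

    A calibration would then relate the measured dielectric response (or the antenna's resonant-frequency shift it produces) back to the glucose factor.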

  18. Predictive Analytics in Information Systems Research

    OpenAIRE

    Shmueli, Galit; Koppius, Otto

    2011-01-01

    textabstractThis research essay highlights the need to integrate predictive analytics into information systems research and shows several concrete ways in which this goal can be accomplished. Predictive analytics include empirical methods (statistical and other) that generate data predictions as well as methods for assessing predictive power. Predictive analytics not only assist in creating practically useful models, they also play an important role alongside explanatory modeling in theory bu...

  19. Analytic number theory

    CERN Document Server

    Iwaniec, Henryk

    2004-01-01

    Analytic Number Theory distinguishes itself by the variety of tools it uses to establish results, many of which belong to the mainstream of arithmetic. One of the main attractions of analytic number theory is the vast diversity of concepts and methods it includes. The main goal of the book is to show the scope of the theory, both in classical and modern directions, and to exhibit its wealth and prospects, its beautiful theorems and powerful techniques. The book is written with graduate students in mind, and the authors tried to balance between clarity, completeness, and generality. The exercis

  20. Evaluating supplier quality performance using analytical hierarchy process

    Science.gov (United States)

    Kalimuthu Rajoo, Shanmugam Sundram; Kasim, Maznah Mat; Ahmad, Nazihah

    2013-09-01

    This paper elaborates the importance of evaluating supplier quality performance to an organization. Supplier quality performance evaluation reflects the actual performance of the supplier exhibited at the customer's end. It is critical in enabling the organization to determine areas of improvement and thereafter work with the supplier to close the gaps. The customer's success partly depends on the supplier's quality performance. Key criteria such as quality, cost, delivery, technology support and customer service are categorized as the main factors contributing to a supplier's quality performance. Eighteen suppliers manufacturing automotive application parts were evaluated in 2010 using a weighted-point system. A few suppliers received common ratings, which led to tied rankings. The Analytical Hierarchy Process (AHP), a user-friendly decision-making tool for complex, multi-criteria problems, was used to evaluate the suppliers' quality performance, challenging the weighted-point system that had been used for the 18 suppliers. The consistency ratio was checked for the criteria and sub-criteria. The final AHP results contained no overlapping ratings and therefore yielded a better decision-making methodology than the weighted-point rating system.
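    The two AHP steps named above, deriving a priority vector from a pairwise-comparison matrix and checking the consistency ratio, can be sketched as follows. The geometric-mean method and the example matrix are standard textbook choices, not the paper's data:

```python
import math

# AHP priority vector (geometric-mean method) and consistency ratio (CR).
# RANDOM_INDEX holds Saaty's random-index values for small matrices.

RANDOM_INDEX = {3: 0.58, 4: 0.90, 5: 1.12}

def priority_vector(matrix):
    """Normalised geometric means of the rows."""
    gms = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

def consistency_ratio(matrix):
    n = len(matrix)
    w = priority_vector(matrix)
    # Estimate the principal eigenvalue lambda_max from (A w) / w.
    aw = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam_max = sum(aw[i] / w[i] for i in range(n)) / n
    ci = (lam_max - n) / (n - 1)
    return ci / RANDOM_INDEX[n]

# A perfectly consistent 3x3 comparison matrix, so CR should be ~0;
# a CR above about 0.1 is conventionally taken as inconsistent.
A = [[1, 2, 4],
     [0.5, 1, 2],
     [0.25, 0.5, 1]]
w = priority_vector(A)
cr = consistency_ratio(A)
```

    In the paper's setting the same computation would be repeated for the criteria matrix and for each sub-criteria matrix before aggregating supplier scores.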

  1. Case study: IBM Watson Analytics cloud platform as Analytics-as-a-Service system for heart failure early detection

    OpenAIRE

    Guidi, Gabriele; Miniati, Roberto; Mazzola, Matteo; Iadanza, Ernesto

    2016-01-01

    In recent years, progress in technology and the increasing availability of fast connections have produced a migration of functionalities in information technology services, from static servers to distributed technologies. This article describes the main tools available on the market to perform Analytics as a Service (AaaS) using a cloud platform. A use case of IBM Watson Analytics, a cloud system for data analytics, applied to the following research scope, is also described: detect...

  2. 100-B/C Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    R.W. Ovink

    2010-03-18

    This report documents the process used to identify source area target analytes in support of the 100-B/C remedial investigation/feasibility study addendum to DOE/RL-2008-46. This report also establishes the analyte exclusion criteria applicable for 100-B/C use and the analytical methods needed to analyze the target analytes.

  3. Promise Zones for Applicants

    Data.gov (United States)

    Department of Housing and Urban Development — This tool assists applicants to HUD's Promise Zone initiative prepare data to submit with their application by allowing applicants to draw the exact location of the...

  4. The Promise and Perils of Using Big Data in the Study of Corporate Networks

    DEFF Research Database (Denmark)

    Heemskerk, Eelke; Young, Kevin; Takes, Frank W.

    2018-01-01

    …challenges associated with the nature of the subject matter, variable data quality and other problems associated with currently available data on this scale, we discuss the promise and perils of using big corporate network data (BCND). We propose a standard procedure for helping researchers deal with BCND problems. While acknowledging that different research questions require different approaches to data quality, we offer a schematic platform that researchers can follow to make informed and intelligent decisions about BCND issues and address these through a specific work-flow procedure. For each step…

  5. Competing on talent analytics.

    Science.gov (United States)

    Davenport, Thomas H; Harris, Jeanne; Shapiro, Jeremy

    2010-10-01

    Do investments in your employees actually affect workforce performance? Who are your top performers? How can you empower and motivate other employees to excel? Leading-edge companies such as Google, Best Buy, Procter & Gamble, and Sysco use sophisticated data-collection technology and analysis to answer these questions, leveraging a range of analytics to improve the way they attract and retain talent, connect their employee data to business performance, differentiate themselves from competitors, and more. The authors present the six key ways in which companies track, analyze, and use data about their people, ranging from a simple baseline of metrics to monitor the organization's overall health to custom modeling for predicting future head count under various "what if" scenarios. They go on to show that companies competing on talent analytics manage data and technology at an enterprise level, support what analytical leaders do, choose realistic targets for analysis, and hire analysts with strong interpersonal skills as well as broad expertise.

  6. Analytical modeling and analysis of magnetic field and torque for novel axial flux eddy current couplers with PM excitation

    Science.gov (United States)

    Li, Zhao; Wang, Dazhi; Zheng, Di; Yu, Linxin

    2017-10-01

    Rotational permanent magnet eddy current couplers are promising devices for torque and speed transmission without any mechanical contact. In this study, flux-concentration disk-type permanent magnet eddy current couplers with a double conductor rotor are investigated. Given the computational burden of an accurate three-dimensional finite element analysis, this paper proposes a mixed two-dimensional analytical modeling approach. Based on this approach, closed-form expressions for the magnetic field, eddy current, electromagnetic force and torque of such devices are obtained. Finally, a three-dimensional finite element method is employed to validate the analytical results. In addition, a prototype is manufactured and tested for its torque-speed characteristic.

  7. Fluxball magnetic field analysis using a hybrid analytical/FEM/BEM with equivalent currents

    International Nuclear Information System (INIS)

    Fernandes, João F.P.; Camilo, Fernando M.; Machado, V. Maló

    2016-01-01

    In this paper, a fluxball electric machine is analyzed with respect to magnetic flux, force and torque. A novel method is proposed, based on a special hybrid FEM/BEM (Finite Element Method/Boundary Element Method) with equivalent currents, using an analytical treatment for the source field determination. The method can be applied to evaluate the magnetic field in axisymmetric problems in the presence of several magnetic materials. Results obtained with a commercial finite element analysis tool are presented for validation of the proposed method. - Highlights: • The fluxball machine magnetic field is analyzed by a new FEM/BEM/analytical method. • The method is adequate for axisymmetric, non-homogeneous magnetic field problems. • The source magnetic field is evaluated considering a non-magnetic equivalent problem. • Material magnetization vectors are accounted for by using equivalent currents. • A strong reduction of the finite element domain is achieved.

  8. Analytic continuation in perturbative QCD

    International Nuclear Information System (INIS)

    Caprini, Irinel

    2002-01-01

    We discuss some attempts to improve standard perturbative expansion in QCD by using the analytic continuation in the momentum and the Borel complex planes. We first analyse the momentum-plane analyticity properties of the Borel-summed Green functions in perturbative QCD and the connection between the Landau singularities and the infrared renormalons. By using the analytic continuation in the Borel complex plane, we propose a new perturbative series replacing the standard expansion in powers of the normalized coupling constant a. The new expansion functions have branch point and essential singularities at the origin of the complex a-plane and divergent Taylor expansions in powers of a. On the other hand the modified expansion of the QCD correlators is convergent under rather conservative conditions. (author)

  9. Pellet manufacturing by extrusion-spheronization using process analytical technology

    DEFF Research Database (Denmark)

    Sandler, Niklas; Rantanen, Jukka; Heinämäki, Jyrki

    2005-01-01

    The aim of this study was to investigate the phase transitions occurring in nitrofurantoin and theophylline formulations during pelletization by extrusion-spheronization. An at-line process analytical technology (PAT) approach was used to increase the understanding of the solid-state behavior of the active pharmaceutical ingredients (APIs) during pelletization. Raman spectroscopy, near-infrared (NIR) spectroscopy, and X-ray powder diffraction (XRPD) were used in the characterization of polymorphic changes during the process. Samples were collected at the end of each processing stage (blending, granulation, extrusion, spheronization, and drying). Batches were dried at 3 temperature levels (60 degrees C, 100 degrees C, and 135 degrees C). Water induced a hydrate formation in both model formulations during processing. NIR spectroscopy gave valuable real-time data about the state of water in the system

  10. Human eye analytical and mesh-geometry models for ophthalmic dosimetry using MCNP6

    International Nuclear Information System (INIS)

    Angelocci, Lucas V.; Fonseca, Gabriel P.; Yoriyaz, Helio

    2015-01-01

    Eye tumors can be treated with brachytherapy using Co-60 plaques, I-125 seeds, among others materials. The human eye has regions particularly vulnerable to ionizing radiation (e.g. crystalline) and dosimetry for this region must be taken carefully. A mathematical model was proposed in the past [1] for the eye anatomy to be used in Monte Carlo simulations to account for dose distribution in ophthalmic brachytherapy. The model includes the description for internal structures of the eye that were not treated in previous works. The aim of this present work was to develop a new eye model based on the Mesh geometries of the MCNP6 code. The methodology utilized the ABAQUS/CAE (Simulia 3DS) software to build the Mesh geometry. For this work, an ophthalmic applicator containing up to 24 model Amersham 6711 I-125 seeds (Oncoseed) was used, positioned in contact with a generic tumor defined analytically inside the eye. The absorbed dose in eye structures like cornea, sclera, choroid, retina, vitreous body, lens, optical nerve and optical nerve wall were calculated using both models: analytical and MESH. (author)

  11. Human eye analytical and mesh-geometry models for ophthalmic dosimetry using MCNP6

    Energy Technology Data Exchange (ETDEWEB)

    Angelocci, Lucas V.; Fonseca, Gabriel P.; Yoriyaz, Helio, E-mail: hyoriyaz@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    Eye tumors can be treated with brachytherapy using Co-60 plaques, I-125 seeds, among others materials. The human eye has regions particularly vulnerable to ionizing radiation (e.g. crystalline) and dosimetry for this region must be taken carefully. A mathematical model was proposed in the past [1] for the eye anatomy to be used in Monte Carlo simulations to account for dose distribution in ophthalmic brachytherapy. The model includes the description for internal structures of the eye that were not treated in previous works. The aim of this present work was to develop a new eye model based on the Mesh geometries of the MCNP6 code. The methodology utilized the ABAQUS/CAE (Simulia 3DS) software to build the Mesh geometry. For this work, an ophthalmic applicator containing up to 24 model Amersham 6711 I-125 seeds (Oncoseed) was used, positioned in contact with a generic tumor defined analytically inside the eye. The absorbed dose in eye structures like cornea, sclera, choroid, retina, vitreous body, lens, optical nerve and optical nerve wall were calculated using both models: analytical and MESH. (author)

  12. Vibration Based Diagnosis for Planetary Gearboxes Using an Analytical Model

    Directory of Open Access Journals (Sweden)

    Liu Hong

    2016-01-01

    Full Text Available The application of conventional vibration-based diagnostic techniques to planetary gearboxes is challenging because of the complexity of the frequency components in the measured spectrum, which results from the relative motion between the rotating planets and the fixed accelerometer. In practice, since the fault signatures are usually contaminated by noise and by vibrations from other mechanical components of the gearbox, the diagnostic efficacy may deteriorate further. Thus, it is essential to develop a novel vibration-based scheme to diagnose gear failures in planetary gearboxes. Following a brief literature review, the paper begins with the introduction of an analytical model of planetary gear-sets developed by the authors in previous works, which can predict the distinct behaviors of fault-induced sidebands. This analytical model is easy to implement because the only prerequisite information is the basic geometry of the planetary gear-set. Afterwards, an automated diagnostic scheme is proposed to cope with the challenges associated with the characteristic configuration of planetary gearboxes. The proposed vibration-based scheme integrates the analytical model, a denoising algorithm, and frequency-domain indicators into one synergistic system for the detection and identification of damaged gear teeth in planetary gearboxes. Its performance is validated with dynamic simulations and experimental data from a planetary gearbox test rig.
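    The spectral structure such a model predicts can be sketched with two standard relations: for a planetary set with a stationary ring gear, the gear-mesh frequency is the ring tooth count times the carrier rotation frequency, and fault-related components appear as sidebands spaced at multiples of the carrier frequency around the mesh harmonics. The tooth count and speed below are illustrative, not from the paper's test rig:

```python
# Mesh frequency and carrier-spaced sidebands for a planetary gear-set
# with a fixed ring gear. Gear-set numbers are illustrative.

def mesh_frequency(z_ring, f_carrier):
    """Gear-mesh frequency for a planetary set with a stationary ring gear."""
    return z_ring * f_carrier

def sidebands(f_mesh, f_spacing, k_max=3):
    """Sideband frequencies f_mesh +/- k * f_spacing for k = 1..k_max."""
    out = []
    for k in range(1, k_max + 1):
        out.append(f_mesh - k * f_spacing)
        out.append(f_mesh + k * f_spacing)
    return sorted(out)

z_ring, f_carrier = 72, 2.5                 # ring teeth, carrier speed in Hz
f_mesh = mesh_frequency(z_ring, f_carrier)  # mesh frequency in Hz
bands = sidebands(f_mesh, f_carrier)        # candidate fault sidebands
```

    A diagnostic indicator would then compare measured spectral energy at these candidate frequencies against a healthy baseline.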

  13. Analytical Solution and Physics of a Propellant Damping Device

    Science.gov (United States)

    Yang, H. Q.; Peugeot, John

    2011-01-01

    NASA design teams have been investigating options for "detuning" Ares I to prevent oscillations originating in the vehicle's solid-rocket main stage from synching up with the natural resonance of the rest of the vehicle. Experimental work started at the NASA MSFC center in 2008 using a damping device showed great promise in damping the vibration level of an 8 resonant tank. However, the mechanisms of the vibration damping were not well understood, and there were many unknowns such as the physics, scalability, technology readiness level (TRL), and applicability to the Ares I vehicle. The objectives of this study are to understand the physics of the intriguing slosh damping observed in the experiments, to further validate a Computational Fluid Dynamics (CFD) code for propellant sloshing against experiments with water, and to study the applicability and efficiency of the slosh damper for a full-scale propellant tank and for cryogenic fluids. First, a 2D fluid-structure interaction model is built to model the system resonance of liquid sloshing and structural vibration. A damper is then added to the model to simulate the experimentally observed system damping phenomena; qualitative agreement is found. An analytical solution is then derived from Newtonian dynamics for the thrust oscillation damper frequency, and a slave-mass concept is introduced in deriving the damper and tank interaction dynamics. The paper elucidates the fundamental physics behind the LOX damper's success through the derivation of this analytical equation of the lumped Newtonian dynamics. Discussion of simulation results using a high-fidelity, multi-phase, multi-physics, fully coupled CFD-structure interaction model shows why the LOX damper is unique and superior compared to other proposed mitigation techniques.

  14. Simulation of reactive geochemical transport in groundwater using a semi-analytical screening model

    Science.gov (United States)

    McNab, Walt W.

    1997-10-01

    A reactive geochemical transport model, based on a semi-analytical solution to the advective-dispersive transport equation in two dimensions, is developed as a screening tool for evaluating the impact of reactive contaminants on aquifer hydrogeochemistry. Because the model utilizes an analytical solution to the transport equation, it is less computationally intensive than models based on numerical transport schemes, is faster, and it is not subject to numerical dispersion effects. Although the assumptions used to construct the model preclude consideration of reactions between the aqueous and solid phases, thermodynamic mineral saturation indices are calculated to provide qualitative insight into such reactions. Test problems involving acid mine drainage and hydrocarbon biodegradation signatures illustrate the utility of the model in simulating essential hydrogeochemical phenomena.
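    The transport backbone of such a screening model can be illustrated with the one-dimensional analogue: the Ogata-Banks analytical solution of the advection-dispersion equation for a continuous inlet source. The paper's model is two-dimensional and adds geochemical reactions, so this is only the simplest representative of the solution family; the parameter values are illustrative:

```python
import math

# Ogata-Banks analytical solution of the 1-D advection-dispersion equation
# for a continuous source at x = 0: relative concentration C/C0 at (x, t).

def ogata_banks(x, t, v, D):
    """C/C0 for seepage velocity v and dispersion coefficient D."""
    s = 2.0 * math.sqrt(D * t)
    term1 = 0.5 * math.erfc((x - v * t) / s)
    term2 = 0.5 * math.exp(v * x / D) * math.erfc((x + v * t) / s)
    return term1 + term2

v, D, t = 0.5, 0.1, 20.0   # velocity (m/d), dispersion (m^2/d), time (d)
profile = [ogata_banks(x, t, v, D) for x in (0.0, 5.0, 10.0, 15.0)]
```

    Because the solution is closed-form, a profile like this is evaluated directly at any (x, t) with no grid, which is what makes the semi-analytical approach fast and free of numerical dispersion.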

  15. Predictive analytics can support the ACO model.

    Science.gov (United States)

    Bradley, Paul

    2012-04-01

    Predictive analytics can be used to rapidly spot hard-to-identify opportunities to better manage care--a key tool in accountable care. When considering analytics models, healthcare providers should: Make value-based care a priority and act on information from analytics models. Create a road map that includes achievable steps, rather than major endeavors. Set long-term expectations and recognize that the effectiveness of an analytics program takes time, unlike revenue cycle initiatives that may show a quick return.

  16. The Promise of Zoomable User Interfaces

    Science.gov (United States)

    Bederson, Benjamin B.

    2011-01-01

    Zoomable user interfaces (ZUIs) have received a significant amount of attention in the 18 years since they were introduced. They have enjoyed some success, and elements of ZUIs are widely used in computers today, although the grand vision of a zoomable desktop has not materialised. This paper describes the premise and promise of ZUIs along with…

  17. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    Energy Technology Data Exchange (ETDEWEB)

    Ragan, Eric D [ORNL; Goodall, John R [ORNL

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  18. Analytical calculation of the average scattering cross sections using fourier series

    Energy Technology Data Exchange (ETDEWEB)

    Palma, Daniel A.P. [Instituto Federal do Rio de Janeiro, Nilopolis, RJ (Brazil)], e-mail: dpalmaster@gmail.com; Goncalves, Alessandro C.; Martinez, Aquilino S.; Silva, Fernando C. da [Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear], e-mail: asilva@con.ufrj.br, e-mail: agoncalves@con.ufrj.br, e-mail: aquilino@lmp.ufrj.br, e-mail: fernando@con.ufrj.br

    2009-07-01

    The precise determination of the Doppler broadening functions is very important in different applications of reactor physics, mainly in the processing of nuclear data. Analytical approximations are obtained in this paper for the average scattering cross section using expansions in Fourier series, generating an approximation that is simple and precise. The results have proven satisfactory in terms of accuracy and do not depend on the type of resonance considered. (author)

  19. Analytical calculation of the average scattering cross sections using fourier series

    International Nuclear Information System (INIS)

    Palma, Daniel A.P.; Goncalves, Alessandro C.; Martinez, Aquilino S.; Silva, Fernando C. da

    2009-01-01

    The precise determination of the Doppler broadening functions is very important in different applications of reactor physics, mainly in the processing of nuclear data. Analytical approximations are obtained in this paper for the average scattering cross section using expansions in Fourier series, generating an approximation that is simple and precise. The results have proven satisfactory in terms of accuracy and do not depend on the type of resonance considered. (author)
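
    As a generic illustration of the approach in this record (approximating a function by a truncated Fourier series and checking the accuracy of the partial sum), the sketch below uses an arbitrary target function, |x| on [-pi, pi], rather than the average scattering cross section itself:

```python
from math import pi, cos

# Generic illustration: approximate a function by a truncated Fourier series
# and verify that the partial sum is simple and precise. The target |x| is an
# arbitrary example, not the paper's average scattering cross section.

def partial_sum(x, n_terms):
    # Known expansion: |x| = pi/2 - (4/pi) * sum_{k>=0} cos((2k+1)x)/(2k+1)^2
    s = pi / 2
    for k in range(n_terms):
        m = 2 * k + 1
        s -= (4 / pi) * cos(m * x) / (m * m)
    return s

xs = [i * pi / 50 for i in range(-50, 51)]
max_err = max(abs(abs(x) - partial_sum(x, 25)) for x in xs)
print(max_err < 0.02)  # True: 25 terms already give a precise approximation
```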

  20. Surface-enhanced Raman spectroscopy (SERS) in food analytics: Detection of vitamins B2 and B12 in cereals.

    Science.gov (United States)

    Radu, Andreea Ioana; Kuellmer, Maria; Giese, Bernd; Huebner, Uwe; Weber, Karina; Cialla-May, Dana; Popp, Jürgen

    2016-11-01

    Food analysis has been gaining interest in recent decades for different reasons: the detection of hazardous substances in food and routine investigations of food composition and vitamin/nutrient contents. Regardless of the targeted component, food analysis raises a few challenges regarding the complexity of the matrix and the detection of trace amounts of substances. We report herein the results obtained for the simultaneous detection of two B vitamins (riboflavin, vitamin B2, and cyanocobalamin, vitamin B12) by means of SERS. SERS provides molecular fingerprint identification and high analytical sensitivity together with low processing time and cost. All of these make SERS a promising tool for the development of food analytical methods. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. MERRA Analytic Services

    Science.gov (United States)

    Schnase, J. L.; Duffy, D. Q.; McInerney, M. A.; Tamkin, G. S.; Thompson, J. H.; Gill, R.; Grieg, C. M.

    2012-12-01

    MERRA Analytic Services (MERRA/AS) is a cyberinfrastructure resource for developing and evaluating a new generation of climate data analysis capabilities. MERRA/AS supports OBS4MIP activities by reducing the time spent in the preparation of Modern Era Retrospective-Analysis for Research and Applications (MERRA) data used in data-model intercomparison. It also provides a testbed for experimental development of high-performance analytics. MERRA/AS is a cloud-based service built around the Virtual Climate Data Server (vCDS) technology that is currently used by the NASA Center for Climate Simulation (NCCS) to deliver Intergovernmental Panel on Climate Change (IPCC) data to the Earth System Grid Federation (ESGF). Crucial to its effectiveness, MERRA/AS's servers will use a workflow-generated realizable object capability to perform analyses over the MERRA data using the MapReduce approach to parallel storage-based computation. The results produced by these operations will be stored by the vCDS, which will also be able to host code sets for those who wish to explore the use of MapReduce for more advanced analytics. While the work described here will focus on the MERRA collection, these technologies can be used to publish other reanalysis, observational, and ancillary OBS4MIP data to ESGF and, importantly, offer an architectural approach to climate data services that can be generalized to applications and customers beyond the traditional climate research community. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. (A) MERRA/AS software stack. (B) Example MERRA/AS interfaces.
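
    The storage-side MapReduce pattern described in this record can be sketched in miniature: each storage node maps its partition of a variable to a partial (sum, count) pair, and a reduce step combines the partials into a global statistic. This is an illustrative sketch only, not the MERRA/AS or vCDS API, and the partition values are invented:

```python
from functools import reduce

# Miniature MapReduce sketch for parallel storage-based computation: each
# node reduces its own partition locally, then partial results are combined.
partitions = [                      # e.g., surface temperature samples (K)
    [281.2, 279.8, 280.5],
    [283.1, 282.7],
    [278.9, 280.0, 281.4, 282.2],
]

# Map: each partition is reduced locally (and, in principle, in parallel)
mapped = [(sum(p), len(p)) for p in partitions]

# Reduce: combine partial (sum, count) pairs into the global mean
total, count = reduce(lambda a, b: (a[0] + b[0], a[1] + b[1]), mapped)
print(round(total / count, 2))  # global mean temperature
```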

  2. Target normal sheath acceleration analytical modeling, comparative study and developments

    International Nuclear Information System (INIS)

    Perego, C.; Batani, D.; Zani, A.; Passoni, M.

    2012-01-01

    Ultra-intense laser interaction with solid targets appears to be an extremely promising technique to accelerate ions up to several MeV, producing beams that exhibit interesting properties for many foreseen applications. Nowadays, most of the published experimental results can be theoretically explained in the framework of the target normal sheath acceleration (TNSA) mechanism proposed by Wilks et al. [Phys. Plasmas 8(2), 542 (2001)]. As an alternative to numerical simulation, various analytical or semi-analytical TNSA models have been published in recent years, each of them trying to provide predictions for some of the ion beam features, given the initial laser and target parameters. However, the problem of developing a reliable model for the TNSA process is still open, which is why the purpose of this work is to clarify the present state of TNSA modeling and experimental results by means of a quantitative comparison between measurements and theoretical predictions of the maximum ion energy. Moreover, in the light of such an analysis, some indications for the future development of the model proposed by Passoni and Lontano [Phys. Plasmas 13(4), 042102 (2006)] are then presented.

  3. Analytical methods used in plutonium purification cycles by trilaurylamine

    International Nuclear Information System (INIS)

    Perez, J.J.

    1965-01-01

    The utilisation of trilaurylamine as a solvent extractant for the purification of plutonium has required the development of a set of analytical methods involving various techniques. The organic impurities of the solvent can be titrated by gas-liquid chromatography. The main degradation product, dilaurylamine, can also be determined by spectro-colorimetry. Potentiometry is used for the analysis of the different salts of the amine (nitrate, sulfate, bisulfate) as well as the extracted nitric acid. The determination of nitrate in the aqueous phase is carried out by constant-current potentiometry. The range of application, accuracy, and procedure of these analyses are given in the present report. (author) [fr

  4. Analysis and synthesis of bianisotropic metasurfaces by using analytical approach based on equivalent parameters

    Science.gov (United States)

    Danaeifar, Mohammad; Granpayeh, Nosrat

    2018-03-01

    An analytical method is presented to analyze and synthesize bianisotropic metasurfaces. The equivalent parameters of metasurfaces in terms of meta-atom properties and other specifications of metasurfaces are derived. These parameters are related to electric, magnetic, and electromagnetic/magnetoelectric dipole moments of the bianisotropic media, and they can simplify the analysis of complicated and multilayer structures. A metasurface of split ring resonators is studied as an example demonstrating the proposed method. The optical properties of the meta-atom are explored, and the calculated polarizabilities are applied to find the reflection coefficient and the equivalent parameters of the metasurface. Finally, a structure consisting of two metasurfaces of the split ring resonators is provided, and the proposed analytical method is applied to derive the reflection coefficient. The validity of this analytical approach is verified by full-wave simulations which demonstrate good accuracy of the equivalent parameter method. This method can be used in the analysis and synthesis of bianisotropic metasurfaces with different materials and in different frequency ranges by considering electric, magnetic, and electromagnetic/magnetoelectric dipole moments.

  5. Structural level characterization of base oils using advanced analytical techniques

    KAUST Repository

    Hourani, Nadim; Muller, Hendrik; Adam, Frederick M.; Panda, Saroj K.; Witt, Matthias; Al-Hajji, Adnan A.; Sarathy, Mani

    2015-01-01

    cyclotron resonance mass spectrometry (FT-ICR MS) equipped with atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) sources. First, the capabilities and limitations of each analytical technique were evaluated

  6. Gravitational-wave astronomy: delivering on the promises

    Science.gov (United States)

    Schutz, B. F.

    2018-05-01

    Now that LIGO and Virgo have begun to detect gravitational-wave events with regularity, the field of gravitational-wave astronomy is beginning to realize its promise. Binary black holes and, very recently, binary neutron stars have been observed, and we are already learning much from them. The future, with improved sensitivity, more detectors and detectors like LISA in different frequency bands, has even more promise to open a completely hidden side of the Universe to our exploration. This article is part of a discussion meeting issue `The promises of gravitational-wave astronomy'.

  7. The path to fulfilling the promise

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, J. [Canadian Nuclear Association, Ottawa, ON (Canada)

    2014-07-01

    'Full text:' Countries work together to develop effective governance and regulation. Canada has made big investments in these areas, and it carries a premium for us. The rapid build-out of nuclear technology around the Pacific Rim holds vast promise for our populations in better climate, better air, affordable and reliable electricity, and longer lives. The biggest risk is not another accident: rather, it is the risk of failing to fulfill that promise to our people. Every country that wants the benefits of nuclear must also want to be sure that those benefits are realized and sustained by good governance and regulation. Canada has the people, laws, organizations, public institutions, and relationships that can help our partners fulfill the whole and lasting promise of nuclear technology. (author)

  8. Analyte-Size-Dependent Ionization and Quantification of Monosaccharides in Human Plasma Using Cation-Exchanged Smectite Layers.

    Science.gov (United States)

    Ding, Yuqi; Kawakita, Kento; Xu, Jiawei; Akiyama, Kazuhiko; Fujino, Tatsuya

    2015-08-04

    Smectite, a synthetic inorganic polymer with a saponite structure, was subjected to matrix-assisted laser desorption/ionization mass spectrometry (MALDI MS). Typical organic matrix molecules 2,4,6-trihydroxyacetophenone (THAP) and 2,5-dihydroxybenzoic acid (DHBA) were intercalated into the layer spacing of cation-exchanged smectite, and the complex was used as a new matrix for laser desorption/ionization mass spectrometry. Because of layer spacing limitations, only a small analyte that could enter the layer and bind to THAP or DHBA could be ionized. This was confirmed by examining different analyte/matrix preparation methods and by measuring saccharides with different molecular sizes. Because of the homogeneous distribution of THAP molecules in the smectite layer spacing, high reproducibility of the analyte peak intensity was achieved. By using isotope-labeled (13)C6-d-glucose as the internal standard, quantitative analysis of monosaccharides in pretreated human plasma sample was performed, and the value of 8.6 ± 0.3 μg/mg was estimated.
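
    The quantitative step in this record follows the usual single-point internal-standard calculation: the analyte amount equals the ratio of analyte to internal-standard peak intensity times the spiked standard amount, assuming equal response factors for labeled and unlabeled forms. A minimal sketch with invented intensities (not the paper's raw data):

```python
# Single-point internal-standard quantitation with an isotope-labeled
# standard, as commonly done in MALDI MS. Assumes the 13C6-labeled glucose
# has the same ionization response as unlabeled D-glucose. All numbers
# are invented for illustration, not the paper's measurements.

is_spiked_ug_per_mg = 10.0     # amount of 13C6-D-glucose added per mg sample
analyte_intensity = 4.3e4      # D-glucose peak intensity in the spectrum
standard_intensity = 5.0e4     # 13C6-D-glucose peak intensity

glucose_ug_per_mg = analyte_intensity / standard_intensity * is_spiked_ug_per_mg
print(round(glucose_ug_per_mg, 1))  # 8.6
```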

  9. Reactor Section standard analytical methods. Part 1

    Energy Technology Data Exchange (ETDEWEB)

    Sowden, D.

    1954-07-01

    The Standard Analytical Methods manual was prepared for the purpose of consolidating and standardizing all current analytical methods and procedures used in the Reactor Section for routine chemical analyses. All procedures are established in accordance with accepted practice and the general analytical methods specified by the Engineering Department. These procedures are specifically adapted to the requirements of the water treatment process and related operations. The methods included in this manual are organized alphabetically within the following five sections, which correspond to the various phases of the analytical control program in which these analyses are to be used: water analyses, essential material analyses, cotton plug analyses, boiler water analyses, and miscellaneous control analyses.

  10. Big data in health care: using analytics to identify and manage high-risk and high-cost patients.

    Science.gov (United States)

    Bates, David W; Saria, Suchi; Ohno-Machado, Lucila; Shah, Anand; Escobar, Gabriel

    2014-07-01

    The US health care system is rapidly adopting electronic health records, which will dramatically increase the quantity of clinical data that are available electronically. Simultaneously, rapid progress has been made in clinical analytics--techniques for analyzing large quantities of data and gleaning new insights from that analysis--which is part of what is known as big data. As a result, there are unprecedented opportunities to use big data to reduce the costs of health care in the United States. We present six use cases--that is, key examples--where some of the clearest opportunities exist to reduce costs through the use of big data: high-cost patients, readmissions, triage, decompensation (when a patient's condition worsens), adverse events, and treatment optimization for diseases affecting multiple organ systems. We discuss the types of insights that are likely to emerge from clinical analytics, the types of data needed to obtain such insights, and the infrastructure--analytics, algorithms, registries, assessment scores, monitoring devices, and so forth--that organizations will need to perform the necessary analyses and to implement changes that will improve care while reducing costs. Our findings have policy implications for regulatory oversight, ways to address privacy concerns, and the support of research on analytics. Project HOPE—The People-to-People Health Foundation, Inc.

  11. News Analytics for Financial Decision Support

    NARCIS (Netherlands)

    V. Milea (Viorel)

    2013-01-01

    textabstractThis PhD thesis contributes to the newly emerged, growing body of scientific work on the use of News Analytics in Finance. Regarded as the next significant development in Automated Trading, News Analytics extends trading algorithms to incorporate information extracted from textual

  12. Analytic approaches to relativistic hydrodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Hatta, Yoshitaka

    2016-12-15

    I summarize our recent work towards finding and utilizing analytic solutions of relativistic hydrodynamics. In the first part I discuss various exact solutions of second-order conformal hydrodynamics. In the second part I compute the flow harmonics v_n analytically using the anisotropically deformed Gubser flow and discuss their dependence on n, p_T, viscosity, the chemical potential and the charge.

  13. A New Class of Analytic Functions Defined by Using Salagean Operator

    Directory of Open Access Journals (Sweden)

    R. M. El-Ashwah

    2013-01-01

    We derive some results for a new class of analytic functions defined by using the Salagean operator. We give some properties of functions in this class and obtain numerous sharp results including, for example, coefficient estimates, a distortion theorem, radii of starlikeness, convexity, close-to-convexity, extreme points, integral means inequalities, and partial sums of functions belonging to this class. Finally, we give an application involving certain fractional calculus operators that are also considered.
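
    For readers unfamiliar with the operator named in the title, the Salagean differential operator has the following standard textbook definition (not quoted from this paper):

```latex
D^{0}f(z) = f(z), \qquad D^{n}f(z) = z\,\big(D^{n-1}f(z)\big)' \quad (n \ge 1),
\qquad\text{so that for } f(z) = z + \sum_{k=2}^{\infty} a_k z^k :\qquad
D^{n}f(z) = z + \sum_{k=2}^{\infty} k^{n} a_k z^k .
```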

  14. Development and verification of an analytical algorithm to predict absorbed dose distributions in ocular proton therapy using Monte Carlo simulations

    International Nuclear Information System (INIS)

    Koch, Nicholas C; Newhauser, Wayne D

    2010-01-01

    Proton beam radiotherapy is an effective and non-invasive treatment for uveal melanoma. Recent research efforts have focused on improving the dosimetric accuracy of treatment planning and overcoming the present limitation of relative analytical dose calculations. Monte Carlo algorithms have been shown to accurately predict dose per monitor unit (D/MU) values, but this has yet to be shown for analytical algorithms dedicated to ocular proton therapy, which are typically less computationally expensive than Monte Carlo algorithms. The objective of this study was to determine if an analytical method could predict absolute dose distributions and D/MU values for a variety of treatment fields like those used in ocular proton therapy. To accomplish this objective, we used a previously validated Monte Carlo model of an ocular nozzle to develop an analytical algorithm to predict three-dimensional distributions of D/MU values from pristine Bragg peaks and therapeutically useful spread-out Bragg peaks (SOBPs). Results demonstrated generally good agreement between the analytical and Monte Carlo absolute dose calculations. While agreement in the proximal region decreased for beams with less penetrating Bragg peaks compared with the open-beam condition, the difference was shown to be largely attributable to edge-scattered protons. A method for including this effect in any future analytical algorithm was proposed. Comparisons of D/MU values showed typical agreement to within 0.5%. We conclude that analytical algorithms can be employed to accurately predict absolute proton dose distributions delivered by an ocular nozzle.

  15. 3-D Discrete Analytical Ridgelet Transform

    OpenAIRE

    Helbert , David; Carré , Philippe; Andrès , Éric

    2006-01-01

    In this paper, we propose an implementation of the 3-D Ridgelet transform: the 3-D discrete analytical Ridgelet transform (3-D DART). This transform uses the Fourier strategy for the computation of the associated 3-D discrete Radon transform. The innovative step is the definition of a discrete 3-D transform with the discrete analytical geometry theory by the construction of 3-D discrete analytical lines in the Fourier domain. We propose two types of 3-D discrete lines:...

  16. Hydraulic modeling of riverbank filtration systems with curved boundaries using analytic elements and series solutions

    Science.gov (United States)

    Bakker, Mark

    2010-08-01

    A new analytic solution approach is presented for the modeling of steady flow to pumping wells near rivers in strip aquifers; all boundaries of the river and strip aquifer may be curved. The river penetrates the aquifer only partially and has a leaky stream bed. The water level in the river may vary spatially. Flow in the aquifer below the river is semi-confined while flow in the aquifer adjacent to the river is confined or unconfined and may be subject to areal recharge. Analytic solutions are obtained through superposition of analytic elements and Fourier series. Boundary conditions are specified at collocation points along the boundaries. The number of collocation points is larger than the number of coefficients in the Fourier series and a solution is obtained in the least squares sense. The solution is analytic while boundary conditions are met approximately. Very accurate solutions are obtained when enough terms are used in the series. Several examples are presented for domains with straight and curved boundaries, including a well pumping near a meandering river with a varying water level. The area of the river bottom where water infiltrates into the aquifer is delineated and the fraction of river water in the well water is computed for several cases.
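
    The least-squares collocation idea described here (more collocation points than series coefficients, with boundary conditions met approximately) can be sketched for a generic boundary condition. The boundary function and problem sizes below are illustrative, not taken from the paper's aquifer model:

```python
import numpy as np

# Least-squares collocation sketch: fit a truncated Fourier series so that a
# prescribed boundary condition h(theta) is met approximately at more
# collocation points than there are coefficients.

n_terms = 5                        # number of cosine/sine harmonics
theta = np.linspace(0.0, 2.0 * np.pi, 40, endpoint=False)  # 40 collocation points
h_target = np.exp(np.sin(theta))   # prescribed (illustrative) boundary values

# Design matrix with columns [1, cos(k*t), sin(k*t)] for k = 1..n_terms
cols = [np.ones_like(theta)]
for k in range(1, n_terms + 1):
    cols += [np.cos(k * theta), np.sin(k * theta)]
A = np.column_stack(cols)          # 40 x 11: overdetermined system

# Solve in the least squares sense, as in the paper's solution strategy
coef, *_ = np.linalg.lstsq(A, h_target, rcond=None)

residual = np.max(np.abs(A @ coef - h_target))
print(residual)  # small: the boundary condition is met approximately
```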

  17. MASTERS OF ANALYTICAL TRADECRAFT: CERTIFYING THE STANDARDS AND ANALYTIC RIGOR OF INTELLIGENCE PRODUCTS

    Science.gov (United States)

    2016-04-01

    AU/ACSC/2016, Air Command and Staff College, Air University: "Masters of Analytical Tradecraft: Certifying the Standards and Analytic Rigor of Intelligence Products." Proposes establishing unit-level certified Masters of Analytic Tradecraft (MAT) analysts to be trained and entrusted to evaluate and rate the standards and analytic rigor of intelligence products. Products (and cues) ideally should meet or exceed effective rigor (based on analytical process). To accomplish this, decision makers should not be left to their...

  18. Semi-analytical MBS Pricing

    DEFF Research Database (Denmark)

    Rom-Poulsen, Niels

    2007-01-01

    This paper presents a multi-factor valuation model for fixed-rate callable mortgage backed securities (MBS). The model yields semi-analytic solutions for the value of MBS in the sense that the MBS value is found by solving a system of ordinary differential equations. Instead of modelling the cond... interest rate model. However, if the pool size is specified in a way that makes the expectations solvable using transform methods, semi-analytic pricing formulas are achieved. The affine and quadratic pricing frameworks are combined to get flexible and sophisticated prepayment functions. We show...

  19. Practical web analytics for user experience how analytics can help you understand your users

    CERN Document Server

    Beasley, Michael

    2013-01-01

    Practical Web Analytics for User Experience teaches you how to use web analytics to help answer the complicated questions facing UX professionals. Within this book, you'll find a quantitative approach for measuring a website's effectiveness and the methods for posing and answering specific questions about how users navigate a website. The book is organized according to the concerns UX practitioners face. Chapters are devoted to traffic, clickpath, and content use analysis, measuring the effectiveness of design changes, including A/B testing, building user profiles based on search hab
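
    One standard way to "measure the effectiveness of design changes" with A/B testing, as mentioned in this record, is a two-proportion z-test on conversion counts. A minimal sketch with invented traffic numbers (the book's own worked examples may differ):

```python
from math import sqrt, erf

# Two-proportion z-test for an A/B test of a design change: did variant B
# convert better than variant A? Counts are invented for illustration.
conv_a, n_a = 120, 2400    # variant A: 5.0% conversion
conv_b, n_b = 156, 2400    # variant B: 6.5% conversion

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se                                 # z statistic

# Two-sided p-value from the standard normal CDF
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
print(p_value < 0.05)  # True: the difference is statistically significant
```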

  20. Guide to Savannah River Laboratory Analytical Services Group

    Energy Technology Data Exchange (ETDEWEB)

    1990-04-01

    The mission of the Analytical Services Group (ASG) is to provide analytical support for Savannah River Laboratory Research and Development Programs using onsite and offsite analytical labs as resources. A second mission is to provide Savannah River Site (SRS) operations with analytical support for nonroutine material characterization or special chemical analyses. The ASG provides backup support for the SRS process control labs as necessary.

  1. Guide to Savannah River Laboratory Analytical Services Group

    International Nuclear Information System (INIS)

    1990-04-01

    The mission of the Analytical Services Group (ASG) is to provide analytical support for Savannah River Laboratory Research and Development Programs using onsite and offsite analytical labs as resources. A second mission is to provide Savannah River Site (SRS) operations with analytical support for nonroutine material characterization or special chemical analyses. The ASG provides backup support for the SRS process control labs as necessary

  2. News analytics for financial decision support

    NARCIS (Netherlands)

    Milea, D.V.

    2013-01-01

    This PhD thesis contributes to the newly emerged, growing body of scientific work on the use of News Analytics in Finance. Regarded as the next significant development in Automated Trading, News Analytics extends trading algorithms to incorporate information extracted from textual messages, by

  3. Predictive Data Tools Find Uses in Schools

    Science.gov (United States)

    Sparks, Sarah D.

    2011-01-01

    The use of analytic tools to predict student performance is exploding in higher education, and experts say the tools show even more promise for K-12 schools, in everything from teacher placement to dropout prevention. Use of such statistical techniques is hindered in precollegiate schools, however, by a lack of researchers trained to help…

  4. Optimization of offshore wind turbine support structures using analytical gradient-based method

    OpenAIRE

    Chew, Kok Hon; Tai, Kang; Ng, E.Y.K.; Muskulus, Michael

    2015-01-01

    Design optimization of the offshore wind turbine support structure is an expensive task; due to the highly-constrained, non-convex and non-linear nature of the design problem. This report presents an analytical gradient-based method to solve this problem in an efficient and effective way. The design sensitivities of the objective and constraint functions are evaluated analytically while the optimization of the structure is performed, subject to sizing, eigenfrequency, extreme load an...

  5. Selection of power market structure using the analytic hierarchy process

    International Nuclear Information System (INIS)

    Subhes Bhattacharyya; Prasanta Kumar Dey

    2003-01-01

    Selection of a power market structure from the available alternatives is an important activity within an overall power sector reform program. The evaluation criteria for selection are both subjective as well as objective in nature and the selection of alternatives is characterised by their conflicting nature. This study demonstrates a methodology for power market structure selection using the analytic hierarchy process, a multiple attribute decision-making technique, to model the selection methodology with the active participation of relevant stakeholders in a workshop environment. The methodology is applied to a hypothetical case of a State Electricity Board reform in India. (author)
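
    The AHP machinery referred to here can be sketched briefly: alternatives are compared pairwise on a ratio scale, and priority weights are recovered from the comparison matrix, below via the common geometric-mean approximation to the principal eigenvector. The matrix is invented for illustration, not taken from the study:

```python
from math import prod

# AHP sketch: pairwise comparisons of three alternatives on Saaty's 1-9
# ratio scale; priority weights from the geometric-mean approximation to
# the principal eigenvector. Judgments are invented for illustration.
pairwise = [
    [1.0, 3.0, 5.0],    # alt. 1 judged 3x better than 2, 5x better than 3
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]

# Geometric mean of each row, then normalize to get priority weights
geo = [prod(row) ** (1 / len(row)) for row in pairwise]
weights = [g / sum(geo) for g in geo]
print([round(w, 3) for w in weights])  # alternative 1 dominates
```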

  6. Analytical Chemistry Division's sample transaction system

    International Nuclear Information System (INIS)

    Stanton, J.S.; Tilson, P.A.

    1980-10-01

    The Analytical Chemistry Division uses the DECsystem-10 computer for a wide range of tasks: sample management, timekeeping, quality assurance, and data calculation. This document describes the features and operating characteristics of many of the computer programs used by the Division. The descriptions are divided into chapters which cover all of the information about one aspect of the Analytical Chemistry Division's computer processing.

  7. Tennessee Promise: A Response to Organizational Change

    Science.gov (United States)

    Littlepage, Ben; Clark, Teresa; Wilson, Randal; Stout, Logan

    2018-01-01

    Community colleges in Tennessee, either directly or indirectly, experienced unprecedented change as a result of Tennessee Promise. The present study explored how student support service administrators at three community colleges responded to organizational change as a result of the Tennessee Promise legislation. Investigators selected community…

  8. Gravitational-wave astronomy: delivering on the promises.

    Science.gov (United States)

    Schutz, B F

    2018-05-28

    Now that LIGO and Virgo have begun to detect gravitational-wave events with regularity, the field of gravitational-wave astronomy is beginning to realize its promise. Binary black holes and, very recently, binary neutron stars have been observed, and we are already learning much from them. The future, with improved sensitivity, more detectors and detectors like LISA in different frequency bands, has even more promise to open a completely hidden side of the Universe to our exploration. This article is part of a discussion meeting issue 'The promises of gravitational-wave astronomy'. © 2018 The Author(s).

  9. Semi-analytical approach for guided mode resonance in high-index-contrast photonic crystal slab: TE polarization.

    Science.gov (United States)

    Yang, Yi; Peng, Chao; Li, Zhengbin

    2013-09-09

    In high-contrast (HC) photonic crystal (PC) slabs, the high-order coupling is so intense that accounting for it is indispensable in analyzing the guided mode resonance (GMR) effect. In this paper, a semi-analytical approach is proposed for analyzing GMR in HC PC slabs with TE-like polarization. The intense high-order coupling is included by using a convergent recursive procedure. The reflection of radiative waves at high-index-contrast interfaces is also considered by adopting a strict Green's function for multi-layer structures. Modal properties of interest like the band structure, radiation constant, and field profile are calculated, agreeing well with numerical finite-difference time-domain simulations. This analysis is promising for the design and optimization of various HC PC devices.

  10. Analytical methods for fissionable materials in the nuclear fuel cycle. Covering June 1974--June 1975

    International Nuclear Information System (INIS)

    Waterbury, G.R.

    1975-10-01

    Research progress is reported on method development for the dissolution of difficult-to-dissolve materials, the automated analysis of plutonium and uranium, the preparation of plutonium materials for the Safeguard Analytical Laboratory Evaluation (SALE) Program, and the analysis of HTGR fuel and SALE uranium materials. The previously developed Teflon-container, metal-shell apparatus was applied to the dissolution of various nuclear materials. Gas--solid reactions, mainly using chlorine at elevated temperatures, are promising for separating uranium from refractory compounds. An automated spectrophotometer designed for determining plutonium and uranium was tested successfully. Procedures were developed for this instrument to analyze uranium--plutonium mixtures and the effects of diverse ions upon the analysis of plutonium and uranium were further established. A versatile apparatus was assembled to develop electrotitrimetric methods that will serve as the basis for precise automated determinations of plutonium. Plutonium materials prepared for the Safeguard Analytical Laboratory Evaluation (SALE) Program were plutonium oxide, uranium--plutonium mixed oxide, and plutonium metal. Improvements were made in the methods used for determining uranium in HTGR fuel materials and SALE uranium materials. Plutonium metal samples were prepared, characterized, and distributed, and half-life measurements were in progress as part of an inter-ERDA-laboratory program to measure accurately the half-lives of long-lived plutonium isotopes

  11. Learning Analytics across a Statewide System

    Science.gov (United States)

    Buyarski, Catherine; Murray, Jim; Torstrick, Rebecca

    2017-01-01

    This chapter explores lessons learned from two different learning analytics efforts at a large, public, multicampus university--one internally developed and one vended platform. It raises questions about how to best use analytics to support students while keeping students responsible for their own learning and success.

  12. 40 CFR 141.704 - Analytical methods.

    Science.gov (United States)

    2010-07-01

    ... Title 40 (Protection of Environment), Vol. 22, 2010-07-01 edition; Monitoring Requirements, § 141.704 Analytical methods. (a) Cryptosporidium. Systems must analyze for Cryptosporidium using Method 1623: Cryptosporidium and Giardia in Water by Filtration/IMS/FA, 2005, United States...

  13. Are Higher Education Institutions Prepared for Learning Analytics?

    Science.gov (United States)

    Ifenthaler, Dirk

    2017-01-01

    Higher education institutions and involved stakeholders can derive multiple benefits from learning analytics by using different data analytics strategies to produce summative, real-time, and predictive insights and recommendations. However, are institutions and academic as well as administrative staff prepared for learning analytics? A learning…

  14. Emerging technology and architecture for big-data analytics

    CERN Document Server

    Chang, Chip; Yu, Hao

    2017-01-01

    This book describes the current state of the art in big-data analytics from a technology and hardware architecture perspective. The presentation is designed to be accessible to a broad audience with general knowledge of hardware design and some interest in big-data analytics. Coverage includes emerging technology and devices for data analytics, circuit design for data analytics, and architectures and algorithms to support data analytics. Readers will benefit from the realistic context used by the authors, which demonstrates what works, what does not, and what the fundamental problems, solutions, upcoming challenges, and opportunities are. The book provides a single-source reference to hardware architectures for big-data analytics; covers the levels of hardware design abstraction and flow, from devices to circuits and systems; and demonstrates how non-volatile memory (NVM) based hardware platforms can be a viable solution to existing challenges in hardware architecture for big-data analytics.

  15. Special concrete shield selection using the analytic hierarchy process

    International Nuclear Information System (INIS)

    Abulfaraj, W.H.

    1994-01-01

    Special types of concrete radiation shields that depend on locally available materials and have improved properties for both neutron and gamma-ray attenuation were developed by using plastic materials and heavy ores. The analytic hierarchy process (AHP) is implemented to evaluate these types for selecting the best biological radiation shield for nuclear reactors. Factors affecting the selection decision are degree of protection against neutrons, degree of protection against gamma rays, suitability of the concrete as a building material, and economic considerations. The seven concrete alternatives are barite-polyethylene concrete, barite-polyvinyl chloride (PVC) concrete, barite-portland cement concrete, pyrite-polyethylene concrete, pyrite-PVC concrete, pyrite-portland cement concrete, and ordinary concrete. The AHP analysis shows the superiority of pyrite-polyethylene concrete over the others.
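    The eigenvector weighting at the heart of AHP is easy to sketch. Below is a minimal, self-contained illustration using a made-up pairwise comparison matrix for the four criteria named above (the judgments are hypothetical, not those of the study); the principal eigenvector, found here by power iteration, gives the criterion weights.

```python
# Toy AHP sketch (hypothetical judgments, not the paper's data):
# estimate criterion weights from a pairwise comparison matrix
# using the principal eigenvector (power iteration).

def ahp_weights(matrix, iters=100):
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        # multiply the matrix by the current weight estimate
        w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w_new)
        w = [x / s for x in w_new]  # normalize so the weights sum to 1
    return w

# Criteria: neutron protection, gamma protection, buildability, cost
# (Saaty 1-9 scale; the values below are illustrative only).
pairwise = [
    [1,     2,     4,   3],
    [1 / 2, 1,     3,   2],
    [1 / 4, 1 / 3, 1,   1 / 2],
    [1 / 3, 1 / 2, 2,   1],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])
```

    In a full AHP study each concrete alternative would additionally be scored against each criterion, and the criterion weights would combine those scores into an overall ranking.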

  16. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by a Butyrylcholinesterase Activity Assay

    Directory of Open Access Journals (Sweden)

    Miroslav Pohanka

    2015-06-01

    Full Text Available Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated against Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.), and the relevance of the results suggests it will be of practical applicability.
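    The channel-analysis step can be sketched in a few lines. The pixel values and the linear calibration below are synthetic placeholders, not data from the paper; the point is only to show how a mean channel intensity from a photographed strip maps to an activity estimate.

```python
# Hedged sketch of the RGB-channel readout idea (values are synthetic,
# not from the paper): average one color channel over a region of
# interest and convert it to activity via a linear calibration.

def mean_channel(pixels, channel):
    """pixels: list of (R, G, B) tuples; channel: 0=R, 1=G, 2=B."""
    return sum(p[channel] for p in pixels) / len(pixels)

# Synthetic 2x2 "photo" of a test strip: the indigo product lowers the
# red channel as BChE activity rises.
strip = [(120, 80, 200), (118, 82, 198), (122, 79, 201), (119, 81, 199)]
red = mean_channel(strip, 0)

# Hypothetical linear calibration: activity = a - b * mean_red
a, b = 50.0, 0.3
activity = a - b * red
print(round(activity, 2))
```

    In practice the calibration constants would come from photographing strips exposed to known BChE activities under fixed lighting.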

  17. Waste minimization in analytical methods

    International Nuclear Information System (INIS)

    Green, D.W.; Smith, L.L.; Crain, J.S.; Boparai, A.S.; Kiely, J.T.; Yaeger, J.S.; Schilling, J.B.

    1995-01-01

    The US Department of Energy (DOE) will require a large number of waste characterizations over a multi-year period to accomplish the Department's goals in environmental restoration and waste management. Estimates vary, but two million analyses annually are expected. The waste generated by the analytical procedures used for characterizations is a significant source of new DOE waste. Success in reducing the volume of secondary waste and the costs of handling this waste would significantly decrease the overall cost of this DOE program. Selection of appropriate analytical methods depends on the intended use of the resultant data. It is not always necessary to use a high-powered analytical method, typically at higher cost, to obtain data needed to make decisions about waste management. Indeed, for samples taken from some heterogeneous systems, the meaning of high accuracy becomes clouded if the data generated are intended to measure a property of this system. Among the factors to be considered in selecting the analytical method are the lower limit of detection, accuracy, turnaround time, cost, reproducibility (precision), interferences, and simplicity. Occasionally, there must be tradeoffs among these factors to achieve the multiple goals of a characterization program. The purpose of the work described here is to add waste minimization to the list of characteristics to be considered. In this paper the authors present results of modifying analytical methods for waste characterization to reduce both the cost of analysis and volume of secondary wastes. Although tradeoffs may be required to minimize waste while still generating data of acceptable quality for the decision-making process, they have data demonstrating that wastes can be reduced in some cases without sacrificing accuracy or precision.

  18. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    Science.gov (United States)

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix.
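    Since the abstract notes that sensitivity, specificity, and a confusion matrix are absent from the IBMWA output, a user would need to compute them separately. A minimal sketch on synthetic binary labels (not the study's validated datasets):

```python
# Compute a 2x2 confusion matrix plus sensitivity and specificity
# from actual vs. predicted binary labels (synthetic example data).

def confusion_metrics(actual, predicted):
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
    fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    sensitivity = tp / (tp + fn)  # true positive rate
    specificity = tn / (tn + fp)  # true negative rate
    return {"tp": tp, "tn": tn, "fp": fp, "fn": fn,
            "sensitivity": sensitivity, "specificity": specificity}

actual    = [1, 1, 1, 0, 0, 0, 1, 0]
predicted = [1, 0, 1, 0, 0, 1, 1, 0]
m = confusion_metrics(actual, predicted)
print(m["sensitivity"], m["specificity"])
```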

  19. The role of strategic position in brand promise: Evidence from LG Company

    Directory of Open Access Journals (Sweden)

    A. Eilaghi Karvandi

    2016-08-01

    Full Text Available This paper presents an empirical investigation of the effects of different strategies, including attribute, advantage, application, consumer, competitive advantage, pricing/quality, and category, on brand promise for products of LG Company in the city of Tehran, Iran. The study designs two questionnaires, one for strategic positioning and the other for brand promise, on a Likert scale. Cronbach alphas for brand promise and strategic positioning are 0.81 and 0.79, respectively. The questionnaires were distributed among 385 randomly selected regular users of LG products, and using Spearman correlation as well as stepwise regression techniques, the effects of the various strategies on brand promise were examined. The Spearman correlation results indicated positive and significant relationships between the different strategies and brand promise. In addition, the stepwise regression results indicated that the price/quality, consumer, and application strategies were the most important predictors of brand promise.
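    The Spearman step of the analysis is straightforward to reproduce. The sketch below implements rank correlation from scratch on hypothetical Likert-scale scores; the data and variable names are illustrative, not the survey's.

```python
# Spearman rank correlation from scratch (average ranks for ties),
# applied to made-up Likert-scale scores.

def rank(values):
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # group tied values and assign them their average rank
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical scores: price/quality positioning vs. brand promise
price_quality = [4, 5, 3, 4, 2, 5, 1, 3]
brand_promise = [4, 5, 2, 4, 2, 4, 1, 3]
print(round(spearman(price_quality, brand_promise), 3))
```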

  20. An Empirical Comparison of Algorithms to Find Communities in Directed Graphs and Their Application in Web Data Analytics

    DEFF Research Database (Denmark)

    Agreste, Santa; De Meo, Pasquale; Fiumara, Giacomo

    2017-01-01

    Detecting communities in graphs is a fundamental tool to understand the structure of Web-based systems and predict their evolution. Many community detection algorithms are designed to process undirected graphs (i.e., graphs with bidirectional edges) but many graphs on the Web, e.g., microblogging … the best trade-off between accuracy and computational performance and, therefore, it has to be considered as a promising tool for Web Data Analytics purposes.

  1. Analytic solutions of hydrodynamics equations

    International Nuclear Information System (INIS)

    Coggeshall, S.V.

    1991-01-01

    Many similarity solutions have been found for the equations of one-dimensional (1-D) hydrodynamics. These special combinations of variables allow the partial differential equations to be reduced to ordinary differential equations, which must then be solved to determine the physical solutions. Usually, these reduced ordinary differential equations are solved numerically. In some cases it is possible to solve these reduced equations analytically to obtain explicit solutions. In this work a collection of analytic solutions of the 1-D hydrodynamics equations is presented. These can be used for a variety of purposes, including (i) as numerical benchmark problems, (ii) as a basis for analytic models, and (iii) as a source of insight into more complicated solutions.

  2. Improving Wind Turbine Drivetrain Reliability Using a Combined Experimental, Computational, and Analytical Approach

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; van Dam, J.; Bergua, R.; Jove, J.; Campbell, J.

    2015-03-01

    Nontorque loads induced by the wind turbine rotor overhang weight and aerodynamic forces can greatly affect drivetrain loads and responses. If not addressed properly, these loads can result in a decrease in gearbox component life. This work uses analytical modeling, computational modeling, and experimental data to evaluate a unique drivetrain design that minimizes the effects of nontorque loads on gearbox reliability: the Pure Torque(R) drivetrain developed by Alstom. The drivetrain has a hub-support configuration that transmits nontorque loads directly into the tower rather than through the gearbox as in other design approaches. An analytical model of Alstom's Pure Torque drivetrain provides insight into the relationships among turbine component weights, aerodynamic forces, and the resulting drivetrain loads. Main shaft bending loads are orders of magnitude lower than the rated torque and are hardly affected by wind conditions and turbine operations.

  3. Theory of net analyte signal vectors in inverse regression

    DEFF Research Database (Denmark)

    Bro, R.; Andersen, Charlotte Møller

    2003-01-01

    The net analyte signal and the net analyte signal vector are useful measures in building and optimizing multivariate calibration models. In this paper a theory for their use in inverse regression is developed. The theory of net analyte signal was originally derived from classical least squares
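    With a single interferent, the net analyte signal reduces to removing the projection of the analyte's spectrum onto the interferent's spectrum. A minimal sketch with synthetic four-channel spectra (not the paper's data):

```python
# Net analyte signal (NAS) sketch for one analyte and one interferent:
# the NAS is the part of the analyte's signal orthogonal to the space
# spanned by the other components' spectra; with a single interferent
# this is an ordinary vector projection.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def net_analyte_signal(s_k, interferent):
    # remove the component of s_k lying along the interferent spectrum
    coef = dot(s_k, interferent) / dot(interferent, interferent)
    return [x - coef * y for x, y in zip(s_k, interferent)]

s_k = [1.0, 2.0, 3.0, 4.0]  # analyte spectrum (synthetic)
s_j = [1.0, 1.0, 1.0, 1.0]  # interferent spectrum (synthetic)
nas = net_analyte_signal(s_k, s_j)
print([round(v, 3) for v in nas])  # orthogonal to the interferent
```

    With several interferents the projection would instead use the pseudo-inverse of the matrix of interferent spectra, but the orthogonalization idea is the same.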

  4. Some questions of using coding theory and analytical calculation methods on computers

    International Nuclear Information System (INIS)

    Nikityuk, N.M.

    1987-01-01

    Main results of investigations devoted to the application of the theory and practice of error-correcting codes are presented. These results are used to create very fast units for selecting events registered in multichannel detectors of nuclear particles. Using this theory and analytical computer calculations, essentially new combinational devices, for example parallel encoders, have been developed. Questions concerning the creation of a new algorithm for the calculation of digital functions by computers and problems of devising universal, dynamically reprogrammable logic modules are discussed.

  5. Optimizing an immersion ESL curriculum using analytic hierarchy process.

    Science.gov (United States)

    Tang, Hui-Wen Vivian

    2011-11-01

    The main purpose of this study is to fill a substantial knowledge gap regarding reaching a uniform group decision in English curriculum design and planning. A comprehensive content-based course criterion model extracted from existing literature and expert opinions was developed. Analytical hierarchy process (AHP) was used to identify the relative importance of course criteria for the purpose of tailoring an optimal one-week immersion English as a second language (ESL) curriculum for elementary school students in a suburban county of Taiwan. The hierarchy model and AHP analysis utilized in the present study will be useful for resolving several important multi-criteria decision-making issues in planning and evaluating ESL programs. This study also offers valuable insights and provides a basis for further research in customizing ESL curriculum models for different student populations with distinct learning needs, goals, and socioeconomic backgrounds. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Evaluation of Linked Data tools for Learning Analytics

    NARCIS (Netherlands)

    Drachsler, Hendrik; Herder, Eelco; d'Aquin, Mathieu; Dietze, Stefan

    2013-01-01

    Drachsler, H., Herder, E., d'Aquin, M., & Dietze, S. (2013, 8-12 April). Evaluation of Linked Data tools for Learning Analytics. Presentation given in the tutorial on 'Using Linked Data for Learning Analytics' at LAK2013, the Third Conference on Learning Analytics and Knowledge, Leuven, Belgium.

  7. Use of analytical electron microscopy and auger electron spectroscopy for evaluating materials

    International Nuclear Information System (INIS)

    Jones, R.H.; Bruemmer, S.M.; Thomas, M.T.; Baer, D.R.

    1982-11-01

    Analytical electron microscopy (AEM) can be used to characterize the microstructure and microchemistry of materials over dimensions less than 10 nm, while Auger electron spectroscopy (AES) can be used to characterize the chemical composition of surfaces and interfaces to a depth of less than 1 nm. Frequently, the information gained from both instruments can be coupled to give new insight into the behavior of materials. Examples of the use of AEM and AES to characterize segregation, sensitization, and radiation damage are presented. A short description of the AEM and AES techniques is given.

  8. Microgenetic Learning Analytics Methods: Workshop Report

    Science.gov (United States)

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  9. Just-in-Time Data Analytics and Visualization of Climate Simulations using the Bellerophon Framework

    Science.gov (United States)

    Anantharaj, V. G.; Venzke, J.; Lingerfelt, E.; Messer, B.

    2015-12-01

    Climate model simulations are used to understand the evolution and variability of earth's climate. Unfortunately, high-resolution multi-decadal climate simulations can take days to weeks to complete. Typically, the simulation results are not analyzed until the model runs have ended. During the course of the simulation, the output may be processed periodically to ensure that the model is performing as expected, but most of the data analytics and visualization are not performed until the simulation is finished. The lengthy time period needed for the completion of the simulation constrains the productivity of climate scientists. Our implementation of near real-time data visualization analytics capabilities allows scientists to monitor the progress of their simulations while the model is running. Our analytics software executes concurrently in a co-scheduling mode, monitoring data production. When new data are generated by the simulation, a co-scheduled data analytics job is submitted to render visualization artifacts of the latest results. These visualization outputs are automatically transferred to Bellerophon's data server located at ORNL's Compute and Data Environment for Science (CADES), where they are processed and archived into Bellerophon's database. During the course of the experiment, climate scientists can then use Bellerophon's graphical user interface to view animated plots and their associated metadata. The quick turnaround from the start of the simulation until the data are analyzed permits research decisions and projections to be made days or sometimes even weeks sooner than otherwise possible. The supercomputer resources used to run the simulation are unaffected by co-scheduling the data visualization jobs, so the model runs continuously while the data are visualized. Our just-in-time data visualization software aims to increase climate scientists' productivity as climate modeling moves into the exascale era of computing.

  10. Promise-based management: the essence of execution.

    Science.gov (United States)

    Sull, Donald N; Spinosa, Charles

    2007-04-01

    Critical initiatives stall for a variety of reasons--employee disengagement, a lack of coordination between functions, complex organizational structures that obscure accountability, and so on. To overcome such obstacles, managers must fundamentally rethink how work gets done. Most of the challenges stem from broken or poorly crafted commitments. That's because every company is, at its heart, a dynamic network of promises made between employees and colleagues, customers, outsourcing partners, or other stakeholders. Executives can overcome many problems in the short-term and foster productive, reliable workforces for the long-term by practicing what the authors call "promise-based management," which involves cultivating and coordinating commitments in a systematic way. Good promises share five qualities: They are public, active, voluntary, explicit, and mission based. To develop and execute an effective promise, the "provider" and the "customer" in the deal should go through three phases of conversation. The first, achieving a meeting of minds, entails exploring the fundamental questions of coordinated effort: What do you mean? Do you understand what I mean? What should I do? What will you do? Who else should we talk to? In the next phase, making it happen, the provider executes on the promise. In the final phase, closing the loop, the customer publicly declares that the provider has either delivered the goods or failed to do so. Leaders must weave and manage their webs of promises with great care-encouraging iterative conversation and making sure commitments are fulfilled reliably. If they do, they can enhance coordination and cooperation among colleagues, build the organizational agility required to seize new business opportunities, and tap employees' entrepreneurial energies.

  11. Using Streaming Analytics for Effective Real Time Network Visibility

    Science.gov (United States)

    [Partly garbled presentation transcript; the recoverable content concerns real-time visibility into what is happening in the network right now, big data considerations for network traffic, and taking a streaming analytics approach to network traffic analysis.]

  12. AI based HealthCare Platform for Real Time, Predictive and Prescriptive Analytics using Reactive Programming

    Science.gov (United States)

    Kaur, Jagreet; Singh Mann, Kulwinder, Dr.

    2018-01-01

    AI in healthcare needs to bring real, actionable, and individualized insights in real time to patients and doctors to support treatment decisions. A patient-centred platform is needed for integrating EHR data, patient data, prescriptions, monitoring, and clinical research data. This paper proposes a generic architecture for an AI-based healthcare analytics platform built on open-source technologies: Apache Beam, Apache Flink, Apache Spark, Apache NiFi, Kafka, Tachyon, GlusterFS, and NoSQL stores (Elasticsearch, Cassandra). The paper shows the importance of applying AI-based predictive and prescriptive analytics techniques in the health sector. The system will be able to extract useful knowledge that helps in decision making and medical monitoring in real time through intelligent process analysis and big data processing.

  13. Recent Developments in the Speciation and Determination of Mercury Using Various Analytical Techniques

    Directory of Open Access Journals (Sweden)

    Lakshmi Narayana Suvarapu

    2015-01-01

    Full Text Available This paper reviews the speciation and determination of mercury by various analytical techniques such as atomic absorption spectrometry, voltammetry, inductively coupled plasma techniques, spectrophotometry, spectrofluorometry, high performance liquid chromatography, and gas chromatography. Approximately 126 research papers on the speciation and determination of mercury by various analytical techniques published in international journals since 2013 are reviewed.

  14. Predictive analytics and child protection: constraints and opportunities.

    Science.gov (United States)

    Russell, Jesse

    2015-08-01

    This paper considers how predictive analytics might inform, assist, and improve decision making in child protection. Predictive analytics draws on recent increases in data quantity and diversity, along with advances in computing technology. While the use of data and statistical modeling is not new to child protection decision making, its use in child protection is growing, and efforts to leverage predictive analytics for better decision making in child protection are increasing. Past experiences, constraints, and opportunities are reviewed. For predictive analytics to make the most impact on child protection practice and outcomes, it must embrace established criteria of validity, equity, reliability, and usefulness. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Prioritizing the countries for BOT nuclear power project using Analytic Hierarchy Process

    International Nuclear Information System (INIS)

    Choi, Sun Woo; Roh, Myung Sub

    2013-01-01

    This paper proposes factors influencing the success of BOT nuclear power projects and a method of weighting them using the Analytic Hierarchy Process (AHP) to find the optimal country in which a developer might pursue a project. In summary, this analytic method enables the developer to select and focus on the country with the most favorable circumstances, enhancing the efficiency of project promotion by minimizing opportunity cost. It also enables the developer to quantify qualitative factors, supporting diversified success strategies and policies for the targeted country. Although the performance of this study is limited by time, small sampling, and restricted access to materials, the analytic model can be improved more systematically through further study with more data. Developing a Build-Own (or Operate)-Transfer (BOT) nuclear power project, which commits large capital over the long term, requires well-made multi-criteria decisions at the outset to guard against risks from unexpected situations in the targeted countries. Moreover, since nuclear power projects are in most cases implemented through government-to-government cooperation, the key concern at the planning stage naturally falls on the country's situation rather than on project viability alone. In this regard, evaluating targeted countries before becoming involved in a project, making comprehensive and proper decisions on complex judgment factors, and efficiently integrating experts' opinions are required; prioritizing and evaluating the feasibility of countries to identify the optimal project region is therefore a meaningful study.

  16. Using a Merit-Based Scholarship Program to Increase Rates of College Enrollment in an Urban School District: The Case of the Pittsburgh Promise

    Science.gov (United States)

    Bozick, Robert; Gonzalez, Gabriella; Engberg, John

    2015-01-01

    The Pittsburgh Promise is a scholarship program that provides $5,000 per year toward college tuition for public high school graduates in Pittsburgh, Pennsylvania who earned a 2.5 GPA and a 90% attendance record. This study used a difference-in-difference design to assess whether the introduction of the Promise scholarship program directly…
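    The difference-in-difference design mentioned above compares changes over time between treated and comparison groups. A toy sketch with made-up enrollment rates (not Pittsburgh data):

```python
# Difference-in-difference sketch: the estimated effect is the change
# in the treated group's outcome minus the change in the comparison
# group's outcome over the same period (all rates below are made up).

def did_estimate(treat_pre, treat_post, control_pre, control_post):
    return (treat_post - treat_pre) - (control_post - control_pre)

# Hypothetical college-enrollment rates before/after a scholarship launch
effect = did_estimate(treat_pre=0.55, treat_post=0.63,
                      control_pre=0.50, control_post=0.52)
print(round(effect, 3))
```

    The comparison group's trend stands in for what would have happened to the treated group absent the program, which is the design's key (and untestable) parallel-trends assumption.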

  17. Microfluidic paper-based analytical devices for potential use in quantitative and direct detection of disease biomarkers in clinical analysis.

    Science.gov (United States)

    Lim, Wei Yin; Goh, Boon Tong; Khor, Sook Mei

    2017-08-15

    Clinicians working in the health-care diagnostic systems of developing countries currently face the challenges of rising costs, increased numbers of patient visits, and limited resources. A significant trend is using low-cost substrates to develop microfluidic devices for diagnostic purposes. Various fabrication techniques, materials, and detection methods have been explored to develop these devices. Microfluidic paper-based analytical devices (μPADs) have gained attention for sensing multiplex analytes, confirming diagnostic test results, rapid sample analysis, and reducing the volume of samples and analytical reagents. μPADs, which can provide accurate and reliable direct measurement without sample pretreatment, can reduce patient medical burden and yield rapid test results, aiding physicians in choosing appropriate treatment. The objectives of this review are to provide an overview of the strategies used for developing paper-based sensors with enhanced analytical performance and to discuss the current challenges, limitations, advantages, disadvantages, and future prospects of paper-based microfluidic platforms in clinical diagnostics. μPADs, with validated and justified analytical performance, can potentially improve quality of life by providing inexpensive, rapid, portable, biodegradable, and reliable diagnostics. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. New trends in analytical chemistry. Volume 2

    International Nuclear Information System (INIS)

    Zyka, J.

    1984-01-01

    The book consists of 8 chapters and describes modern methods of analytical chemistry. The chapters on Moessbauer spectroscopy, neutron activation analysis, and analytical uses of particle-induced characteristic X radiation (PIXE) describe the principles of these methods, the experimental equipment used, methods of evaluation, modifications of the methods, and examples of practical uses. (M.D.)

  19. Development of collaborative-creative learning model using virtual laboratory media for instrumental analytical chemistry lectures

    Science.gov (United States)

    Zurweni, Wibawa, Basuki; Erwin, Tuti Nurian

    2017-08-01

    The framework for teaching and learning in the 21st century was prepared with the 4Cs criteria. Learning that provides opportunities for the development of students' creative skills can be achieved by implementing collaborative learning, in which learners are challenged to compete, work independently, bring individual or group excellence, and master the learning material. A virtual laboratory is used as the medium for Instrumental Analytical Chemistry (Vis, UV-Vis, AAS, etc.) lectures through simulated computer applications, and serves as a substitute for the laboratory when equipment and instruments are not available. This research aims to design and develop a collaborative-creative learning model using virtual laboratory media for Instrumental Analytical Chemistry lectures and to assess its effectiveness; the design adapts the Dick & Carey and Hannafin & Peck models. The development steps of the model are: needs analysis, design of collaborative-creative learning, virtual laboratory media using Macromedia Flash, formative evaluation, and a test of the learning model's effectiveness. The stages of the collaborative-creative learning model are: apperception, exploration, collaboration, creation, evaluation, and feedback. The model can be used to improve the quality of learning in the classroom and to overcome the shortage of laboratory instruments for real instrumental analysis. Formative test results show that the collaborative-creative learning model developed meets the requirements. The pretest-posttest comparison was significant at the 95% confidence level, with the computed t-value exceeding the critical value. It can be concluded that this learning model is effective for Instrumental Analytical Chemistry lectures.
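    A virtual UV-Vis laboratory of the kind described would rest on the Beer-Lambert law, A = εlc. The sketch below simulates a calibration series and inverts the law for an unknown; all numbers are illustrative, not taken from the lectures.

```python
# Beer-Lambert sketch for a virtual UV-Vis lab: A = epsilon * l * c.
# Builds a synthetic calibration series, then inverts the law to
# estimate an "unknown" concentration (all values illustrative).

def absorbance(epsilon, path_cm, conc_molar):
    return epsilon * path_cm * conc_molar

epsilon = 1.2e4   # molar absorptivity, L mol^-1 cm^-1 (hypothetical dye)
path = 1.0        # cuvette path length, cm

# Simulated calibration standards
concs = [1e-5, 2e-5, 3e-5, 4e-5]
abss = [absorbance(epsilon, path, c) for c in concs]

# Invert the law for an "unknown" with measured A = 0.30
unknown_conc = 0.30 / (epsilon * path)
print(abss, unknown_conc)
```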

  20. Promising designs of compact heat exchangers for modular HTRs using the Brayton cycle

    International Nuclear Information System (INIS)

    Pra, Franck; Tochon, Patrice; Mauget, Christian; Fokkens, Jan; Willemsen, Sander

    2008-01-01

    The presented study was carried out within Work Package 2, 'Recuperator', of the High Temperature Reactor-E European program. High-temperature gas-cooled reactor concepts with a direct cycle have become potentially interesting for the future, since they theoretically provide higher efficiency than a classical steam cycle. Within the Brayton cycle, the helium/helium recuperator required to achieve the high efficiency has to work under very harsh conditions (temperature, pressure, and pressure difference between circuits). Within the project, the most promising technologies for the compact recuperator were investigated. First, the requirements for the recuperator to operate under the direct Brayton cycle were defined. Based on these requirements, the various potential technologies available on the market were investigated, and two (the HEATRIC printed circuit heat exchanger and the NORDON plate-fin concept) were selected as most promising. For the former, a precise description has been given and a mock-up has been fabricated and tested in the Claire loop at CEA, where it was subjected to thermal shocks considered representative for a recuperator. Prior to the experimental testing, coupled Computational Fluid Dynamics (CFD) and finite element analyses were performed to give insight into the thermal and mechanical behaviour of the mock-ups during the thermal shock. Based on these results, the experimental measuring program was optimized. Upon completion of the tests, the experimental and numerical results were compared. Based on the results of the investigation, recommendations are given for the full-size recuperator using the selected technologies.
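    For a balanced helium/helium recuperator, the textbook effectiveness-NTU relation for counterflow cores shows why such designs must be so compact (high area per volume); the sketch below uses the standard formula, not figures from the project reports.

```python
import math

# Textbook effectiveness-NTU relation for a counterflow heat exchanger;
# cr is the heat-capacity-rate ratio Cmin/Cmax (cr ~ 1 for a balanced
# helium/helium recuperator).

def counterflow_effectiveness(ntu, cr):
    if abs(cr - 1.0) < 1e-12:
        return ntu / (1.0 + ntu)  # balanced-flow limiting form
    e = math.exp(-ntu * (1.0 - cr))
    return (1.0 - e) / (1.0 - cr * e)

# Balanced flows: high effectiveness demands a large NTU, i.e. a very
# compact, high-surface-area core.
for ntu in (5, 10, 20):
    print(ntu, round(counterflow_effectiveness(ntu, 1.0), 3))
```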

  1. Analytical approximations for wide and narrow resonances

    International Nuclear Information System (INIS)

    Suster, Luis Carlos; Martinez, Aquilino Senra; Silva, Fernando Carvalho da

    2005-01-01

    This paper aims at developing analytical expressions for the adjoint neutron spectrum in the resonance energy region, taking into account both narrow and wide resonance approximations, in order to reduce the numerical computations involved. These analytical expressions, besides reducing computing time, are very simple from a mathematical point of view. The results obtained with this analytical formulation were compared to a reference solution obtained with a numerical method previously developed to solve the neutron balance adjoint equations. Narrow and wide resonances of U-238 were treated and the analytical procedure gave satisfactory results as compared with the reference solution, for the resonance energy range. The adjoint neutron spectrum is useful to determine the neutron resonance absorption, so that multigroup adjoint cross sections used by the adjoint diffusion equation can be obtained. (author)

  2. Analytical approximations for wide and narrow resonances

    Energy Technology Data Exchange (ETDEWEB)

    Suster, Luis Carlos; Martinez, Aquilino Senra; Silva, Fernando Carvalho da [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia. Programa de Engenharia Nuclear]. E-mail: aquilino@lmp.ufrj.br

    2005-07-01

    This paper aims at developing analytical expressions for the adjoint neutron spectrum in the resonance energy region, taking into account both narrow and wide resonance approximations, in order to reduce the numerical computations involved. These analytical expressions, besides reducing computing time, are very simple from a mathematical point of view. The results obtained with this analytical formulation were compared to a reference solution obtained with a numerical method previously developed to solve the neutron balance adjoint equations. Narrow and wide resonances of U-238 were treated and the analytical procedure gave satisfactory results as compared with the reference solution, for the resonance energy range. The adjoint neutron spectrum is useful to determine the neutron resonance absorption, so that multigroup adjoint cross sections used by the adjoint diffusion equation can be obtained. (author)

  3. Evaluating the Effectiveness of the Chemistry Education by Using the Analytic Hierarchy Process

    Science.gov (United States)

    Yüksel, Mehmet

    2012-01-01

    In this study, an attempt was made to develop a method of measurement and evaluation aimed at overcoming the difficulties encountered in the determination of the effectiveness of chemistry education based on the goals of chemistry education. An Analytic Hierarchy Process (AHP), which is a multi-criteria decision technique, is used in the present…

  4. Characterization of dilation-analytic operators

    Energy Technology Data Exchange (ETDEWEB)

    Balslev, E; Grossmann, A; Paul, T

    1986-01-01

    Dilation analytic vectors and operators are characterized in a new representation of quantum mechanical states through functions analytic on the upper half-plane. In this space H₀-bounded operators are integral operators and criteria for dilation analyticity are given in terms of analytic continuation outside of the half-plane for functions and for kernels. A sufficient condition is given for an integral operator in momentum space to be dilation-analytic.

  5. Some analytical aspects of the Moessbauer spectrometry

    International Nuclear Information System (INIS)

    Meisel, W.

    1975-01-01

    Analytical applications of Moessbauer spectrometry are reviewed. Various methods of analysis (qualitative, semiquantitative and quantitative) using the Moessbauer effect are dealt with. Sensitivity and accuracy of Moessbauer spectrometry in analytical applications are discussed. (Z.S.)

  6. Analytical and computational study of magnetization switching in kinetic Ising systems with demagnetizing fields

    DEFF Research Database (Denmark)

    Richards, H.L.; Rikvold, P.A.

    1996-01-01

    …particularly promising as materials for high-density magnetic recording media. In this paper we use analytic arguments and Monte Carlo simulations to quantitatively study the effects of the demagnetizing field on the dynamics of magnetization switching in two-dimensional, single-domain, kinetic Ising systems. For systems in the weak-field "stochastic region," where magnetization switching is on average effected by the nucleation and growth of a single droplet, the simulation results can be explained by a simple model in which the free energy is a function only of magnetization. In the intermediate-field "multidroplet region," a generalization of Avrami's law involving a magnetization-dependent effective magnetic field gives good agreement with the simulations. The effects of the demagnetizing field do not qualitatively change the droplet-theoretical picture of magnetization switching in highly anisotropic…

  7. Young children mostly keep, and expect others to keep, their promises.

    Science.gov (United States)

    Kanngiesser, Patricia; Köymen, Bahar; Tomasello, Michael

    2017-07-01

    Promises are speech acts that create an obligation to do the promised action. In three studies, we investigated whether 3- and 5-year-olds (N=278) understand the normative implications of promising in prosocial interactions. In Study 1, children helped a partner who promised to share stickers. When the partner failed to uphold the promise, 3- and 5-year-olds protested and referred to promise norms. In Study 2, when children in this same age range were asked to promise to continue a cleaning task-and they agreed-they persisted longer on the task and mentioned their obligation more frequently than without such a promise. They also persisted longer after a promise than after a cleaning reminder (Study 3). In prosocial interactions, thus, young children feel a normative obligation to keep their promises and expect others to keep their promises as well. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Analytical Pyrolysis-Chromatography: Something Old, Something New

    Science.gov (United States)

    Bower, Nathan W.; Blanchet, Conor J. K.

    2010-01-01

    Despite a long history of use across multiple disciplines, analytical pyrolysis is rarely taught in undergraduate curricula. We briefly review some interesting applications and discuss the three types of analytical pyrolyzers available commercially. We also describe a low-cost alternative that can be used to teach the basic principles of…

  9. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data.

    Science.gov (United States)

    Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin

    2011-03-16

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.

  10. Using Analytics to Support Petabyte-Scale Science on the NASA Earth Exchange (NEX)

    Science.gov (United States)

    Votava, P.; Michaelis, A.; Ganguly, S.; Nemani, R. R.

    2014-12-01

    …and other attributes that can then be analyzed as a part of the NEX knowledge graph and used to greatly improve advanced search capabilities. Overall, we see data analytics at all levels as an important part of NEX as we are continuously seeking improvements in data management, workflow processing, use of resources, usability and science acceleration.

  11. SeeDB: Efficient Data-Driven Visualization Recommendations to Support Visual Analytics.

    Science.gov (United States)

    Vartak, Manasi; Rahman, Sajjadur; Madden, Samuel; Parameswaran, Aditya; Polyzotis, Neoklis

    2015-09-01

    Data analysts often build visualizations as the first step in their analytical workflow. However, when working with high-dimensional datasets, identifying visualizations that show relevant or desired trends in data can be laborious. We propose SeeDB, a visualization recommendation engine to facilitate fast visual analysis: given a subset of data to be studied, SeeDB intelligently explores the space of visualizations, evaluates promising visualizations for trends, and recommends those it deems most "useful" or "interesting". The two major obstacles in recommending interesting visualizations are (a) scale: evaluating a large number of candidate visualizations while responding within interactive time scales, and (b) utility: identifying an appropriate metric for assessing interestingness of visualizations. For the former, SeeDB introduces pruning optimizations to quickly identify high-utility visualizations and sharing optimizations to maximize sharing of computation across visualizations. For the latter, as a first step, we adopt a deviation-based metric for visualization utility, while indicating how we may be able to generalize it to other factors influencing utility. We implement SeeDB as a middleware layer that can run on top of any DBMS. Our experiments show that our framework can identify interesting visualizations with high accuracy. Our optimizations lead to multiple orders of magnitude speedup on relational row and column stores and provide recommendations at interactive time scales. Finally, we demonstrate via a user study the effectiveness of our deviation-based utility metric and the value of recommendations in supporting visual analytics.
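
    The deviation-based utility metric described above can be sketched in a few lines. The toy relation, the grouping column, and the use of L2 distance below are all illustrative choices, not SeeDB's exact implementation: a candidate view scores highly when the aggregate distribution on the queried subset deviates strongly from the distribution on the whole table.

```python
import math
from collections import defaultdict

# Toy relation: (region, product, sales) rows. The candidate "view" is
# GROUP BY product, SUM(sales); the queried subset is region == "EU".
rows = [("EU", "a", 10), ("EU", "b", 90), ("EU", "c", 20),
        ("US", "a", 50), ("US", "b", 50), ("US", "c", 20)]

def distribution(subset):
    # Normalized aggregate: each product's share of total sales.
    agg = defaultdict(float)
    for _, product, sales in subset:
        agg[product] += sales
    total = sum(agg.values())
    return {k: v / total for k, v in agg.items()}

target = distribution([r for r in rows if r[0] == "EU"])
reference = distribution(rows)

# Deviation-based utility: distance between the subset's distribution and
# the whole table's (L2 here; other distances would serve the same role).
utility = math.sqrt(sum((target.get(k, 0.0) - reference.get(k, 0.0)) ** 2
                        for k in set(target) | set(reference)))
print(round(utility, 4))
```

    Ranking every (dimension, measure, aggregate) candidate by this score and returning the top few is the essence of the recommendation step; the pruning and sharing optimizations exist to avoid evaluating every candidate in full.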

  12. Characterization of Analytical Reference Glass-1 (ARG-1)

    International Nuclear Information System (INIS)

    Smith, G.L.

    1993-12-01

    High-level radioactive waste may be immobilized in borosilicate glass at the West Valley Demonstration Project, West Valley, New York, the Defense Waste Processing Facility (DWPF), Aiken, South Carolina, and the Hanford Waste Vitrification Project (HWVP), Richland, Washington. The vitrified waste form will be stored in stainless steel canisters before its eventual transfer to a geologic repository for long-term disposal. Waste Acceptance Product Specifications (WAPS) (DOE 1993), Section 1.1.2, requires that the waste form producers report the measured chemical composition of the vitrified waste in their production records before disposal. Chemical analysis of glass waste forms is receiving increased attention due to qualification requirements of vitrified waste forms. The Pacific Northwest Laboratory (PNL) has been supporting the glass producers' analytical laboratories by a continuing program of multilaboratory analytical testing using interlaboratory "round robin" methods. At the PNL Materials Characterization Center Analytical Round Robin 4 workshop "Analysis of Nuclear Waste Glass and Related Materials," January 16-17, 1990, Pleasanton, California, the meeting attendees decided that simulated nuclear waste analytical reference glasses were needed for use as analytical standards. Use of common standard analytical reference materials would allow the glass producers' analytical laboratories to calibrate procedures and instrumentation, to control laboratory performance and conduct self-appraisals, and to help qualify their various waste forms.

  13. Smart city analytics

    DEFF Research Database (Denmark)

    Hansen, Casper; Hansen, Christian; Alstrup, Stephen

    2017-01-01

    We present an ensemble learning method that predicts large increases in the hours of home care received by citizens. The method is supervised, and uses different ensembles of either linear (logistic regression) or non-linear (random forests) classifiers. Experiments with data available from 2013 to 2017 for every citizen in Copenhagen receiving home care (27,775 citizens) show that prediction can achieve state of the art performance as reported in similar health related domains (AUC=0.715). We further find that competitive results can be obtained by using limited information for training, which is very useful when full records are not accessible or available. Smart city analytics does not necessarily require full city records. To our knowledge this preliminary study is the first to predict large increases in home care for smart city analytics.
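
    The modelling setup described (a linear classifier versus a non-linear ensemble, evaluated by AUC) can be sketched with scikit-learn. The Copenhagen home-care records are not public, so the synthetic dataset, class imbalance, and resulting scores below are purely illustrative, not a reproduction of the study.

```python
# Synthetic stand-in for the (non-public) home-care data; all numbers are
# illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# One linear and one non-linear classifier, compared by AUC as in the study.
aucs = {}
for name, clf in [("logistic regression", LogisticRegression(max_iter=1000)),
                  ("random forest", RandomForestClassifier(n_estimators=200,
                                                           random_state=0))]:
    clf.fit(X_tr, y_tr)
    aucs[name] = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(aucs)
```

    Stratified splitting and a probability-based metric such as AUC matter here because the positive class (large increases in care hours) is rare, so raw accuracy would be misleading.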

  14. Single-analyte to multianalyte fluorescence sensors

    Science.gov (United States)

    Lavigne, John J.; Metzger, Axel; Niikura, Kenichi; Cabell, Larry A.; Savoy, Steven M.; Yoo, J. S.; McDevitt, John T.; Neikirk, Dean P.; Shear, Jason B.; Anslyn, Eric V.

    1999-05-01

    The rational design of small molecules for the selective complexation of analytes has reached a level of sophistication such that there exists a high degree of prediction. An effective strategy for transforming these hosts into sensors involves covalently attaching a fluorophore to the receptor which displays some fluorescence modulation when analyte is bound. Competition methods, such as those used with antibodies, are also amenable to these synthetic receptors, yet there are few examples. In our laboratories, the use of common dyes in competition assays with small molecules has proven very effective. For example, an assay for citrate in beverages and an assay for the secondary messenger IP3 in cells have been developed. Another approach we have explored focuses on multi-analyte sensor arrays that attempt to mimic the mammalian sense of taste. Our system utilizes polymer resin beads with the desired sensors covalently attached. These functionalized microspheres are then immobilized into micromachined wells on a silicon chip, thereby creating our taste buds. Exposure of the resin to analyte causes a change in the transmittance of the bead. This change can be fluorescent or colorimetric. Optical interrogation of the microspheres, by illuminating from one side of the wafer and collecting the signal on the other, results in an image. These data streams are collected using a CCD camera, which creates red, green and blue (RGB) patterns that are distinct and reproducible for their environments. Analysis of these data can identify and quantify the analytes present.
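
    The identification step, matching a bead's measured RGB pattern against patterns recorded for known analytes, could be sketched as a nearest-centroid lookup. The reference values and the analyte names below are invented for illustration; the paper does not publish its calibration data.

```python
import math

# Hypothetical RGB reference patterns for trained beads (invented values,
# not the paper's calibration data).
reference = {
    "citrate": (120, 200, 80),
    "IP3":     (60, 90, 220),
    "blank":   (180, 180, 180),
}

def identify(rgb):
    # Nearest-centroid match of a measured RGB pattern to the references.
    return min(reference, key=lambda analyte: math.dist(rgb, reference[analyte]))

print(identify((115, 190, 90)))  # citrate
```

    Quantification would extend this by interpolating the measured pattern along a concentration calibration curve rather than snapping to the nearest reference.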

  15. Exploring maintenance policy selection using the Analytic Hierarchy Process : an application for naval ships

    NARCIS (Netherlands)

    Goossens, A.J.M.; Basten, R.J.I.

    2015-01-01

    In this paper we investigate maintenance policy selection (MPS) through the use of the Analytic Hierarchy Process (AHP). A maintenance policy is a policy that dictates which parameter triggers a maintenance action. In practice, selecting the right maintenance policy appears to be a difficult…

  16. Analytical Prediction of the Spin Stabilized Satellite's Attitude Using The Solar Radiation Torque

    International Nuclear Information System (INIS)

    Motta, G B; Carvalho, M V; Zanardi, M C

    2013-01-01

    The aim of this paper is to present an analytical solution for the spin motion equations of spin-stabilized satellite considering only the influence of solar radiation torque. The theory uses a cylindrical satellite on a circular orbit and considers that the satellite is always illuminated. The average components of this torque were determined over an orbital period. These components are substituted in the spin motion equations in order to get an analytical solution for the right ascension and declination of the satellite spin axis. The time evolution for the pointing deviation of the spin axis was also analyzed. These solutions were numerically implemented and compared with real data of the Brazilian Satellite of Data Collection – SCD1 an SCD2. The results show that the theory has consistency and can be applied to predict the spin motion of spin-stabilized artificial satellites

  17. Analytical admittance characterization of high mobility channel

    Energy Technology Data Exchange (ETDEWEB)

    Mammeri, A. M.; Mahi, F. Z., E-mail: fati-zo-mahi2002@yahoo.fr [Institute of Science and Technology, University of Bechar (Algeria); Varani, L. [Institute of Electronics of the South (IES - CNRS UMR 5214), University of Montpellier (France)

    2015-03-30

    In this contribution, we investigate the small-signal admittance of high-electron-mobility-transistor field-effect channels under continuous branching of the current between channel and gate, using an analytical model. The analytical approach takes into account the linearization of the 2D Poisson equation and the drift current along the channel. The analytical equations describe how the frequency dependence of the admittance at the source and drain terminals varies with the geometrical transistor parameters.

  18. SRL online Analytical Development

    International Nuclear Information System (INIS)

    Jenkins, C.W.

    1991-01-01

    The Savannah River Site is operated by the Westinghouse Savannah River Co. for the Department of Energy to produce special nuclear materials for defense. R&D support for site programs is provided by the Savannah River Laboratory, which I represent. The site is known primarily for its nuclear reactors, but actually three fourths of the efforts at the site are devoted to fuel/target fabrication, fuel/target reprocessing, and waste management. All of these operations rely heavily on chemical processes. The site is therefore a large chemical plant. There are then many potential applications for process analytical chemistry at SRS. The Savannah River Laboratory (SRL) has an Analytical Development Section of roughly 65 personnel that perform analyses for R&D efforts at the lab, act as backup to the site Analytical Laboratories Department, and develop analytical methods and instruments. I manage a subgroup of the Analytical Development Section called the Process Control & Analyzer Development Group. The prime mission of this group is to develop online/at-line analytical systems for site applications.

  19. Analytical mechanics

    CERN Document Server

    Lemos, Nivaldo A

    2018-01-01

    Analytical mechanics is the foundation of many areas of theoretical physics including quantum theory and statistical mechanics, and has wide-ranging applications in engineering and celestial mechanics. This introduction to the basic principles and methods of analytical mechanics covers Lagrangian and Hamiltonian dynamics, rigid bodies, small oscillations, canonical transformations and Hamilton–Jacobi theory. This fully up-to-date textbook includes detailed mathematical appendices and addresses a number of advanced topics, some of them of a geometric or topological character. These include Bertrand's theorem, proof that action is least, spontaneous symmetry breakdown, constrained Hamiltonian systems, non-integrability criteria, KAM theory, classical field theory, Lyapunov functions, geometric phases and Poisson manifolds. Providing worked examples, end-of-chapter problems, and discussion of ongoing research in the field, it is suitable for advanced undergraduate students and graduate students studying analytical mechanics.

  20. Measurement of company effectiveness using analytic network process method

    Science.gov (United States)

    Goran, Janjić; Zorana, Tanasić; Borut, Kosec

    2017-07-01

    The sustainable development of an organisation is monitored through the organisation's performance, which beforehand incorporates all stakeholders' requirements in its strategy. The strategic management concept enables organisations to monitor and evaluate their effectiveness along with efficiency by monitoring the implementation of set strategic goals. In the process of monitoring and measuring effectiveness, an organisation can use multiple-criteria decision-making methods as help. This study uses the method of analytic network process (ANP) to define the weight factors of the mutual influences of all the important elements of an organisation's strategy. The calculation of an organisation's effectiveness is based on the weight factors and the degree of fulfilment of the goal values of the strategic map measures. New business conditions change the importance of certain elements of an organisation's business in relation to competitive advantage on the market, with increasing emphasis given to non-material resources in the process of selecting the organisation's most important measures.
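
    The effectiveness calculation described, combining ANP-derived weight factors with the degree of fulfilment of the strategic-map goal values, reduces to a weighted sum once the weights are fixed. The measures, weights, and fulfilment scores below are hypothetical, chosen only to show the aggregation step; deriving the weights themselves requires the full ANP pairwise-comparison procedure.

```python
# Hypothetical strategy-map measures: (ANP-derived weight factor, degree of
# fulfilment of the goal value, where 1.0 = target fully met).
measures = {
    "customer satisfaction": (0.40, 0.85),
    "process efficiency":    (0.35, 0.70),
    "employee development":  (0.25, 0.90),
}

# Organisational effectiveness: fulfilment aggregated over all measures,
# weighted by the ANP weight factors (which sum to one).
effectiveness = sum(w * f for w, f in measures.values())
print(round(effectiveness, 3))  # 0.81
```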

  1. CALIBRATION OF SEMI-ANALYTIC MODELS OF GALAXY FORMATION USING PARTICLE SWARM OPTIMIZATION

    International Nuclear Information System (INIS)

    Ruiz, Andrés N.; Domínguez, Mariano J.; Yaryura, Yamila; Lambas, Diego García; Cora, Sofía A.; Martínez, Cristian A. Vega-; Gargiulo, Ignacio D.; Padilla, Nelson D.; Tecce, Tomás E.; Orsi, Álvaro; Arancibia, Alejandra M. Muñoz

    2015-01-01

    We present a fast and accurate method to select an optimal set of parameters in semi-analytic models of galaxy formation and evolution (SAMs). Our approach compares the results of a model against a set of observables applying a stochastic technique called Particle Swarm Optimization (PSO), a self-learning algorithm for localizing regions of maximum likelihood in multidimensional spaces that outperforms traditional sampling methods in terms of computational cost. We apply the PSO technique to the SAG semi-analytic model combined with merger trees extracted from a standard Lambda Cold Dark Matter N-body simulation. The calibration is performed using a combination of observed galaxy properties as constraints, including the local stellar mass function and the black hole to bulge mass relation. We test the ability of the PSO algorithm to find the best set of free parameters of the model by comparing the results with those obtained using a MCMC exploration. Both methods find the same maximum likelihood region, however, the PSO method requires one order of magnitude fewer evaluations. This new approach allows a fast estimation of the best-fitting parameter set in multidimensional spaces, providing a practical tool to test the consequences of including other astrophysical processes in SAMs
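
    The PSO loop used for calibration can be sketched on a toy objective. Here a simple chi-square over two parameters stands in for the comparison of semi-analytic-model outputs against the observational constraints, and the swarm size, inertia, and acceleration coefficients are illustrative defaults, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the model-vs-observables likelihood: a chi-square that
# is minimized at the (hypothetical) true parameter vector.
true_p = np.array([1.5, -0.5])
def chi2(p):
    return np.sum((p - true_p) ** 2)

# Minimal PSO: each particle tracks a personal best; the swarm shares a
# global best. Velocities mix inertia, cognitive and social terms.
n_particles, n_iter = 20, 100
w, c1, c2 = 0.7, 1.5, 1.5
pos = rng.uniform(-5, 5, (n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([chi2(p) for p in pos])
g = pbest[np.argmin(pbest_val)]

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
    pos += vel
    vals = np.array([chi2(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    g = pbest[np.argmin(pbest_val)]

print(g)  # converges toward (1.5, -0.5)
```

    The cost advantage reported over MCMC comes from exactly this structure: each iteration evaluates the model only once per particle, and the shared global best steers the whole swarm toward the maximum-likelihood region.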

  2. CALIBRATION OF SEMI-ANALYTIC MODELS OF GALAXY FORMATION USING PARTICLE SWARM OPTIMIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Ruiz, Andrés N.; Domínguez, Mariano J.; Yaryura, Yamila; Lambas, Diego García [Instituto de Astronomía Teórica y Experimental, CONICET-UNC, Laprida 854, X5000BGR, Córdoba (Argentina); Cora, Sofía A.; Martínez, Cristian A. Vega-; Gargiulo, Ignacio D. [Consejo Nacional de Investigaciones Científicas y Técnicas, Rivadavia 1917, C1033AAJ Buenos Aires (Argentina); Padilla, Nelson D.; Tecce, Tomás E.; Orsi, Álvaro; Arancibia, Alejandra M. Muñoz, E-mail: andresnicolas@oac.uncor.edu [Instituto de Astrofísica, Pontificia Universidad Católica de Chile, Av. Vicuña Mackenna 4860, Santiago (Chile)

    2015-03-10

    We present a fast and accurate method to select an optimal set of parameters in semi-analytic models of galaxy formation and evolution (SAMs). Our approach compares the results of a model against a set of observables applying a stochastic technique called Particle Swarm Optimization (PSO), a self-learning algorithm for localizing regions of maximum likelihood in multidimensional spaces that outperforms traditional sampling methods in terms of computational cost. We apply the PSO technique to the SAG semi-analytic model combined with merger trees extracted from a standard Lambda Cold Dark Matter N-body simulation. The calibration is performed using a combination of observed galaxy properties as constraints, including the local stellar mass function and the black hole to bulge mass relation. We test the ability of the PSO algorithm to find the best set of free parameters of the model by comparing the results with those obtained using a MCMC exploration. Both methods find the same maximum likelihood region, however, the PSO method requires one order of magnitude fewer evaluations. This new approach allows a fast estimation of the best-fitting parameter set in multidimensional spaces, providing a practical tool to test the consequences of including other astrophysical processes in SAMs.

  3. Analytical challenges in sports drug testing.

    Science.gov (United States)

    Thevis, Mario; Krug, Oliver; Geyer, Hans; Walpurgis, Katja; Baume, Norbert; Thomas, Andreas

    2018-03-01

    Analytical chemistry represents a central aspect of doping controls. Routine sports drug testing approaches are primarily designed to address the question whether a prohibited substance is present in a doping control sample and whether prohibited methods (for example, blood transfusion or sample manipulation) have been conducted by an athlete. As some athletes have availed themselves of the substantial breadth of research and development in the pharmaceutical arena, proactive and preventive measures are required, such as the early implementation of new drug candidates and corresponding metabolites into routine doping control assays, even though these drug candidates are to date not approved for human use. Beyond this, analytical data are also cornerstones of investigations into atypical or adverse analytical findings, where the overall picture provides ample reason for follow-up studies. Such studies have been of a most diverse nature, and tailored approaches have been required to probe hypotheses and scenarios reported by the involved parties concerning the plausibility and consistency of statements and (analytical) facts. In order to outline the variety of challenges that doping control laboratories are facing besides providing optimal detection capabilities and analytical comprehensiveness, selected case vignettes involving the follow-up of unconventional adverse analytical findings, urine sample manipulation, drug/food contamination issues, and unexpected biotransformation reactions are discussed.

  4. Promising Compilation to ARMv8 POP

    OpenAIRE

    Podkopaev, Anton; Lahav, Ori; Vafeiadis, Viktor

    2017-01-01

    We prove the correctness of compilation of relaxed memory accesses and release-acquire fences from the "promising" semantics of [Kang et al. POPL'17] to the ARMv8 POP machine of [Flur et al. POPL'16]. The proof is highly non-trivial because both the ARMv8 POP and the promising semantics provide some extremely weak consistency guarantees for normal memory accesses; however, they do so in rather different ways. Our proof of compilation correctness to ARMv8 POP strengthens the results of Kang et al. [POPL'17].

  5. Analytical chemistry: Principles and techniques

    International Nuclear Information System (INIS)

    Hargis, L.G.

    1988-01-01

    Although this text seems to have been intended for use in a one-semester course in undergraduate analytical chemistry, it includes the range of topics usually encountered in a two-semester introductory course in chemical analysis. The material is arranged logically for use in a two-semester course: the first 12 chapters contain the subjects most often covered in the first term, and the next 10 chapters pertain to the second (instrumental) term. Overall breadth and level of treatment are standard for an undergraduate text of this sort, and the only major omission is that of kinetic methods (which is a common omission in analytical texts). In the first 12 chapters coverage of the basic material is quite good. The emphasis on the underlying principles of the techniques rather than on specifics and design of instrumentation is welcomed. This text may be more useful for the instrumental portion of an analytical chemistry course than for the solution chemistry segment. The instrumental analysis portion is appropriate for an introductory textbook.

  6. Analytical chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-15

    This book gives explanations on analytical chemistry in ten chapters, which deal with the development of analytical chemistry; the theory of error, with definitions and classifications; sampling and sample treatment; gravimetry, covering the general process of gravimetry in aqueous and non-aqueous solution; precipitation titration, covering precipitation reactions and their types; complexometry, with a summary and complex compounds; oxidation-reduction equilibria, covering electrode potential and potentiometric titration; solvent extraction and chromatography; and experiments, with basic operations for chemical experiments.

  7. Analytical chemistry

    International Nuclear Information System (INIS)

    Chae, Myeong Hu; Lee, Hu Jun; Kim, Ha Seok

    1989-02-01

    This book gives explanations on analytical chemistry in ten chapters, which deal with the development of analytical chemistry; the theory of error, with definitions and classifications; sampling and sample treatment; gravimetry, covering the general process of gravimetry in aqueous and non-aqueous solution; precipitation titration, covering precipitation reactions and their types; complexometry, with a summary and complex compounds; oxidation-reduction equilibria, covering electrode potential and potentiometric titration; solvent extraction and chromatography; and experiments, with basic operations for chemical experiments.

  8. Analytical chemistry department. Annual report, 1977

    International Nuclear Information System (INIS)

    Knox, E.M.

    1978-09-01

    The annual report describes the analytical methods, analyses and equipment developed or adopted for use by the Analytical Chemistry Department during 1977. The individual articles range from a several page description of development and study programs to brief one paragraph descriptions of methods adopted for use with or without some modification. This year, we have included a list of the methods incorporated into our Analytical Chemistry Methods Manual. This report is organized into laboratory sections within the Department as well as major programs within General Atomic Company. Minor programs and studies are included under Miscellaneous. The analytical and technical support activities for GAC include gamma-ray spectroscopy, radiochemistry, activation analysis, gas chromatography, atomic absorption, spectrophotometry, emission spectroscopy, x-ray diffractometry, electron microprobe, titrimetry, gravimetry, and quality control. Services are provided to all organizations throughout General Atomic Company. The major effort, however, is in support of the research and development programs within HTGR Generic Technology Programs ranging from new fuel concepts, end-of-life studies, and irradiated capsules to fuel recycle studies

  9. Research prioritization using the Analytic Hierarchy Process: basic methods. Volume 1

    International Nuclear Information System (INIS)

    Vesely, W.E.; Shafaghi, A.; Gary, I. Jr.; Rasmuson, D.M.

    1983-08-01

    This report describes a systematic approach for prioritizing research needs and research programs. The approach is formally called the Analytic Hierarchy Process, which was developed by T.L. Saaty and is described in several of his texts referenced in the report. The Analytic Hierarchy Process, or AHP for short, has been applied to a wide variety of prioritization problems and has a good record of success, as documented in Saaty's texts. The report develops specific guidelines for constructing the hierarchy and for prioritizing the research programs. Specific examples are given to illustrate the steps in the AHP. As part of the work, a computer code has been developed, and the use of the code is described. The code allows the prioritizations to be done in a codified and efficient manner; sensitivity and parametric studies can also be straightforwardly performed to gain a better understanding of the prioritization results. Finally, as an important part of the work, an approach is developed which utilizes probabilistic risk analyses (PRAs) to systematically identify and prioritize research needs and research programs. When utilized in an AHP framework, the PRAs performed to date provide a powerful information source for focusing research on those areas most impacting risk and risk uncertainty.
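
    The core AHP computation described above (pairwise comparison matrix → principal-eigenvector priority weights → consistency check) can be sketched in a few lines. The matrix values and the choice of three hypothetical research programs below are illustrative only, not taken from the report:

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three research programs
# (Saaty's 1-9 scale; entry [i, j] = relative importance of program i over j,
# with the reciprocal on the transposed entry).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority weights = principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio: CI = (lambda_max - n) / (n - 1), CR = CI / RI,
# where RI is Saaty's random index (0.58 for n = 3).
n = A.shape[0]
lambda_max = eigvals[k].real
CR = ((lambda_max - n) / (n - 1)) / 0.58

print(weights)   # priority ranking of the three programs
print(CR < 0.1)  # judgments are acceptably consistent if CR < 0.1
```

    A CR above 0.1 conventionally signals that the pairwise judgments should be revisited before the weights are used for prioritization.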

  10. The SRS analytical laboratories strategic plan

    International Nuclear Information System (INIS)

    Hiland, D.E.

    1993-01-01

    There is an acute shortage of Savannah River Site (SRS) analytical laboratory capacity to support key Department of Energy (DOE) environmental restoration and waste management (EM) programs while making the transition from traditional defense program (DP) missions as a result of the cessation of the Cold War. This motivated Westinghouse Savannah River Company (WSRC) to develop an "Analytical Laboratories Strategic Plan" (ALSP) in order to provide appropriate input to SRS operating plans and justification for proposed analytical laboratory projects. The methodology used to develop this plan is applicable to all types of strategic planning.

  11. The Promise of a College Scholarship Transforms a District

    Science.gov (United States)

    Ritter, Gary W.; Ash, Jennifer

    2016-01-01

    Promise programs are place-based scholarships, generally tied to a city or school district, offering near-universal access to all living in the "place." While Promise programs share some characteristics with other scholarship programs, they are unique because they seek to change communities and schools. Underlying such Promise programs is…

  12. 76 FR 41747 - Protection of Stratospheric Ozone: Extension of Global Laboratory and Analytical Use Exemption...

    Science.gov (United States)

    2011-07-15

    ... these laboratory procedures would be permitted. In the supply chain, ODS distributors would not be able... risks. H. Executive Order 13211: Actions That Significantly Affect Energy Supply, Distribution, or Use... laboratory and analytical uses that have not been already identified by EPA as nonessential. EPA is also...

  13. ESIP Earth Sciences Data Analytics (ESDA) Cluster - Work in Progress

    Science.gov (United States)

    Kempler, Steven

    2015-01-01

    The purpose of this poster is to promote a common understanding of the usefulness of, and activities that pertain to, data analytics and, more broadly, the data scientist; to facilitate collaborations to better understand the cross usage of heterogeneous datasets and to provide accommodating data analytics expertise, now and as the needs evolve into the future; and to identify gaps that, once filled, will further collaborative activities. Objectives: provide a forum for academic discussions that gives ESIP members a better understanding of the various aspects of Earth science data analytics; bring in guest speakers to describe external efforts and further teach us about the broader use of data analytics; perform activities that compile use cases generated from specific community needs to cross-analyze heterogeneous data, compile sources of analytics tools (in particular, to satisfy the needs of the above data users), examine gaps between needs and sources, examine gaps between needs and community expertise, and document the specific data analytics expertise needed to perform Earth science data analytics; and seek graduate data science student internship opportunities.

  14. A REVIEW ON PREDICTIVE ANALYTICS IN DATA MINING

    OpenAIRE

    Arumugam.S

    2016-01-01

    The main process of data mining is to collect, extract, and store valuable information, and many enterprises now do this actively. Predictive analytics is a branch of advanced analytics used mainly to make predictions about unknown future events. Predictive analytics uses various techniques from machine learning, statistics, data mining, modeling, and artificial intelligence to analyze current data and make predictions about futu...

  15. Analytics of biometric data from wearable devices to support teaching and learning activities

    Directory of Open Access Journals (Sweden)

    Francisco de Arriba Pérez

    2016-03-01

    This paper introduces the preliminary results of a piece of research whose main purpose is to take advantage of data collected from wearable devices to support learning processes. This goal is approached through the application of learning analytics techniques. The innovation lies in the use of data collected from wearables in conjunction with data collected from other sources (e.g., Learning Management Systems, Student Information Systems). The paper reviews the results achieved during the last year on the relationships between biometric data collected from wearables and relevant features described in the educational literature. In this way, sleep and stress have been identified as interesting areas that could be informed by data collected from wearables and processed by applying machine learning techniques. Our preliminary results are promising but need further validation; they also show an interesting opportunity to support awareness and intervention functionalities.

  16. Analytical calculation of magnet interactions in 3D

    OpenAIRE

    Yonnet , Jean-Paul; Allag , Hicham

    2009-01-01

    A synthesis of all the analytical expressions of the interaction energy, force components, and torque components is presented. It allows the analytical calculation of all the interactions when the magnetizations are in any direction. The 3D analytical expressions are difficult to obtain, but the torque and force expressions are very simple to use.

  17. Analytic Potentials for Realistic Electrodes

    International Nuclear Information System (INIS)

    Barlow, Stephan E.; Taylor, Aimee E.; Swanson, Kenneth R.

    2002-01-01

    Finite difference algorithms are widely used to numerically solve Laplace's equation for electrode structures that are not amenable to analytic treatment. This includes essentially all real situations. However, in many cases it is desirable to have the solution in analytic form. A common practice is to fit the numerical solution, either by least squares or by a cubic spline approach. Neither of these approaches is truly accurate, nor does either produce unique results. These limitations are avoided by our approach.

  18. Analytics to Better Interpret and Use Large Amounts of Heterogeneous Data

    Science.gov (United States)

    Mathews, T. J.; Baskin, W. E.; Rinsland, P. L.

    2014-12-01

    Data scientists at NASA's Atmospheric Science Data Center (ASDC) are seasoned software application developers who have worked with the creation, archival, and distribution of large datasets (multiple terabytes and larger). For ASDC data scientists to implement the most efficient processes for cataloging and organizing data access applications, they must be intimately familiar with the data contained in the datasets with which they are working. Key technologies in the background of ASDC data scientists include large RDBMSs (relational database management systems) and NoSQL databases; web services; service-oriented architectures; structured and unstructured data access; and processing algorithms. However, as prices of data storage and processing decrease, sources of data increase, and technologies advance - granting more people access to data in real or near-real time - data scientists are being pressured to accelerate their ability to identify and analyze vast amounts of data, which is becoming increasingly challenging with existing tools. For example, NASA's Earth Science Data and Information System (ESDIS) alone grew from just over 4 PB of data in 2009 to nearly 6 PB in 2011, and then to roughly 10 PB in 2013. With data from at least ten new missions to be added to the ESDIS holdings by 2017, the current volume will continue to grow exponentially and drive the need to analyze more data even faster. Though there are many highly efficient, off-the-shelf analytics tools available, these tools mainly cater to business data, which is predominantly unstructured. Very few known analytics tools interface well with archived Earth science data, which is predominantly heterogeneous and structured. This presentation will identify use cases for data analytics from an Earth science perspective in order to begin to identify

  19. The role of entry into regional markets in fulfilling brand promise

    Directory of Open Access Journals (Sweden)

    Ali Ghasemi

    2014-12-01

    This paper presents an empirical investigation of the role of entry into regional markets in fulfilling brand promise. The study designed two questionnaires, one measuring brand promise and the other measuring export capabilities, on a Likert scale, and distributed them among 250 randomly selected producers involved in the production and development of various products in the city of Esfahan, Iran. Cronbach's alphas were calculated for brand promise and export capabilities as 0.856 and 0.812, respectively. Using structural equation modeling, the study detected seven factors: product development, public advocacy, strategic orientation, customer satisfaction, competitive pressures, organizational capabilities, and distribution strategies.

  20. Land-use regime shifts: an analytical framework and agenda for future land-use research

    Directory of Open Access Journals (Sweden)

    Navin Ramankutty

    2016-06-01

    A key research frontier in global change research lies in understanding processes of land change to inform predictive models of future land states. We believe that significant advances in the field are hampered by limited attention being paid to critical points of change termed land-use regime shifts. We present an analytical framework for understanding land-use regime shifts. We survey historical events of land change and perform in-depth case studies of soy and shrimp development in Latin America to demonstrate the role of preconditions, triggers, and self-reinforcing processes in driving land-use regime shifts. Whereas the land-use literature demonstrates a good understanding of within-regime dynamics, our understanding of the drivers of land-use regime shifts is limited to ex post facto explications. Theoretical and empirical advances are needed to better understand the dynamics and implications of land-use regime shifts. We draw insights from the regime-shifts literature to propose a research agenda for studying land change.

  1. Analytic solution for American strangle options using Laplace-Carson transforms

    Science.gov (United States)

    Kang, Myungjoo; Jeon, Junkee; Han, Heejae; Lee, Somin

    2017-06-01

    A strangle is an important options strategy when the trader believes there will be a large movement in the underlying asset but is uncertain of the direction of the movement. In this paper, we derive an analytic formula for the price of American strangle options. American strangle options can be mathematically formulated as free boundary problems involving two early exercise boundaries. By using the Laplace-Carson Transform (LCT), we derive the nonlinear system of equations satisfied by the transformed values of the two free boundaries. We then solve this nonlinear system using Newton's method and finally obtain the free boundaries and option values using numerical Laplace inversion techniques. We also derive the Greeks for American strangle options, as well as the value of perpetual American strangle options. Furthermore, we present various graphs of the free boundaries and option values as the parameters change.
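
    The solve step described above (Newton's method applied to the coupled equations for the two transformed free boundaries) can be illustrated generically. The toy system `F` below is a hypothetical stand-in, since the paper's actual LCT-derived equations are not reproduced here:

```python
import numpy as np

def newton_system(F, J, x0, tol=1e-10, max_iter=50):
    """Newton's method for a nonlinear system F(x) = 0 with Jacobian J:
    repeatedly solve J(x) dx = -F(x) and update x until dx is tiny."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        dx = np.linalg.solve(J(x), -F(x))
        x += dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Toy 2-equation system standing in for the coupled free-boundary
# equations (illustrative only, not the paper's system).
F = lambda b: np.array([b[0]**2 + b[1]**2 - 4.0, b[0] * b[1] - 1.0])
J = lambda b: np.array([[2 * b[0], 2 * b[1]], [b[1], b[0]]])

root = newton_system(F, J, [2.0, 0.3])  # converges quadratically near a root
```

    In the paper's setting, the converged boundary values would then be fed to a numerical Laplace inversion routine to recover the time-domain exercise boundaries and option prices.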

  2. Using i2b2 to Bootstrap Rural Health Analytics and Learning Networks.

    Science.gov (United States)

    Harris, Daniel R; Baus, Adam D; Harper, Tamela J; Jarrett, Traci D; Pollard, Cecil R; Talbert, Jeffery C

    2016-08-01

    We demonstrate that the open-source i2b2 (Informatics for Integrating Biology and the Bedside) data model can be used to bootstrap rural health analytics and learning networks. These networks promote communication and research initiatives by providing the infrastructure necessary for sharing data and insights across a group of healthcare and research partners. Data integration remains a crucial challenge in connecting rural healthcare sites with a common data sharing and learning network due to the lack of interoperability and standards within electronic health records. The i2b2 data model acts as a point of convergence for disparate data from multiple healthcare sites. A consistent and natural data model for healthcare data is essential for overcoming integration issues, but challenges such as those caused by weak data standardization must still be addressed. We describe our experience in the context of building the West Virginia/Kentucky Health Analytics and Learning Network, a collaborative, multi-state effort connecting rural healthcare sites.

  3. Analytic Moufang-transformations

    International Nuclear Information System (INIS)

    Paal, Eh.N.

    1988-01-01

    This paper is intended as an introduction to the concept of an analytic birepresentation of an analytic Moufang loop. To describe the deviation of (S,T) from associativity, the associators of (S,T) are defined, and certain constraints on them, called the minimality conditions of (S,T), are established.

  4. PROMISING ACCESSIONS OF CHAENOMELES AND THEIR USE IN THE FUNCTIONAL FOOD

    Directory of Open Access Journals (Sweden)

    V. N. Sorokopudov

    2017-01-01

    A complex analysis is crucial for obtaining new resistant varieties and developing recommendations for the use of the fruit of Chaenomeles. The task of this study was to assess the productivity and fruit quality of selected forms of Chaenomeles in Central Russia, determining the feasibility of a no-waste technology for fruit processing and the appropriateness of using the fruit in functional food products. The studies were conducted in 2012-2016 in the Botanical Garden of the National Research University "BelGU" (Belgorod), at FGBNU VSTISP, and at the GBS N.V. Tsitsina. The materials for the study were six selected forms of Chaenomeles obtained from free pollination of the 'Calif' variety, which was used as a control. The study was carried out according to the generally accepted methodology of varietal studies, along with the authors' own methodical developments. A sufficiently high nutritional and biological value of the Chaenomeles fruit was observed; the mineral, carbohydrate, and vitamin content of whole fruits exceeds that of juice pressed from the pulp. The results of the studies allow us to conclude that it is advisable to organize a no-waste technology for processing the fruit of Chaenomeles, which can serve as one of the components for the enrichment of food products. Thus, a comprehensive assessment of the biological properties and productivity of the breeding forms of Chaenomeles has shown that they exceed the parent variety in stability and are regarded as a promising vitamin-rich fruit crop that can be used in various processing methods, as part of functional and therapeutic-prophylactic nutrition, and especially for obtaining natural low-calorie foods.

  5. Decision support for selecting exportable nuclear technology using the analytic hierarchy process: A Korean case

    International Nuclear Information System (INIS)

    Lee, Deok Joo; Hwang, Jooho

    2010-01-01

    The Korean government plans to increase strategically focused R and D investment in some promising nuclear technology areas to create technology export opportunities in the global nuclear market. The purpose of this paper is to present a decision support process for selecting promising nuclear technology from the perspective of exportability by using the AHP, based on extensive data gathered from nuclear experts in Korea. In this study, the decision criteria for evaluating the export competitiveness of nuclear technologies were determined, and a hierarchical structure for the decision-making process was systematically developed. Subsequently, the relative weights of the decision criteria were derived using the AHP methodology, and the export competitiveness of the nuclear technology alternatives was quantified to prioritize them. We discuss the implications of our results with a view toward national nuclear technology policy.

  6. Measuring myokines with cardiovascular functions: pre-analytical variables affecting the analytical output.

    Science.gov (United States)

    Lombardi, Giovanni; Sansoni, Veronica; Banfi, Giuseppe

    2017-08-01

    In the last few years, a growing number of molecules have been associated with an endocrine function of skeletal muscle. Circulating myokine levels, in turn, have been associated with several pathophysiological conditions, including cardiovascular ones. However, data from different studies are often not completely comparable, or even discordant. This is due, at least in part, to the whole set of circumstances related to the preparation of the patient prior to blood sampling, the blood sampling procedure, and sample processing and/or storage. This entire process constitutes the pre-analytical phase. The importance of the pre-analytical phase is often overlooked; however, in routine diagnostics, about 70% of errors occur in this phase. Moreover, errors made during the pre-analytical phase carry over into the analytical phase and affect the final output. In research, for example, when samples are collected over a long time and by different laboratories, a standardized procedure for sample collection and a correct procedure for sample storage are acknowledged as essential. In this review, we discuss the pre-analytical variables potentially affecting the measurement of myokines with cardiovascular functions.

  7. Understanding, Evaluating, and Supporting Self-Regulated Learning Using Learning Analytics

    Science.gov (United States)

    Roll, Ido; Winne, Philip H.

    2015-01-01

    Self-regulated learning is an ongoing process rather than a single snapshot in time. Naturally, the field of learning analytics, focusing on interactions and learning trajectories, offers exciting opportunities for analyzing and supporting self-regulated learning. This special section highlights the current state of research at the intersection of…

  8. The Analytical Evaluation Of Three-Center Magnetic Multipole Moment Integrals By Using Slater Type Orbitals

    International Nuclear Information System (INIS)

    Oztekin, E.

    2010-01-01

    In this study, magnetic multipole moment integrals are calculated by using Slater-type orbitals (STOs), Fourier transforms, and translation formulas. First, the multipole moment operators that appear in the three-center magnetic multipole moment integrals are translated from center 0 to center b, so that the three-center magnetic multipole moment integrals are reduced to two-center ones. The analytical expressions obtained are then written in terms of overlap integrals. When the magnetic multipole moment integrals were calculated, matrix representations for the x-, y-, and z-components of the multipole moments were composed, and each component was calculated analytically. Consequently, the magnetic multipole moment integrals are also given in terms of the same and different screening parameters.

  9. Real-Time Analytics for the Healthcare Industry: Arrhythmia Detection.

    Science.gov (United States)

    Agneeswaran, Vijay Srinivas; Mukherjee, Joydeb; Gupta, Ashutosh; Tonpay, Pranay; Tiwari, Jayati; Agarwal, Nitin

    2013-09-01

    It is time for the healthcare industry to move from the era of "analyzing our health history" to the age of "managing the future of our health." In this article, we illustrate the importance of real-time analytics across the healthcare industry by providing a generic mechanism to reengineer traditional analytics expressed in the R programming language into Storm-based real-time analytics code. This is a powerful abstraction, since most data scientists use R to write their analytics and are not clear on how to make them work in real time on high-velocity data. Our paper focuses on applications relevant to a healthcare analytics scenario, specifically the importance of electrocardiogram (ECG) monitoring. A physician can use our framework to compare ECG reports by categorization and consequently detect arrhythmia. The framework reads the ECG signals and uses a machine-learning-based categorizer that runs within a Storm environment to compare different ECG signals. The paper also presents performance studies of the framework to illustrate the throughput and accuracy trade-off in real-time analytics.
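
    The general shape of such a streaming categorizer (consume a signal sample by sample, emit a label per sliding window) can be sketched without Storm. The window size, thresholds, labels, and function name below are illustrative only, not the paper's machine-learning model:

```python
from collections import deque

def rr_stream_classifier(rr_intervals_ms, window=5, lo=600, hi=1000):
    """Toy stand-in for a streaming ECG categorizer: for each sliding
    window of beat-to-beat (RR) intervals in milliseconds, emit 'normal'
    when the mean interval stays within [lo, hi], else 'flagged'.
    (Thresholds and labels are illustrative; a real arrhythmia detector
    would use a trained classifier, as in the paper's framework.)"""
    buf = deque(maxlen=window)
    labels = []
    for rr in rr_intervals_ms:
        buf.append(rr)          # new sample arrives on the stream
        if len(buf) == window:  # emit one label per full window
            mean_rr = sum(buf) / window
            labels.append('normal' if lo <= mean_rr <= hi else 'flagged')
    return labels

# A run of ~75 bpm beats followed by a sudden fast run (~150 bpm):
labels = rr_stream_classifier(
    [800, 810, 790, 805, 795, 400, 380, 390, 410, 395])
```

    In a Storm topology, the loop body would live in a bolt that receives one tuple per sample; the pure-Python version just makes the per-window logic visible.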

  10. Transition Analysis of Promising U.S. Future Fuel Cycles Using ORION

    International Nuclear Information System (INIS)

    Sunny, Eva E.; Worrall, Andrew; Peterson, Joshua L.; Powers, Jeffrey J.; Gehin, Jess C.; Gregg, Robert

    2015-01-01

    The US Department of Energy Office of Fuel Cycle Technologies performed an evaluation and screening (E&S) study of nuclear fuel cycle options to help prioritize future research and development decisions. Previous work for this E&S study focused on establishing equilibrium conditions for analysis examples of 40 nuclear fuel cycle evaluation groups (EGs) and evaluating their performance according to a set of 22 standardized metrics. Following the E&S study, additional studies are being conducted to assess transitioning from the current US fuel cycle to the future fuel cycle options identified by the E&S study as most promising. These studies help inform decisions on how to effectively achieve full transition, estimate the length of time needed to transition from the current fuel cycle, and evaluate the performance of nuclear systems and facilities in place during the transition. They also help identify any barriers to transition. The Oak Ridge National Laboratory (ORNL) Fuel Cycle Options Campaign team used ORION to analyze the transition pathway from the existing US nuclear fuel cycle, the once-through use of low-enriched-uranium (LEU) fuel in thermal-spectrum light water reactors (LWRs), to a new fuel cycle with continuous recycling of plutonium and uranium in sodium fast reactors (SFRs). This paper discusses the analysis of the transition from an LWR to an SFR fleet using ORION, highlights the role of lifetime extensions of existing LWRs in aiding the transition, and discusses how a slight delay in SFR deployment can actually reduce the time needed to reach an equilibrium fuel cycle.

  11. Big data analytics turning big data into big money

    CERN Document Server

    Ohlhorst, Frank J

    2012-01-01

    Unique insights to implement big data analytics and reap big returns to your bottom line Focusing on the business and financial value of big data analytics, respected technology journalist Frank J. Ohlhorst shares his insights on the newly emerging field of big data analytics in Big Data Analytics. This breakthrough book demonstrates the importance of analytics, defines the processes, highlights the tangible and intangible values and discusses how you can turn a business liability into actionable material that can be used to redefine markets, improve profits and identify new business opportuni

  12. Evaluation and selection of in-situ leaching mining method using analytic hierarchy process

    International Nuclear Information System (INIS)

    Zhao Heyong; Tan Kaixuan; Liu Huizhen

    2007-01-01

    According to the complicated conditions and main influencing factors of in-situ leaching mining, a model and process based on the analytic hierarchy process are established for the evaluation and selection of in-situ leaching mining methods. Taking a uranium mine in Xinjiang, China, as an example, the application of this model is presented. The results of the analyses and calculations indicate that acid leaching is the optimal option. (authors)

  13. The Radical Promise of Reformist Zeal: What Makes "Inquiry for Equity" Plausible?

    Science.gov (United States)

    Lashaw, Amanda

    2010-01-01

    Education reform movements often promise more than they deliver. Why are such promises plausible in light of seemingly perpetual education reform? Drawing on ethnographic fieldwork based in a nonprofit education reform organization, this article explores the appeal of popular notions about "using data to close the racial achievement…

  14. An Analysis of Machine- and Human-Analytics in Classification.

    Science.gov (United States)

    Tam, Gary K L; Kothari, Vivek; Chen, Min

    2017-01-01

    In this work, we present a study that traces the technical and cognitive processes in two visual analytics applications to a common theoretic model of soft knowledge that may be added into a visual analytics process for constructing a decision-tree model. Both case studies involved the development of classification models based on the "bag of features" approach. Both compared a visual analytics approach using parallel coordinates with a machine-learning approach using information theory. Both found that the visual analytics approach had some advantages over the machine learning approach, especially when sparse datasets were used as the ground truth. We examine various possible factors that may have contributed to such advantages, and collect empirical evidence for supporting the observation and reasoning of these factors. We propose an information-theoretic model as a common theoretic basis to explain the phenomena exhibited in these two case studies. Together we provide interconnected empirical and theoretical evidence to support the usefulness of visual analytics.
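
    The information-theoretic machine-learning approach contrasted above typically chooses decision-tree splits by information gain, i.e., the reduction in label entropy. That criterion can be computed in a few lines; the labels below form a hypothetical tree node, not data from the study:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, groups):
    """Entropy reduction from splitting `labels` into the partition
    `groups` - the quantity an information-theoretic tree builder
    maximizes when choosing the feature to split on at a node."""
    n = len(labels)
    return entropy(labels) - sum(len(g) / n * entropy(g) for g in groups)

# Hypothetical node with 4 positive / 4 negative examples: a perfect
# split recovers the full 1 bit of entropy; a useless split gains 0.
labels = ['+'] * 4 + ['-'] * 4
perfect = information_gain(labels, [['+'] * 4, ['-'] * 4])
useless = information_gain(labels, [['+'] * 2 + ['-'] * 2,
                                    ['+'] * 2 + ['-'] * 2])
```

    The visual analytics route replaces this automatic criterion with a human reading parallel coordinates, which is where the two case studies diverge on sparse ground truth.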

  15. Using Learning Analytics to Understand Scientific Modeling in the Classroom

    Directory of Open Access Journals (Sweden)

    David Quigley

    2017-11-01

    Scientific models represent ideas, processes, and phenomena by describing important components, characteristics, and interactions. Models are constructed across various scientific disciplines, such as the food web in biology, the water cycle in Earth science, or the structure of the solar system in astronomy. Models are central for scientists to understand phenomena, construct explanations, and communicate theories. Constructing and using models to explain scientific phenomena is also an essential practice in contemporary science classrooms. Our research explores new techniques for understanding scientific modeling and engagement with modeling practices. We work with students in secondary biology classrooms as they use a web-based software tool, EcoSurvey, to characterize organisms and their interrelationships found in their local ecosystem. We use learning analytics and machine learning techniques to answer the following questions: (1) How can we automatically measure the extent to which students' scientific models support complete explanations of phenomena? (2) How does the design of student modeling tools influence the complexity and completeness of students' models? (3) How do clickstreams reflect and differentiate student engagement with modeling practices? We analyzed EcoSurvey usage data collected from two different deployments with over 1,000 secondary students across a large urban school district. We observe large variations in the completeness and complexity of student models, and large variations in their iterative refinement processes. These differences reveal that certain key model features are highly predictive of other aspects of the model. We also observe large differences in student modeling practices across different classrooms and teachers. We can predict a student's teacher from the observed modeling practices with a high degree of accuracy, without significant tuning of the predictive model. These results highlight

  16. Quo vadis, analytical chemistry?

    Science.gov (United States)

    Valcárcel, Miguel

    2016-01-01

    This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists, because the future of our scientific discipline is in their hands. A description of not completely accurate overall conceptions of our discipline, both past and present, to be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, and the more accurate general and specific paradigms that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.

  17. The analyst's participation in the analytic process.

    Science.gov (United States)

    Levine, H B

    1994-08-01

    The analyst's moment-to-moment participation in the analytic process is inevitably and simultaneously determined by at least three sets of considerations. These are: (1) the application of proper analytic technique; (2) the analyst's personally-motivated responses to the patient and/or the analysis; (3) the analyst's use of him or herself to actualise, via fantasy, feeling or action, some aspect of the patient's conflicts, fantasies or internal object relationships. This formulation has relevance to our view of actualisation and enactment in the analytic process and to our understanding of a series of related issues that are fundamental to our theory of technique. These include the dialectical relationships that exist between insight and action, interpretation and suggestion, empathy and countertransference, and abstinence and gratification. In raising these issues, I do not seek to encourage or endorse wild analysis, the attempt to supply patients with 'corrective emotional experiences' or a rationalisation for acting out one's countertransferences. Rather, it is my hope that if we can better appreciate and describe these important dimensions of the analytic encounter, we can be better prepared to recognise, understand and interpret the continual streams of actualisation and enactment that are embedded in the analytic process. A deeper appreciation of the nature of the analyst's participation in the analytic process and the dimensions of the analytic process to which that participation gives rise may offer us a limited, although important, safeguard against analytic impasse.

  18. Analytic manifolds in uniform algebras

    International Nuclear Information System (INIS)

    Tonev, T.V.

    1988-12-01

    Here we extend Bear-Hile's result concerning a version of the famous Bishop theorem for one-dimensional analytic structures in two directions: to n-dimensional complex analytic manifolds, n>1, and to generalized analytic manifolds. 14 refs

  19. Framework for pedagogical learning analytics

    OpenAIRE

    Heilala, Ville

    2018-01-01

    Learning analytics is an emergent technological practice and a multidisciplinary scientific discipline whose goal is to facilitate effective learning and knowledge of learning. In this design science research, I combine the knowledge discovery process, the concept of pedagogical knowledge, the ethics of learning analytics, and microservice architecture. The result is a framework for pedagogical learning analytics. The framework is applied and evaluated in the context of agency analytics. The framework ...

  20. Comparison of Left Ventricular Hypertrophy by Electrocardiography and Echocardiography in Children Using Analytics Tool.

    Science.gov (United States)

    Tague, Lauren; Wiggs, Justin; Li, Qianxi; McCarter, Robert; Sherwin, Elizabeth; Weinberg, Jacqueline; Sable, Craig

    2018-05-17

    Left ventricular hypertrophy (LVH) is a common finding on pediatric electrocardiography (ECG), leading to many referrals for echocardiography (echo). This study utilizes a novel analytics tool that combines ECG and echo databases to evaluate ECG as a screening tool for LVH. A SQL Server 2012 data warehouse incorporated the ECG and echo databases for all patients from a single institution from 2006 to 2016. Customized queries identified patients 0-18 years old with LVH on ECG and an echo performed within 24 h. Using data visualization (Tableau) and analytic (Stata 14) software, ECG and echo findings were compared. Of 437,699 encounters, 4637 met inclusion criteria. ECG had high sensitivity (≥ 90%) but poor specificity (43%) and low positive predictive value (< 20%) for echo abnormalities. ECG performed only 11-22% better than chance (chance: AROC = 0.50). 83% of subjects with LVH on ECG had normal left ventricle (LV) structure and size on echo. African-Americans with LVH were least likely to have an abnormal echo. There was a low correlation between the V6 R-wave amplitude on ECG and the echo-derived Z score of LV diastolic diameter (r = 0.14) and LV mass index (r = 0.24). The data analytics client was able to mine a database of ECG and echo reports, comparing LVH by ECG with LV measurements and qualitative findings by echo, and identified an abnormal LV by echo in only 17% of cases with LVH on ECG. This novel tool is useful for rapid data mining for both clinical and research endeavors.
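
    The sensitivity, specificity, and positive-predictive-value figures reported above all derive from a 2x2 confusion table of ECG-flagged LVH against echo-confirmed abnormality. A minimal sketch of that calculation follows; the counts are hypothetical, chosen only to reproduce the reported pattern (high sensitivity, poor specificity, low PPV), not the study's data.

```python
# Screening-test metrics from a 2x2 table. Hypothetical counts only.

def screening_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, and positive predictive value."""
    sensitivity = tp / (tp + fn)   # fraction of true abnormals flagged
    specificity = tn / (tn + fp)   # fraction of true normals cleared
    ppv = tp / (tp + fp)           # fraction of flags that are real
    return sensitivity, specificity, ppv

# Hypothetical: 100 truly abnormal and 900 truly normal hearts.
sens, spec, ppv = screening_metrics(tp=90, fp=510, fn=10, tn=390)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} PPV={ppv:.2f}")
```

    With a low disease prevalence, even 90% sensitivity yields a PPV well under 20%, which is exactly why so many ECG-flagged patients show a normal echo.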

  1. Repurification and characterization of extractant mixture (isobutyl acetate-methyl isobutyl ketone) used in spectrophotometric analytical methods

    International Nuclear Information System (INIS)

    Al-Merey, R.; Al-Hameish, M.

    2001-01-01

    The isobutyl acetate (IBA)-methyl isobutyl ketone (MIBK) mixture used in analytical laboratories was re-purified by fractional distillation. The used mixture was first washed with 0.5 M Na2CO3 solution to remove inorganic substances. Fractional distillation in the range 111-114 °C gave an azeotropic mixture consisting of 70% IBA, 20% MIBK and 10% isobutanol (IBL). Gas chromatography (GC) analysis showed that the isobutanol content had increased by about 10% at the expense of IBA. This study suggests that MIBK could be determined spectrophotometrically in the organic mixture. The analytical function of the re-purified mixture is found to be better than that of the unused mixture. Finally, the distillation recovery was 93%. (author)

  2. An analytical solution for improved HIFU SAR estimation

    International Nuclear Information System (INIS)

    Dillon, C R; Vyas, U; Christensen, D A; Roemer, R B; Payne, A

    2012-01-01

    Accurate determination of the specific absorption rates (SARs) present during high intensity focused ultrasound (HIFU) experiments and treatments provides a solid physical basis for scientific comparison of results among HIFU studies and is necessary to validate and improve SAR predictive software, which will improve patient treatment planning, control and evaluation. This study develops and tests an analytical solution that significantly improves the accuracy of SAR values obtained from HIFU temperature data. SAR estimates are obtained by fitting the analytical temperature solution for a one-dimensional radial Gaussian heating pattern to the temperature versus time data following a step in applied power and evaluating the initial slope of the analytical solution. The analytical method is evaluated in multiple parametric simulations for which it consistently (except at high perfusions) yields maximum errors of less than 10% at the center of the focal zone compared with errors up to 90% and 55% for the commonly used linear method and an exponential method, respectively. For high perfusion, an extension of the analytical method estimates SAR with less than 10% error. The analytical method is validated experimentally by showing that the temperature elevations predicted using the analytical method's SAR values determined for the entire 3D focal region agree well with the experimental temperature elevations in a HIFU-heated tissue-mimicking phantom. (paper)
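
    The "linear method" that the abstract's analytical approach improves upon rests on a standard energy balance: immediately after a step in applied power, conduction and perfusion have had little effect, so rho*c*dT/dt ≈ rho*SAR, i.e., SAR (W/kg) ≈ c·(dT/dt) at t = 0. The sketch below shows only that simple initial-slope estimate on synthetic data; it does not reproduce the paper's Gaussian-beam analytical fit, and the specific-heat value and sample counts are illustrative assumptions.

```python
import numpy as np

def sar_linear(t, T, c=3600.0, n_fit=5):
    """Estimate SAR (W/kg) from temperature T (K) vs. time t (s) using the
    initial slope of the first n_fit samples; c is specific heat in J/(kg*K)."""
    slope = np.polyfit(t[:n_fit], T[:n_fit], 1)[0]  # dT/dt at early times
    return c * slope

# Synthetic heating curve: true initial slope 0.05 K/s with mild curvature,
# mimicking the early bias that the analytical method is designed to remove.
t = np.linspace(0.0, 2.0, 21)
T = 37.0 + 0.05 * t - 0.002 * t**2
print(f"SAR ~= {sar_linear(t, T):.1f} W/kg")  # near 3600*0.05 = 180 W/kg
```

    Even this mild curvature biases the slope fit slightly low, which illustrates why fitting the full analytical temperature solution gives more accurate SAR values than the raw initial slope.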

  3. Chinese Culture, Homosexuality Stigma, Social Support and Condom Use: A Path Analytic Model.

    Science.gov (United States)

    Liu, Hongjie; Feng, Tiejian; Ha, Toan; Liu, Hui; Cai, Yumao; Liu, Xiaoli; Li, Jian

    2011-01-01

    PURPOSE: The objective of this study was to examine the interrelationships among individualism, collectivism, homosexuality-related stigma, social support, and condom use among Chinese homosexual men. METHODS: A cross-sectional study using the respondent-driven sampling approach was conducted among 351 participants in Shenzhen, China. Path analytic modeling was used to analyze the interrelationships. RESULTS: The results of path analytic modeling document the following statistically significant associations with regard to homosexuality: (1) higher levels of vertical collectivism were associated with higher levels of public stigma [β (standardized coefficient) = 0.12] and self stigma (β = 0.12); (2) higher levels of vertical individualism were associated with higher levels of self stigma (β = 0.18); (3) higher levels of horizontal individualism were associated with higher levels of public stigma (β = 0.12); (4) higher levels of self stigma were associated with higher levels of social support from sexual partners (β = 0.12); and (5) lower levels of public stigma were associated with consistent condom use (β = -0.19). CONCLUSIONS: The findings enhance our understanding of how individualist and collectivist cultures influence the development of homosexuality-related stigma, which in turn may affect individuals' decisions to engage in HIV-protective practices and seek social support. Accordingly, the development of HIV interventions for homosexual men in China should take the characteristics of Chinese culture into consideration.
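
    The standardized coefficients (β) reported above are, for any single equation of a path model, ordinary least-squares coefficients computed after z-scoring each variable. The sketch below illustrates that calculation on synthetic data; it is not the study's multi-equation model, and all variable names and effect sizes are hypothetical.

```python
import numpy as np

def standardized_betas(X, y):
    """Standardized (beta) coefficients of y regressed on the columns of X."""
    Xz = (X - X.mean(axis=0)) / X.std(axis=0)   # z-score each predictor
    yz = (y - y.mean()) / y.std()               # z-score the outcome
    beta, *_ = np.linalg.lstsq(Xz, yz, rcond=None)
    return beta

# Hypothetical single path: collectivism -> stigma with a small true effect,
# simulated for the same sample size (n = 351) as the study.
rng = np.random.default_rng(0)
collectivism = rng.normal(size=351)
stigma = 0.12 * collectivism + rng.normal(scale=1.0, size=351)
beta = standardized_betas(collectivism[:, None], stigma)
print(f"beta = {beta[0]:.2f}")  # hypothetical path coefficient
```

    With one predictor, the standardized beta equals the Pearson correlation, which is why the reported values all lie between -1 and 1.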

  5. Advanced web metrics with Google Analytics

    CERN Document Server

    Clifton, Brian

    2012-01-01

    Get the latest information about using the #1 web analytics tool from this fully updated guide. Google Analytics is the free tool used by millions of web site owners to assess the effectiveness of their efforts. Its revised interface and new features will offer even more ways to increase the value of your web site, and this book will teach you how to use each one to best advantage. Featuring new content based on reader and client requests, the book helps you implement new methods and concepts, track social and mobile visitors, use the new multichannel funnel reporting features, understand which

  6. Design of laser-generated shockwave experiments. An approach using analytic models

    International Nuclear Information System (INIS)

    Lee, Y.T.; Trainor, R.J.

    1980-01-01

    Two of the target-physics phenomena which must be understood before a clean experiment can be confidently performed are preheating due to suprathermal electrons and shock decay due to a shock-rarefaction interaction. Simple analytic models are described for these two processes and the predictions of these models are compared with those of the LASNEX fluid physics code. We have approached this work not with the view of surpassing or even approaching the reliability of the code calculations, but rather with the aim of providing simple models which may be used for quick parameter-sensitivity evaluations, while providing physical insight into the problems

  7. Text Stream Trend Analysis using Multiscale Visual Analytics with Applications to Social Media Systems

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; Beaver, Justin M [ORNL; BogenII, Paul L. [Google Inc.; Drouhard, Margaret MEG G [ORNL; Pyle, Joshua M [ORNL

    2015-01-01

    In this paper, we introduce a new visual analytics system, called Matisse, that allows exploration of global trends in textual information streams with specific application to social media platforms. Despite the potential for real-time situational awareness using these services, interactive analysis of such semi-structured textual information is a challenge due to the high-throughput and high-velocity properties. Matisse addresses these challenges through the following contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) inferential temporal, geospatial, and term-frequency visualizations, and (4) a flexible drill-down interaction scheme that progresses from macroscale to microscale views. In addition to describing these contributions, our work-in-progress paper concludes with a practical case study focused on the analysis of Twitter 1% sample stream information captured during the week of the Boston Marathon bombings.

  8. Analytical bounds on SET charge sensitivity for qubit readout in a solid-state quantum computer

    International Nuclear Information System (INIS)

    Green, F.; Buehler, T.M.; Brenner, R.; Hamilton, A.R.; Dzurak, A.S.; Clark, R.G.

    2002-01-01

    Full text: Quantum computing promises processing powers orders of magnitude beyond what is possible in conventional silicon-based computers. It harnesses the laws of quantum mechanics directly, exploiting the built-in potential of a wave function for massively parallel information processing. Highly ordered and scalable arrays of single donor atoms (quantum bits, or qubits), embedded in Si, are especially promising; they are a very natural fit to the existing, highly sophisticated Si industry. The success of Si-based quantum computing depends on precisely initializing the quantum state of each qubit, and on precisely reading out its final form. In the Kane architecture the qubit states are read out by detecting the spatial distribution of the donor's electron cloud using a sensitive electrometer. The single-electron transistor (SET) is an attractive candidate readout device for this, since the capacitive, or charging, energy of a SET's metallic central island is exquisitely sensitive to its electronic environment. Use of SETs as high-performance electrometers is therefore a key technology for data transfer in a solid-state quantum computer. We present an efficient analytical method to obtain bounds on the charge sensitivity of a SET. Our classic Green-function analysis provides reliable estimates of SET sensitivity for optimizing the design of the readout hardware. Typical calculations, and their physical meaning, are discussed and compared with measured SET-response data.

  9. Use of nuclear and related analytical techniques in environmental research as exemplified by selected air pollution studies

    International Nuclear Information System (INIS)

    Smodis, B.; Jacimovic, R.; Jeran, Z.; Stropnik, B.; Svetina, M.

    2000-01-01

    Among nuclear and nuclear-related analytical techniques, neutron activation analysis and X-ray fluorescence spectrometry have proved particularly useful for environmental studies owing to their nondestructive character and multielement capability. This paper emphasizes their importance among other multielement analytical methods by discussing the specific role conferred by their physical basis, quite different from that of destructive non-nuclear methods, and by summarizing results obtained in several studies related to air pollution research, including analyses of airborne particulate matter, water samples, lichens and mosses. (author)

  10. An Exploratory Study to Assess Analytical and Logical Thinking Skills of the Software Practitioners using a Gamification Perspective

    Directory of Open Access Journals (Sweden)

    Şahin KAYALI

    2016-12-01

    Full Text Available The link between analytical and logical thinking skills and the success of software practitioners has attracted increasing attention in the last decade. Several studies report that the ability to think logically, which reflects strong reasoning, is a requirement for improving software development skills. Additionally, analytical thinking is a vital part of software development, for example when dividing a task into elemental parts with respect to basic rules and principles. Using the basic essence of gamification, this study proposes a mobile testing platform for assessing the analytical and logical thinking skills of software practitioners as well as computer engineering students. The assessment questions were taken from the literature and transformed into a gamified tool based on the software requirements. A focus group study was conducted to capture the requirements. Using the Delphi method, these requirements were discussed by a group of experts to reach a multidisciplinary understanding, where a level of moderate agreement was achieved. In light of these findings, an assessment tool was developed and tested on both software practitioners from industry and senior computer engineering students. Our results suggest that individuals who exhibit skills in analytical and logical thinking are also more inclined to be successful in software development.

  11. Experimental study and analytical model of deformation of magnetostrictive films as applied to mirrors for x-ray space telescopes.

    Science.gov (United States)

    Wang, Xiaoli; Knapp, Peter; Vaynman, S; Graham, M E; Cao, Jian; Ulmer, M P

    2014-09-20

    The desire to continuously gain new knowledge in astronomy has pushed the frontier of engineering methods to deliver lighter, thinner, higher-quality mirrors at an affordable cost for use in an x-ray observatory. To address these needs, we have been investigating the application of magnetic smart materials (MSMs) deposited as a thin film on mirror substrates. MSMs have some interesting properties that make their application to mirror substrates a promising route to the next generation of x-ray telescopes. Due to their ability to hold a shape under an impressed permanent magnetic field, MSMs have the potential to become the method used to make lightweight, affordable x-ray telescope mirrors. This paper presents the experimental setup for measuring the deformation of magnetostrictive bimorph specimens under an applied magnetic field, and the analytical and numerical analysis of that deformation. As a first step in the development of tools to predict deflections, we deposited Terfenol-D on glass substrates. We then made measurements that were compared with the results of the analytical and numerical analysis. The surface profiles of thin-film specimens were measured under an external magnetic field with white light interferometry (WLI). The analytical model provides good predictions of film deformation behavior under various magnetic field strengths. This work establishes a solid foundation for further research to analyze the full three-dimensional deformation behavior of magnetostrictive thin films.

  12. Analytical applications of ion exchangers

    CERN Document Server

    Inczédy, J

    1966-01-01

    Analytical Applications of Ion Exchangers presents the laboratory use of ion-exchange resins. This book discusses the development in the analytical application of ion exchangers. Organized into 10 chapters, this book begins with an overview of the history and significance of ion exchangers for technical purposes. This text then describes the properties of ion exchangers, which are large molecular water-insoluble polyelectrolytes having a cross-linked structure that contains ionic groups. Other chapters consider the theories concerning the operation of ion-exchange resins and investigate th

  13. Basic emotion processing and the adolescent brain: Task demands, analytic approaches, and trajectories of changes

    Directory of Open Access Journals (Sweden)

    Larissa B. Del Piero

    2016-06-01

    Full Text Available Early neuroimaging studies suggested that adolescents show initial development in brain regions linked with emotional reactivity, but slower development in brain structures linked with emotion regulation. However, the increased sophistication of adolescent brain research has made this picture more complex. This review examines functional neuroimaging studies that test for differences in basic emotion processing (reactivity and regulation) between adolescents and either children or adults. We delineated the different emotional processing demands across the experimental paradigms in the reviewed studies to synthesize their diverse results. The methods for assessing change (i.e., analytical approach) and cohort characteristics (e.g., age range) were also explored as potential factors influencing study results. Few unifying dimensions were found that successfully distill the results of the reviewed studies. However, this review highlights the potential impact of subtle methodological and analytic differences between studies, the need for standardized and theory-driven experimental paradigms, and the necessity of analytic approaches that can adequately test the trajectories of developmental change that have recently been proposed. Recommendations for future research highlight connectivity analyses and non-linear developmental trajectories, which appear to be promising approaches for measuring change across adolescence. Recommendations are also made for evaluating gender and biological markers of development beyond chronological age.

  14. Prediction-Oriented Marker Selection (PROMISE): With Application to High-Dimensional Regression.

    Science.gov (United States)

    Kim, Soyeon; Baladandayuthapani, Veerabhadran; Lee, J Jack

    2017-06-01

    In personalized medicine, biomarkers are used to select therapies with the highest likelihood of success based on an individual patient's biomarker/genomic profile. Two goals are to choose important biomarkers that accurately predict treatment outcomes and to cull unimportant biomarkers to reduce the cost of biological and clinical verifications. These goals are challenging due to the high dimensionality of genomic data. Variable selection methods based on penalized regression (e.g., the lasso and elastic net) have yielded promising results. However, selecting the right amount of penalization is critical to simultaneously achieving these two goals. Standard approaches based on cross-validation (CV) typically provide high prediction accuracy with high true positive rates but at the cost of too many false positives. Alternatively, stability selection (SS) controls the number of false positives, but at the cost of yielding too few true positives. To circumvent these issues, we propose prediction-oriented marker selection (PROMISE), which combines SS with CV to conflate the advantages of both methods. Our application of PROMISE with the lasso and elastic net in data analysis shows that, compared to CV, PROMISE produces sparse solutions, few false positives, and small type I + type II error, and maintains good prediction accuracy, with a marginal decrease in the true positive rates. Compared to SS, PROMISE offers better prediction accuracy and true positive rates. In summary, PROMISE can be applied in many fields to select regularization parameters when the goals are to minimize false positives and maximize prediction accuracy.
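
    The stability-selection (SS) component that PROMISE combines with cross-validation works by repeatedly subsampling the data, running a selector on each subsample, and keeping only markers whose selection frequency exceeds a threshold, which is what controls false positives. The sketch below illustrates that subsampling idea on synthetic data; for brevity it uses univariate correlation screening as the base selector instead of the lasso or elastic net, and all sample sizes and thresholds are illustrative, not the paper's settings.

```python
import numpy as np

def stability_select(X, y, n_sub=50, frac=0.5, top_k=5, thresh=0.6, seed=0):
    """Return indices of markers selected in >= thresh of n_sub subsamples."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    counts = np.zeros(p)
    for _ in range(n_sub):
        idx = rng.choice(n, size=int(frac * n), replace=False)
        Xs, ys = X[idx], y[idx] - y[idx].mean()
        score = np.abs((Xs - Xs.mean(0)).T @ ys)  # |association| per marker
        counts[np.argsort(score)[-top_k:]] += 1   # base selector: top_k markers
    return np.flatnonzero(counts / n_sub >= thresh)

# 200 samples, 100 candidate markers; only markers 0-2 affect the outcome.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 100))
y = X[:, 0] + X[:, 1] - X[:, 2] + rng.normal(scale=0.5, size=200)
print(stability_select(X, y))  # the stable set should include markers 0, 1, 2
```

    Noise markers occasionally enter the per-subsample top-k but rarely do so consistently, so the frequency threshold culls them; that is the false-positive control PROMISE then balances against CV's prediction accuracy.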

  15. An analytical study of various telecommunication networks using Markov models

    International Nuclear Information System (INIS)

    Ramakrishnan, M; Jayamani, E; Ezhumalai, P

    2015-01-01

    The main aim of this paper is to examine issues relating to the performance of various telecommunication networks, applying queuing theory for better design and improved efficiency. First, an analytical study of queues quantifies the phenomenon of waiting lines using representative measures of performance, such as average queue length (the average number of customers in the queue), average waiting time in the queue (the average time spent waiting), and average facility utilization (the proportion of time the service facility is in use). Second, using a Matlab simulator, we summarize the findings of the investigations and describe a methodology to (a) compare the waiting time and average number of messages in the queue for M/M/1 and M/M/2 queues, (b) compare the performance of M/M/1 and M/D/1 queues, and (c) study the effect of increasing the number of servers on the blocking probability in the M/M/k/k queue model. (paper)
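
    The comparisons described above all have textbook closed forms, which can be checked without a simulator: the M/M/1 mean wait in queue, the M/D/1 wait (exactly half the M/M/1 value, by the Pollaczek-Khinchine formula), and the M/M/k/k blocking probability via the Erlang B recursion. The sketch below is a minimal illustration of those formulas, not the paper's Matlab methodology; the rates used are arbitrary.

```python
def wq_mm1(lam, mu):
    """Mean waiting time in queue for M/M/1 (arrival rate lam, service rate mu)."""
    rho = lam / mu
    return rho / (mu - lam)

def wq_md1(lam, mu):
    """M/D/1 mean wait: deterministic service halves the M/M/1 queueing delay."""
    return wq_mm1(lam, mu) / 2.0

def erlang_b(k, a):
    """Blocking probability of M/M/k/k with offered load a = lam/mu (Erlang B)."""
    b = 1.0
    for i in range(1, k + 1):
        b = a * b / (i + a * b)   # stable recursion B(i) from B(i-1)
    return b

print(wq_mm1(0.8, 1.0))   # 4.0 time units at 80% utilization
print(wq_md1(0.8, 1.0))   # 2.0
print(erlang_b(5, 3.0))   # ~0.110; falls rapidly as servers k increase
```

    The Erlang B recursion also makes the paper's third question concrete: with offered load a fixed, each added server multiplies the previous blocking probability by a factor a/(i + a*B), which shrinks quickly once k exceeds a.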

  16. Measurement of company effectiveness using analytic network process method

    Directory of Open Access Journals (Sweden)

    Goran Janjić

    2017-07-01

    Full Text Available The sustainable development of an organisation is monitored through the organisation's performance, which presupposes that all stakeholders' requirements have been incorporated into its strategy. The strategic management concept enables organisations to monitor and evaluate their effectiveness and efficiency by monitoring the implementation of their strategic goals. In the process of monitoring and measuring effectiveness, an organisation can draw on multiple-criteria decision-making methods. This study uses the analytic network process (ANP) method to define the weight factors of the mutual influences of all the important elements of an organisation's strategy. The calculation of an organisation's effectiveness is based on these weight factors and the degree of fulfilment of the target values of the strategic map measures. New business conditions change the relative importance of particular elements of an organisation's business for competitive advantage, and the market places increasing emphasis on non-material resources when the organisation's most important measures are selected.
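
    In ANP, as in the simpler AHP, the weight factors come from pairwise-comparison matrices: the local priority vector is the normalized principal eigenvector of a reciprocal judgement matrix (ANP then assembles these into a supermatrix over the network's dependencies, which is not shown here). A minimal sketch of the eigenvector step follows; the comparison matrix is hypothetical.

```python
import numpy as np

def priority_vector(A, iters=100):
    """Principal eigenvector of comparison matrix A, normalized to sum to 1."""
    w = np.ones(A.shape[0]) / A.shape[0]
    for _ in range(iters):   # power iteration converges for positive matrices
        w = A @ w
        w /= w.sum()
    return w

# Hypothetical judgements: element 1 is 3x as important as element 2,
# 5x as important as element 3, etc. (reciprocal entries below the diagonal).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w = priority_vector(A)
print(np.round(w, 3))   # weight factors, largest for the dominant element
```

    These weights, multiplied by each measure's degree of goal fulfilment and summed, give the kind of aggregate effectiveness score the study describes.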

  17. Analytical models for low-power rectenna design

    NARCIS (Netherlands)

    Akkermans, J.A.G.; Beurden, van M.C.; Doodeman, G.J.N.; Visser, H.J.

    2005-01-01

    The design of a low-cost rectenna for low-power applications is presented. The rectenna is designed with the use of analytical models and closed-form analytical expressions. This allows for a fast design of the rectenna system. To acquire a small-area rectenna, a layered design is proposed.

  18. Analytical Solution of General Bagley-Torvik Equation

    OpenAIRE

    William Labecca; Osvaldo Guimarães; José Roberto C. Piqueira

    2015-01-01

    The Bagley-Torvik equation appears in viscoelasticity problems, where fractional derivatives seem to play an important role in fitting empirical data. There are several works treating this equation using numerical methods and analytic formulations. However, the analytical solutions presented in the literature consider particular cases of boundary and initial conditions, with the inhomogeneous term often expressed in polynomial form. Here, by using Laplace transform methodology, the general inhomoge...
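
    The Laplace-transform route mentioned above can be summarized for the general equation A y''(t) + B D^{3/2} y(t) + C y(t) = f(t) with zero initial conditions; the series inversion quoted below follows Podlubny's classic treatment and is given here as a sketch, with A, B, C as assumed constant coefficients.

```latex
% Transforming A y'' + B D^{3/2} y + C y = f with y(0) = y'(0) = 0:
\[
  \bigl(A s^{2} + B s^{3/2} + C\bigr)\, Y(s) = F(s)
  \quad\Longrightarrow\quad
  Y(s) = \frac{F(s)}{A s^{2} + B s^{3/2} + C},
\]
% so the solution is a convolution with a Green's function,
\[
  y(t) = \int_{0}^{t} G(t-\tau)\, f(\tau)\, d\tau, \qquad
  G(t) = \frac{1}{A} \sum_{k=0}^{\infty} \frac{(-1)^{k}}{k!}
         \left(\frac{C}{A}\right)^{k} t^{2k+1}
         E_{1/2,\, 2+3k/2}^{(k)}\!\left(-\frac{B}{A}\sqrt{t}\right),
\]
% where E_{alpha,beta} is the two-parameter Mittag-Leffler function and
% E^{(k)} denotes its k-th derivative with respect to the argument.
```

    The restriction noted in the abstract is visible here: once f(t) is no longer polynomial, evaluating this convolution in closed form becomes the hard part.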

  19. Automation of analytical systems in power cycles

    International Nuclear Information System (INIS)

    Staub Lukas

    2008-01-01

    'Automation' is a widely used term in instrumentation and is often applied to signal exchange, PLC and SCADA systems. Common use, however, does not necessarily describe autonomous operation of analytical devices. We define an automated analytical system as a black box with an input (sample) and an output (measured value). In addition, we need dedicated status lines for assessing the validity of the input to our black box and of the output passed to subsequent systems. We will discuss input parameters, automated analytical processes and output parameters. Further consideration will be given to signal exchange and integration into the operating routine of a power plant. Local control loops (chemical dosing) and the automation of sampling systems are not discussed here. (author)
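
    The "black box with status lines" definition above can be sketched as a minimal interface: a sample goes in, a measured value comes out, and two status flags report whether the input was acceptable and whether the output may be used downstream. All names, the pH channel, and the range limits below are illustrative assumptions, not part of the source.

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    value: float        # measured value (the black box's output)
    input_valid: bool   # status line: was the sample acceptable?
    output_valid: bool  # status line: may downstream systems use the value?

class Analyzer:
    """Automated analytical system viewed as a black box with status lines."""

    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi   # plausible range for the measurand

    def measure(self, sample_ok: bool, raw: float) -> Measurement:
        if not sample_ok:           # e.g. sample flow lost, air in the line
            return Measurement(float("nan"), False, False)
        in_range = self.lo <= raw <= self.hi
        return Measurement(raw, True, in_range)

ph = Analyzer(lo=0.0, hi=14.0)      # hypothetical pH channel
print(ph.measure(True, 9.2))        # valid input, valid output
print(ph.measure(False, 9.2))       # invalid sample: both status lines low
```

    Exposing the two validity flags separately is the point of the definition: a plant control system can distinguish "the analyzer has no trustworthy sample" from "the analyzer measured an implausible value".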

  20. The Use of the Analytic Hierarchy Process to Aid Decision Making in Acquired Equinovarus Deformity

    NARCIS (Netherlands)

    van Til, Janine Astrid; Renzenbrink, G.J.; Dolan, J.G.; IJzerman, Maarten Joost

    2008-01-01

    Objective: To increase the transparency of decision making about treatment in patients with equinovarus deformity poststroke. - Design: The analytic hierarchy process (AHP) was used as a structured methodology to study the subjective rationale behind choice of treatment. - Setting: An 8-hour meeting