WorldWideScience

Sample records for promises analytical usefulness

  1. Big data analytics in healthcare: promise and potential.

    Science.gov (United States)

    Raghupathi, Wullianallur; Raghupathi, Viju

    2014-01-01

    To describe the promise and potential of big data analytics in healthcare. The paper describes the nascent field of big data analytics in healthcare, discusses the benefits, outlines an architectural framework and methodology, describes examples reported in the literature, briefly discusses the challenges, and offers conclusions. The paper provides a broad overview of big data analytics for healthcare researchers and practitioners. Big data analytics in healthcare is evolving into a promising field for providing insight from very large data sets and improving outcomes while reducing costs. Its potential is great; however, there remain challenges to overcome.

  2. Clinical laboratory analytics: Challenges and promise for an emerging discipline

    Directory of Open Access Journals (Sweden)

    Brian H Shirts

    2015-01-01

    Full Text Available The clinical laboratory is a major source of health care data. Increasingly these data are being integrated with other data to inform health system-wide actions meant to improve diagnostic test utilization, service efficiency, and "meaningful use." The Academy of Clinical Laboratory Physicians and Scientists hosted a satellite meeting on clinical laboratory analytics in conjunction with their annual meeting on May 29, 2014 in San Francisco. There were 80 registrants for the clinical laboratory analytics meeting. The meeting featured short presentations on current trends in clinical laboratory analytics and several panel discussions on data science in laboratory medicine, laboratory data and its role in the larger healthcare system, integrating laboratory analytics, and data sharing for collaborative analytics. One main goal of the meeting was to provide an open forum for leaders who work with the "big data" that clinical laboratories produce. This article summarizes the proceedings of the meeting and the content discussed.

  3. Big data analytics to improve cardiovascular care: promise and challenges.

    Science.gov (United States)

    Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M

    2016-06-01

    The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, the methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system.

  4. The Promise and Peril of Predictive Analytics in Higher Education: A Landscape Analysis

    Science.gov (United States)

    Ekowo, Manuela; Palmer, Iris

    2016-01-01

    Predictive analytics in higher education is a hot-button topic among educators and administrators as institutions strive to better serve students by becoming more data-informed. In this paper, the authors describe how predictive analytics are used in higher education to identify students who need extra support, steer students in courses they will…

  5. Topological data analysis: A promising big data exploration tool in biology, analytical chemistry and physical chemistry.

    Science.gov (United States)

    Offroy, Marc; Duponchel, Ludovic

    2016-03-03

    An important feature of experimental science is that data of various kinds are being produced at an unprecedented rate. This is mainly due to the development of new instrumental concepts and experimental methodologies. It is also clear that the nature of acquired data is significantly different. Indeed, in every area of science, data take the form of ever bigger tables, where all but a few of the columns (i.e. variables) turn out to be irrelevant to the questions of interest, and we do not necessarily know which coordinates are the interesting ones. Big data in our laboratories of biology, analytical chemistry or physical chemistry is a future that might be closer than any of us suppose. It is in this sense that new tools have to be developed in order to explore and exploit such data sets. Topological data analysis (TDA) is one of these. It was developed recently by topologists who discovered that topological concepts could be useful for data analysis. The main objective of this paper is to answer the question of why topology is well suited for the analysis of big data sets in many areas, and is even more efficient than conventional data analysis methods. Raman analysis of single bacteria provides a good opportunity to demonstrate the potential of TDA for the exploration of various spectroscopic data sets under different experimental conditions (high noise level, with/without spectral preprocessing, wavelength shift, different spectral resolutions, missing data). Copyright © 2016 Elsevier B.V. All rights reserved.
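    The core topological idea behind TDA can be illustrated with a minimal, self-contained sketch (hypothetical toy data, not the paper's Raman spectra): 0-dimensional persistence tracks the distance scales at which clusters of points merge, and well-separated structure shows up as one conspicuously large merge scale.

```python
# Minimal sketch of 0-dimensional persistent homology: track the distance
# thresholds at which connected components of a point cloud merge.
# This is the simplest topological summary TDA builds on; the data are
# hypothetical toy points, and real analyses would use a TDA library.
from itertools import combinations
from math import dist

def zero_dim_persistence(points):
    """Return the distance thresholds at which components merge."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    # Process pairwise distances in increasing order (Kruskal-style).
    edges = sorted((dist(p, q), i, j)
                   for (i, p), (j, q) in combinations(enumerate(points), 2))
    merges = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            merges.append(round(d, 3))  # a component "dies" at scale d
    return merges

# Two well-separated clusters: one large merge distance stands out.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11)]
print(zero_dim_persistence(pts))  # [1.0, 1.0, 1.0, 13.454]
```

The lone large value (13.454) signals that the data split into two clusters at every smaller scale, which is exactly the kind of shape information conventional column-by-column analysis misses.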

  6. Behavioural health analytics using mobile phones

    Directory of Open Access Journals (Sweden)

    P. Wlodarczak

    2015-07-01

    Full Text Available Big Data analytics in healthcare has become a very active area of research, since it promises to reduce costs and to improve health care quality. Behavioural analytics analyses a patient's behavioural patterns with the goal of detecting early that a patient has become symptomatic, triggering treatment even before a disease outbreak happens. Behavioural analytics allows more precise and personalised treatment and can even monitor whole populations for events such as epidemic outbreaks. Given their prevalence, mobile phones have been used to monitor the health of patients by analysing their behavioural and movement patterns. Cell phones are always-on devices and are usually close to their users. As such, they can be used as social sensors to create "automated diaries" of their users. Specialised apps passively collect and analyse user data to detect whether a patient shows deviant behaviour indicating that he or she has become symptomatic. These apps first learn a patient's normal daily patterns and alert a health care centre if they detect deviant behaviour. The health care centre can then call the patient and check on his or her well-being. These apps use machine learning techniques for reality mining and predictive analysis. This paper describes some of these techniques that have been adopted recently in eHealth apps.
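    The "learn normal daily patterns, then alert on deviation" idea can be sketched with a few lines of code. The data, names and threshold below are illustrative assumptions, not taken from any specific eHealth app:

```python
# Hedged sketch of behavioural anomaly detection: learn a baseline from
# past daily activity, then flag days that deviate strongly from it.
from statistics import mean, stdev

def build_baseline(daily_step_counts):
    """Learn a simple baseline (mean and spread) from past days."""
    return mean(daily_step_counts), stdev(daily_step_counts)

def is_deviant(today, baseline, z_threshold=3.0):
    """Alert if today's activity is far from the learned pattern."""
    mu, sigma = baseline
    return abs(today - mu) > z_threshold * sigma

history = [8200, 7900, 8500, 8100, 8300, 7800, 8400]  # steps per day
baseline = build_baseline(history)
print(is_deviant(8000, baseline))  # typical day -> False
print(is_deviant(1200, baseline))  # sharp drop  -> True
```

A real app would learn richer patterns (location, sleep, call activity) with proper models, but the alerting logic follows this shape: baseline first, deviation test second.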

  7. Using Linked Data in Learning Analytics

    NARCIS (Netherlands)

    d'Aquin, Mathieu; Dietze, Stefan; Drachsler, Hendrik; Herder, Eelco

    2013-01-01

    d'Aquin, M., Dietze, S., Drachsler, H., & Herder, E. (2013, April). Using Linked Data in Learning Analytics. Tutorial given at LAK 2013, the Third Conference on Learning Analytics and Knowledge, Leuven, Belgium.

  8. Analytical use of electron accelerators

    International Nuclear Information System (INIS)

    Kapitsa, S.P.; Chapyzhnikov, B.A.; Firsov, V.I.; Samosyuk, V.N.; Tsipenyuk, Y.M.

    1985-01-01

    After detailed investigation the authors conclude that the newest electron accelerators provide good scope for gamma activation and also for producing neutrons for neutron activation. These accelerators are simpler and safer than reactors; one can provide fairly homogeneous irradiation of substantial volumes, and the determination speed and sensitivity then constitute the main advantages. The limits of detection and the reproducibility are sufficient to handle the wide range of problems, including extreme ones, that analysts face at present, while the selectivity provides exceptional analysis facilities. However, the record examples are not to be taken as exceptions, since activation analysis based on electron accelerators opens up essentially universal scope for analyzing all elements at the concentrations and accuracies currently involved, which will lead to its extensive use in analytical practice in the foreseeable future. The authors indicate that the recognition of these possibilities governs the general use of these methods and the employment of current efficient fast-electron sources to implement them.

  9. Autonomic urban traffic optimization using data analytics

    OpenAIRE

    Garriga Porqueras, Albert

    2017-01-01

    This work focuses on a smart mobility use case where real-time data analytics on traffic measures is used to improve mobility in the event of a perturbation causing congestion in a local urban area. The data monitored is analysed in order to identify patterns that are used to properly reconfigure traffic lights. The monitoring and data analytics infrastructure is based on a hierarchical distributed architecture that allows placing data analytics processes such as machine learning close to the...

  10. Analytical methods used at model facility

    International Nuclear Information System (INIS)

    Wing, N.S.

    1984-01-01

    A description of analytical methods used at the model LEU Fuel Fabrication Facility is presented. The methods include gravimetric uranium analysis, isotopic analysis, fluorimetric analysis, and emission spectroscopy

  11. Guided Text Search Using Adaptive Visual Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A. [ORNL]; Symons, Christopher T. [ORNL]; Senter, James K. [ORNL]; DeNap, Frank A. [ORNL]

    2012-10-01

    This research demonstrates the promise of augmenting interactive visualizations with semi-supervised machine learning techniques to improve the discovery of significant associations and insights in the search and analysis of textual information. More specifically, we have developed a system called Gryffin that hosts a unique collection of techniques that facilitate individualized investigative search pertaining to an ever-changing set of analytical questions over an indexed collection of open-source documents related to critical national infrastructure. The Gryffin client hosts dynamic displays of the search results via focus+context record listings, temporal timelines, term-frequency views, and multiple coordinated views. Furthermore, as the analyst interacts with the display, the interactions are recorded and used to label the search records. These labeled records are then used to drive semi-supervised machine learning algorithms that re-rank the unlabeled search records such that potentially relevant records are moved to the top of the record listing. Gryffin is described in the context of the daily tasks encountered at the US Department of Homeland Security's Fusion Center, with whom we are collaborating in its development. The resulting system is capable of addressing the analysts' information overload that can be directly attributed to the deluge of information that must be addressed in the search and investigative analysis of textual information.
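    The interaction-driven re-ranking loop can be sketched in miniature. This is a toy in the spirit of the description above, not Gryffin's actual algorithm: records the analyst interacts with become positive labels, and unlabeled records are re-ranked by term overlap with those labeled records (all documents are hypothetical).

```python
# Toy relevance-feedback re-ranker: build a term profile from records the
# analyst has interacted with, then sort the remaining records by how
# strongly they match that profile.
from collections import Counter

def tokens(text):
    return set(text.lower().split())

def rerank(records, labeled_relevant):
    """Move records similar to the labeled ones to the top."""
    profile = Counter()
    for rec in labeled_relevant:
        profile.update(tokens(rec))

    def score(rec):
        return sum(profile[t] for t in tokens(rec))

    return sorted(records, key=score, reverse=True)

docs = [
    "quarterly budget report",
    "pipeline valve failure at pumping station",
    "holiday schedule memo",
    "corrosion found in pipeline segment",
]
clicked = ["pipeline inspection notes"]  # records the analyst opened
print(rerank(docs, clicked))  # pipeline-related records rise to the top
```

A real semi-supervised system would propagate the labels through a trained model rather than raw term counts, but the loop (interact, label, re-rank) is the same.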

  12. Using linked data in Learning Analytics

    NARCIS (Netherlands)

    Mathieu, d'Aquin; Stefan, Dietze; Eelco, Herder; Drachsler, Hendrik; Davide, Taibi

    2014-01-01

    d’Aquin, M., Dietze, S., Herder, E., Drachsler, H., & Taibi, D. (2014). Using linked data in learning analytics. eLearning Papers. Nr. 36/2. ISSN: 1887-1542. http://www.openeducationeuropa.eu/en/article/Using-linked-data-in-Learning-Analytics?paper=134810

  13. Pavement Performance : Approaches Using Predictive Analytics

    Science.gov (United States)

    2018-03-23

    Acceptable pavement condition is paramount to road safety. Using predictive analytics techniques, this project attempted to develop models that provide an assessment of pavement condition based on an array of indicators that include pavement distress,...

  14. Use of analytical aids for accident management

    International Nuclear Information System (INIS)

    Ward, L.W.

    1991-01-01

    The use of analytical aids by utility technical support teams can enhance the staff's ability to manage accidents. Since instrumentation is exposed to environments beyond design-basis conditions, instruments may provide ambiguous information or may even fail. While it is most likely that many instruments will remain operable, their ability to provide unambiguous information needed for the management of beyond-design-basis events and severe accidents is questionable. Furthermore, given these limitations in instrumentation, the need to ascertain and confirm current plant status and forecast future behavior to effectively manage accidents at nuclear facilities requires a computational capability to simulate the thermal and hydraulic behavior of the primary, secondary, and containment systems. With the need to extend the current preventive approach in accident management to include mitigative actions, analytical aids could be used to further enhance the current capabilities at nuclear facilities. This need for computational or analytical aids is supported by a review of the candidate accident management strategies discussed in NUREG/CR-5474. Based on that review, two major analytical aids are considered necessary to support the implementation and monitoring of many of the strategies in the document: (1) an analytical aid to predict reactor coolant and secondary system behavior under LOCA conditions, and (2) an analytical aid to predict containment pressure and temperature response with a steam, air, and noncondensable gas mixture present.

  15. Using Learning Analytics for Preserving Academic Integrity

    Science.gov (United States)

    Amigud, Alexander; Arnedo-Moreno, Joan; Daradoumis, Thanasis; Guerrero-Roldan, Ana-Elena

    2017-01-01

    This paper presents the results of integrating learning analytics into the assessment process to enhance academic integrity in the e-learning environment. The goal of this research is to evaluate the computational-based approach to academic integrity. The machine-learning based framework learns students' patterns of language use from data,…

  16. Prioritization of Programmer's Productivity Using Analytic Hierarchy ...

    African Journals Online (AJOL)

    This paper focuses on the application of the Analytic Hierarchy Process (AHP) model in the context of prioritizing programmers' productivity at the University of Benin, Benin City, Nigeria. This is achieved by evaluating the way in which the AHP model can be used to select the best programmer for the purpose of developing software ...
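    The core AHP computation can be sketched briefly: derive priority weights for candidates from a pairwise-comparison matrix. The geometric-mean method used here is a standard approximation of Saaty's principal-eigenvector method, and the matrix values are hypothetical:

```python
# Hedged AHP sketch: priority weights from a pairwise-comparison matrix
# via the row geometric-mean approximation.
from math import prod

def ahp_weights(pairwise):
    """pairwise[i][j] = how strongly candidate i beats j (Saaty 1-9 scale)."""
    n = len(pairwise)
    gm = [prod(row) ** (1.0 / n) for row in pairwise]  # row geometric means
    total = sum(gm)
    return [g / total for g in gm]

# Three hypothetical programmers compared on productivity:
# A is moderately better than B (3) and strongly better than C (5).
matrix = [
    [1,   3,   5],
    [1/3, 1,   3],
    [1/5, 1/3, 1],
]
weights = ahp_weights(matrix)
print([round(w, 3) for w in weights])  # [0.637, 0.258, 0.105]
```

A full AHP application would also compute the consistency ratio of the matrix and combine weights across several criteria levels; this sketch shows only the single-matrix prioritization step.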

  17. Leveraging data rich environments using marketing analytics

    OpenAIRE

    Holtrop, Niels

    2017-01-01

    With the onset of what is popularly known as “big data”, increased attention is being paid to creating value from these data rich environments. Within the field of marketing, the analysis of customer and market data supported by models is known as marketing analytics. The goal of these analyses is to enhance managerial decision making regarding marketing problems. However, before these data rich environments can be used to guide managerial decision making, firms need to grasp the process of d...

  18. Analytical research using synchrotron radiation based techniques

    International Nuclear Information System (INIS)

    Jha, Shambhu Nath

    2015-01-01

    There are many Synchrotron Radiation (SR) based techniques such as X-ray Absorption Spectroscopy (XAS), X-ray Fluorescence Analysis (XRF), SR-Fourier-transform Infrared (SRFTIR) spectroscopy, and Hard X-ray Photoelectron Spectroscopy (HAXPS) which are increasingly being employed worldwide in analytical research. With the advent of modern synchrotron sources these analytical techniques have been further revitalized, paving the way for new techniques such as microprobe XRF and XAS, FTIR microscopy, and HAXPS. The talk will cover mainly two techniques, XRF and XAS, illustrating their capability in analytical research. XRF spectroscopy: XRF spectroscopy is an analytical technique which involves the detection of emitted characteristic X-rays following excitation of the elements within the sample. While electron, particle (protons or alpha particles), or X-ray beams can be employed as the exciting source for this analysis, the use of X-ray beams from a synchrotron source has been instrumental in the advancement of the technique in the area of microprobe XRF imaging and trace-level compositional characterisation of any sample. Synchrotron radiation induced X-ray emission spectroscopy has become competitive with the earlier microprobe and nanoprobe techniques following the advancements in manipulating and detecting these X-rays. There are two important features that contribute to the superb elemental sensitivities of microprobe SR-induced XRF: (i) the absence of the continuum (Bremsstrahlung) background radiation that is a feature of spectra obtained from charged particle beams, and (ii) the increased X-ray flux on the sample associated with the use of tunable third generation synchrotron facilities. Detection sensitivities have been reported in the ppb range, with values of 10⁻¹⁷ g to 10⁻¹⁴ g (depending on the particular element and matrix). Keeping in mind its demand, a microprobe XRF beamline has been set up by RRCAT at the Indus-2 synchrotron.

  19. Analytical modeling of worldwide medical radiation use

    International Nuclear Information System (INIS)

    Mettler, F.A. Jr.; Davis, M.; Kelsey, C.A.; Rosenberg, R.; Williams, A.

    1987-01-01

    An analytical model was developed to estimate the availability and frequency of medical radiation use on a worldwide basis. This model includes medical and dental x-ray, nuclear medicine, and radiation therapy. The development of an analytical model is necessary as the first step in estimating the radiation dose to the world's population from this source. Since there is no data about the frequency of medical radiation use in more than half the countries in the world and only fragmentary data in an additional one-fourth of the world's countries, such a model can be used to predict the uses of medical radiation in these countries. The model indicates that there are approximately 400,000 medical x-ray machines worldwide and that approximately 1.2 billion diagnostic medical x-ray examinations are performed annually. Dental x-ray examinations are estimated at 315 million annually and approximately 22 million in-vivo diagnostic nuclear medicine examinations. Approximately 4 million radiation therapy procedures or courses of treatment are undertaken annually
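    The figures quoted above imply an average workload per machine, which is a quick sanity check on the model's internal consistency:

```python
# Arithmetic check on the abstract's worldwide estimates: average annual
# diagnostic examinations per medical x-ray machine.
machines = 400_000            # estimated medical x-ray machines worldwide
annual_exams = 1_200_000_000  # estimated diagnostic x-ray exams per year

exams_per_machine = annual_exams // machines
print(exams_per_machine)  # 3000 examinations per machine per year
```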

  20. An Investigation to Manufacturing Analytical Services Composition using the Analytical Target Cascading Method.

    Science.gov (United States)

    Tien, Kai-Wen; Kulvatunyou, Boonserm; Jung, Kiwook; Prabhu, Vittaldas

    2017-01-01

    As cloud computing is increasingly adopted, the trend is to offer software functions as modular services and compose them into larger, more meaningful ones. The trend is attractive to analytical problems in the manufacturing system design and performance improvement domain because 1) finding a global optimization for the system is a complex problem; and 2) sub-problems are typically compartmentalized by the organizational structure. However, solving sub-problems by independent services can result in a sub-optimal solution at the system level. This paper investigates a technique called Analytical Target Cascading (ATC) to coordinate the optimization of loosely-coupled sub-problems, each of which may be modularly formulated by a different department and solved by a modular analytical service. The results demonstrate that ATC is a promising method in that it offers system-level optimal solutions that can scale up by exploiting distributed and modular execution while allowing easier management of the problem formulation.
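    The coordination idea can be conveyed with a heavily simplified sketch (quadratic penalties and alternating solves, not the full ATC formulation): a system-level target z is cascaded to two subproblems, each of which minimizes a hypothetical local objective plus a penalty for deviating from the target, and the system then updates the target from the responses.

```python
# Simplified ATC-style coordination loop: each subproblem trades off its
# local preference a_i against the cascaded system target z.

def solve_subproblem(a_i, z, w=1.0):
    # argmin_x (x - a_i)^2 + w * (x - z)^2 has this closed form:
    return (a_i + w * z) / (1.0 + w)

def atc_coordinate(local_prefs, z=0.0, iterations=60):
    for _ in range(iterations):
        responses = [solve_subproblem(a, z) for a in local_prefs]
        z = sum(responses) / len(responses)  # system-level target update
    return z, responses

# Two departments with local optima at 2.0 and 6.0 are coordinated
# toward a system-level consensus.
z, responses = atc_coordinate([2.0, 6.0])
print(round(z, 3))  # converges to the consensus value 4.0
```

Real ATC uses augmented-Lagrangian penalty updates and nested optimization levels, but this captures the essential pattern: independent subproblem solves coordinated through cascaded targets until the system-level solution stabilizes.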

  1. Online Training of Teachers Using OER: Promises and Potential Strategies

    Science.gov (United States)

    Misra, Pradeep Kumar

    2014-01-01

    Teacher education nowadays needs a change in vision and action to cater to the demands of changing societies. Reforms, improvements, and new approaches in teacher education are an immediate need. Online training of teachers using OER has emerged as a new approach in this direction. This approach is based on the assumption that online training will…

  2. Using Linked Data in Learning Analytics

    NARCIS (Netherlands)

    d'Aquin, Mathieu; Dietze, Stefan; Herder, Eelco; Drachsler, Hendrik; Taibi, David

    2014-01-01

    Learning Analytics has a lot to do with data, and the way to make sense of raw data in terms of the learner’s experience, behaviour and knowledge. In this article, we argue about the need for a closer relationship between the field of Learning Analytics and the one of Linked Data, which in our view

  3. Improving Healthcare Using Big Data Analytics

    Directory of Open Access Journals (Sweden)

    Revanth Sonnati

    2017-03-01

    Full Text Available The current era, often called the era of Big Data in the field of Information Technology, sees the fields of science, engineering and technology producing data at an exponential rate, leading to exabytes of data every day. Big data helps us to explore and re-invent many areas, not limited to education, health and law. The primary purpose of this paper is to provide an in-depth analysis in the area of healthcare using big data and analytics. The main emphasis is that the big data being stored all the time should not only let us look back at history but should also be analysed to improve medication and services. Although many big data implementations happen to be in-house developments, this proposed implementation aims at a broader extent using Hadoop, which just happens to be the tip of the iceberg. The focus of this paper is not limited to the improvement and analysis of the data; it also focuses on the strengths and drawbacks compared to the conventional techniques available.
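    The Hadoop-style processing the abstract alludes to follows the map/shuffle/reduce pattern, which can be imitated in plain Python on hypothetical visit records:

```python
# Plain-Python toy of a MapReduce job: map patient visit records to
# (diagnosis, 1) pairs, then reduce by key to get diagnosis counts.
# The data are hypothetical; Hadoop would distribute these phases
# across a cluster.
from collections import defaultdict

visits = [
    {"patient": "p1", "diagnosis": "diabetes"},
    {"patient": "p2", "diagnosis": "hypertension"},
    {"patient": "p3", "diagnosis": "diabetes"},
    {"patient": "p4", "diagnosis": "asthma"},
]

# Map phase: emit a key/value pair per record.
mapped = [(v["diagnosis"], 1) for v in visits]

# Shuffle + reduce phase: group by key and sum the values.
counts = defaultdict(int)
for key, value in mapped:
    counts[key] += value

print(dict(counts))  # {'diabetes': 2, 'hypertension': 1, 'asthma': 1}
```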

  4. Leveraging data rich environments using marketing analytics

    NARCIS (Netherlands)

    Holtrop, Niels

    2017-01-01

    With the onset of what is popularly known as “big data”, increased attention is being paid to creating value from these data rich environments. Within the field of marketing, the analysis of customer and market data supported by models is known as marketing analytics. The goal of these analyses is

  5. Using Analytic Hierarchy Process in Textbook Evaluation

    Science.gov (United States)

    Kato, Shigeo

    2014-01-01

    This study demonstrates the application of the analytic hierarchy process (AHP) in English language teaching materials evaluation, focusing in particular on its potential for systematically integrating different components of evaluation criteria in a variety of teaching contexts. AHP is a measurement procedure wherein pairwise comparisons are made…

  6. Towards actionable learning analytics using dispositions

    NARCIS (Netherlands)

    Tempelaar, Dirk; Rienties, Bart; Nguyen, Quan

    2017-01-01

    Studies in the field of learning analytics (LA) have shown students’ demographics and learning management system (LMS) data to be effective identifiers of “at risk” performance. However, insights generated by these predictive models may not be suitable for pedagogically informed interventions due to

  7. Towards Actionable Learning Analytics Using Dispositions

    Science.gov (United States)

    Tempelaar, Dirk T.; Rienties, Bart; Nguyen, Quan

    2017-01-01

    Studies in the field of learning analytics (LA) have shown students' demographics and learning management system (LMS) data to be effective identifiers of "at risk" performance. However, insights generated by these predictive models may not be suitable for pedagogically informed interventions due to the inability to explain why students…

  8. Untangling Slab Dynamics Using 3-D Numerical and Analytical Models

    Science.gov (United States)

    Holt, A. F.; Royden, L.; Becker, T. W.

    2016-12-01

    Increasingly sophisticated numerical models have enabled us to make significant strides in identifying the key controls on how subducting slabs deform. For example, 3-D models have demonstrated that subducting plate width, and the related strength of toroidal flow around the plate edge, exerts a strong control on both the curvature and the rate of migration of the trench. However, the results of numerical subduction models can be difficult to interpret, and many first order dynamics issues remain at least partially unresolved. Such issues include the dominant controls on trench migration, the interdependence of asthenospheric pressure and slab dynamics, and how nearby slabs influence each other's dynamics. We augment 3-D, dynamically evolving finite element models with simple, analytical force-balance models to distill the physics associated with subduction into more manageable parts. We demonstrate that for single, isolated subducting slabs much of the complexity of our fully numerical models can be encapsulated by simple analytical expressions. Rates of subduction and slab dip correlate strongly with the asthenospheric pressure difference across the subducting slab. For double subduction, an additional slab gives rise to more complex mantle pressure and flow fields, and significantly extends the range of plate kinematics (e.g., convergence rate, trench migration rate) beyond those present in single slab models. Despite these additional complexities, we show that much of the dynamics of such multi-slab systems can be understood using the physics illuminated by our single slab study, and that a force-balance method can be used to relate intra-plate stress to viscous pressure in the asthenosphere and coupling forces at plate boundaries. This method has promise for rapid modeling of large systems of subduction zones on a global scale.

  9. Using Learning Analytics to Assess Student Learning in Online Courses

    Science.gov (United States)

    Martin, Florence; Ndoye, Abdou

    2016-01-01

    Learning analytics can be used to enhance student engagement and performance in online courses. Using learning analytics, instructors can collect and analyze data about students and improve the design and delivery of instruction to make it more meaningful for them. In this paper, the authors review different categories of online assessments and…

  10. Online Learner Engagement: Opportunities and Challenges with Using Data Analytics

    Science.gov (United States)

    Bodily, Robert; Graham, Charles R.; Bush, Michael D.

    2017-01-01

    This article describes the crossroads between learning analytics and learner engagement. The authors do this by describing specific challenges of using analytics to support student engagement from three distinct perspectives: pedagogical considerations, technological issues, and interface design concerns. While engaging online learners presents a…

  11. Investigation of Using Analytics in Promoting Mobile Learning Support

    Science.gov (United States)

    Visali, Videhi; Swami, Niraj

    2013-01-01

    Learning analytics can promote pedagogically informed use of learner data, which can steer the progress of technology mediated learning across several learning contexts. This paper presents the application of analytics to a mobile learning solution and demonstrates how a pedagogical sense was inferred from the data. Further, this inference was…

  12. The use of cryogenic helium for classical turbulence: Promises and hurdles

    International Nuclear Information System (INIS)

    Niemela, J.J.; Sreenivasan, K.R.

    2006-12-01

    Fluid turbulence is a paradigm for non-linear systems with many degrees of freedom and important in numerous applications. Because the analytical understanding of the equations of motion is poor, experiments and, lately, direct numerical simulations of the equations of motion, have been fundamental to making progress. In this vein, a concerted experimental effort has been made to take advantage of the unique properties of liquid and gaseous helium at low temperatures near or below the critical point. We discuss the promise and impact of results from recent helium experiments and identify the current technical barriers which can perhaps be removed by low temperature researchers. We focus mainly on classical flows that utilize helium above the lambda line, but touch on those aspects below that exhibit quasi-classical behavior. (author)

  13. Bias Assessment of General Chemistry Analytes using Commutable Samples.

    Science.gov (United States)

    Koerbin, Gus; Tate, Jillian R; Ryan, Julie; Jones, Graham Rd; Sikaris, Ken A; Kanowski, David; Reed, Maxine; Gill, Janice; Koumantakis, George; Yen, Tina; St John, Andrew; Hickman, Peter E; Simpson, Aaron; Graham, Peter

    2014-11-01

    Harmonisation of reference intervals for routine general chemistry analytes has been a goal for many years. Analytical bias may prevent this harmonisation. To determine whether analytical bias is present when comparing methods, commutable samples (samples that have the same properties as the clinical samples routinely analysed) should be used as reference samples to eliminate the possibility of matrix effects. The use of commutable samples has improved the identification of unacceptable analytical performance in the Netherlands and Spain. The International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) has undertaken a pilot study using commutable samples in an attempt not only to determine country-specific reference intervals but to make them comparable between countries. Australia and New Zealand, through the Australasian Association of Clinical Biochemists (AACB), have also undertaken an assessment of analytical bias using commutable samples and determined that, of the 27 general chemistry analytes studied, 19 showed between-method biases sufficiently small as not to prevent harmonisation of reference intervals. Application of evidence-based approaches, including the determination of analytical bias using commutable material, is necessary when seeking to harmonise reference intervals.
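    A between-method bias check of the kind described above reduces to simple statistics on paired results. The data, analyte and allowable limit below are illustrative assumptions, not values from the AACB study:

```python
# Hedged sketch of a between-method bias assessment on commutable
# samples: mean percent bias of a test method against a reference
# method, compared with an allowable bias limit.
from statistics import mean

def percent_bias(method_a, method_b):
    """Mean percent difference of method B relative to method A."""
    return mean((b - a) / a * 100 for a, b in zip(method_a, method_b))

def harmonisable(method_a, method_b, allowable_bias_pct):
    return abs(percent_bias(method_a, method_b)) <= allowable_bias_pct

# Hypothetical sodium results (mmol/L) on the same commutable samples.
ref =  [138.0, 141.0, 144.0, 136.0, 140.0]
test = [139.0, 141.5, 145.0, 137.0, 140.5]
print(round(percent_bias(ref, test), 2), harmonisable(ref, test, 2.0))
```

Because the samples are commutable, any bias observed reflects the methods themselves rather than a matrix effect, which is what justifies the pass/fail decision.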

  14. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    Energy Technology Data Exchange (ETDEWEB)

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David; Wolverton, Michael J.; Bruce, Joseph R.; Burtner, Edwin R.; Endert, Alexander

    2015-12-07

    Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.

  15. Fast analytical scatter estimation using graphics processing units.

    Science.gov (United States)

    Ingleby, Harry; Lippuner, Jonas; Rickey, Daniel W; Li, Yue; Elbakri, Idris

    2015-01-01

    To develop a fast patient-specific analytical estimator of first-order Compton and Rayleigh scatter in cone-beam computed tomography, implemented using graphics processing units. The authors developed an analytical estimator for first-order Compton and Rayleigh scatter in a cone-beam computed tomography geometry. The estimator was coded using NVIDIA's CUDA environment for execution on an NVIDIA graphics processing unit. Performance of the analytical estimator was validated by comparison with high-count Monte Carlo simulations for two different numerical phantoms. Monoenergetic analytical simulations were compared with monoenergetic and polyenergetic Monte Carlo simulations. Analytical and Monte Carlo scatter estimates were compared both qualitatively, from visual inspection of images and profiles, and quantitatively, using a scaled root-mean-square difference metric. Reconstruction of simulated cone-beam projection data of an anthropomorphic breast phantom illustrated the potential of this method as a component of a scatter correction algorithm. The monoenergetic analytical and Monte Carlo scatter estimates showed very good agreement. The monoenergetic analytical estimates showed good agreement for Compton single scatter and reasonable agreement for Rayleigh single scatter when compared with polyenergetic Monte Carlo estimates. For a voxelized phantom with dimensions of 128 × 128 × 128 voxels and a detector with 256 × 256 pixels, the analytical estimator required 669 seconds per projection, using a single NVIDIA 9800 GX2 video card. Accounting for first-order scatter in cone-beam image reconstruction improves the contrast-to-noise ratio of the reconstructed images. The analytical scatter estimator, implemented using graphics processing units, provides rapid and accurate estimates of single scatter and, with further acceleration and a method to account for multiple scatter, may be useful for practical scatter correction schemes.
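
    The quantitative comparison above uses a scaled root-mean-square difference metric. The abstract does not give the exact scaling, so the sketch below assumes normalization by the mean of the reference estimate; the function name and sample values are illustrative, not from the paper.

```python
import math

def scaled_rmsd(estimate, reference):
    """Root-mean-square difference between two scatter estimates,
    scaled by the mean of the reference. (The normalization choice is
    an assumption; the abstract does not specify the exact scaling.)"""
    n = len(estimate)
    mse = sum((e - r) ** 2 for e, r in zip(estimate, reference)) / n
    return math.sqrt(mse) / (sum(reference) / n)

# Analytic vs. Monte Carlo scatter profile values (illustrative)
analytic = [10.2, 11.0, 9.8, 10.5]
monte_carlo = [10.0, 11.2, 10.1, 10.4]
difference = scaled_rmsd(analytic, monte_carlo)
```

    A small value of `difference` (here a few percent) indicates the kind of "very good agreement" the abstract reports.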

  16. Use of information technologies in teaching course "Analytical geometry" in higher schools on example of software "ANALYTICAL GEOMETRY"

    OpenAIRE

    V. B. Grigorieva

    2009-01-01

    This article considers methodological questions in the use of computer technologies, taking the software "Analytical Geometry" as an example, in teaching a course on analytical geometry in higher education.

  17. Pitfalls and Promises: The Use of Secondary Data Analysis in Educational Research

    Science.gov (United States)

    Smith, Emma

    2008-01-01

    This paper considers the use of secondary data analysis in educational research. It addresses some of the promises and potential pitfalls that influence its use and explores a possible role for the secondary analysis of numeric data in the "new" political arithmetic tradition of social research. Secondary data analysis is a relatively under-used…

  18. Using predictive analytics and big data to optimize pharmaceutical outcomes.

    Science.gov (United States)

    Hernandez, Inmaculada; Zhang, Yuting

    2017-09-15

    The steps involved, the resources needed, and the challenges associated with applying predictive analytics in healthcare are described, with a review of successful applications of predictive analytics in implementing population health management interventions that target medication-related patient outcomes. In healthcare, the term big data typically refers to large quantities of electronic health record, administrative claims, and clinical trial data as well as data collected from smartphone applications, wearable devices, social media, and personal genomics services; predictive analytics refers to innovative methods of analysis developed to overcome challenges associated with big data, including a variety of statistical techniques ranging from predictive modeling to machine learning to data mining. Predictive analytics using big data have been applied successfully in several areas of medication management, such as in the identification of complex patients or those at highest risk for medication noncompliance or adverse effects. Because predictive analytics can be used in predicting different outcomes, they can provide pharmacists with a better understanding of the risks for specific medication-related problems that each patient faces. This information will enable pharmacists to deliver interventions tailored to patients' needs. In order to take full advantage of these benefits, however, clinicians will have to understand the basics of big data and predictive analytics. Predictive analytics that leverage big data will become an indispensable tool for clinicians in mapping interventions and improving patient outcomes. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
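
    As a toy illustration of the risk scoring described above, a logistic model can turn patient features into a noncompliance probability. The features, coefficients, and review threshold below are entirely hypothetical and are not taken from the paper.

```python
import math

def noncompliance_risk(features, weights, bias):
    """Logistic risk score: a probability in (0, 1) from a weighted
    feature sum. All coefficients here are illustrative placeholders."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# features: [number of medications, prior refill gaps, age over 65 (0/1)]
risk = noncompliance_risk([6, 2, 1], weights=[0.15, 0.6, 0.3], bias=-2.5)
flag_for_pharmacist_review = risk > 0.4
```

    In practice the coefficients would be learned from claims and EHR data by the predictive-modeling methods the abstract describes, not set by hand.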

  19. Analytic Methods Used in Quality Control in a Compounding Pharmacy.

    Science.gov (United States)

    Allen, Loyd V

    2017-01-01

    Analytical testing will no doubt become a more important part of pharmaceutical compounding as the public and regulatory agencies demand increasing documentation of the quality of compounded preparations. Compounding pharmacists must decide what types of testing and what amount of testing to include in their quality-control programs, and whether testing should be done in-house or outsourced. Like pharmaceutical compounding, analytical testing should be performed only by those who are appropriately trained and qualified. This article discusses the analytical methods that are used in quality control in a compounding pharmacy. Copyright© by International Journal of Pharmaceutical Compounding, Inc.

  20. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    Science.gov (United States)

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.

  1. Red mud characterization using nuclear analytical techniques

    International Nuclear Information System (INIS)

    Obhodas, J.; Sudac, D.; Matjacic, L.; Valkovic, V.

    2011-01-01

    Red mud is a toxic waste left as a byproduct of aluminum production by the Bayer process. Since it contains significant concentrations of other chemical elements of interest to industry, including REE, it is also a potential secondary ore source. Recent events in some countries have shown that red mud presents a serious environmental hazard if not properly stored. The subject of our study is the red mud from a former aluminum plant in Obrovac, Croatia, left from the processing of bauxite mined during the late 1970s and early 1980s on the eastern Adriatic coast and since then stored in open concrete basins for more than 30 years. We have used energy-dispersive X-ray fluorescence analysis (with both tube and radioactive-source excitation), fast neutron activation analysis, and passive gamma spectrometry to identify a number of elements present in the red mud, their concentration levels, and the radioactivity of the red mud. High concentrations of Al, Si, Ca, Ti and Fe have been measured. The chemical elements Sc, Cr, Mn, Co, Ni, Cu, Zn, Ga, As, Se, Br, Y, La, Ce, Pr, Nd, Sm, Eu, Gd, Tb, Dy, Ho, Er, Tm, Pb, Th and U were found in lower concentrations. No significant levels of radioactivity have been measured. (authors)

  2. Training the next generation analyst using red cell analytics

    Science.gov (United States)

    Graham, Meghan N.; Graham, Jacob L.

    2016-05-01

    We have seen significant change in the study and practice of human reasoning in recent years, from both a theoretical and a methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools have created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as a pedagogy to train the next-generation analyst. A group of Penn State students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University have been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except that student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA is not always realized in the solution but in the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the areas of security, risk and intelligence training.

  3. High-throughput characterization of sediment organic matter by pyrolysis-gas chromatography/mass spectrometry and multivariate curve resolution: A promising analytical tool in (paleo)limnology.

    Science.gov (United States)

    Tolu, Julie; Gerber, Lorenz; Boily, Jean-François; Bindler, Richard

    2015-06-23

    Molecular-level chemical information about organic matter (OM) in sediments helps to establish the sources of OM and the prevalent degradation/diagenetic processes, both essential for understanding the cycling of carbon (C) and of the elements associated with OM (toxic trace metals and nutrients) in lake ecosystems. Ideally, analytical methods for characterizing OM should allow high sample throughput, consume small amounts of sample and yield relevant chemical information, which are essential for multidisciplinary, high-temporal resolution and/or large spatial scale investigations. We have developed a high-throughput analytical method based on pyrolysis-gas chromatography/mass spectrometry and automated data processing to characterize sedimentary OM in sediments. Our method consumes 200 μg of freeze-dried and ground sediment sample. Pyrolysis was performed at 450°C, which was found to avoid degradation of specific biomarkers (e.g., lignin compounds, fresh carbohydrates/cellulose) compared to 650°C, which is in the range of temperatures commonly applied for environmental samples. The optimization was conducted using the top ten sediment samples of an annually resolved sediment record (containing 16-18% and 1.3-1.9% of total carbon and nitrogen, respectively). Several hundred pyrolytic compound peaks were detected of which over 200 were identified, which represent different classes of organic compounds (i.e., n-alkanes, n-alkenes, 2-ketones, carboxylic acids, carbohydrates, proteins, other N compounds, (methoxy)phenols, (poly)aromatics, chlorophyll and steroids/hopanoids). Technical reproducibility measured as relative standard deviation of the identified peaks in triplicate analyses was 5.5±4.3%, with 90% of the RSD values within 10% and 98% within 15%. Finally, a multivariate calibration model was calculated between the pyrolytic degradation compounds and the sediment depth (i.e., sediment age), which is a function of degradation processes and changes in OM
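
    The reproducibility figures above are percent relative standard deviations (RSD) over triplicate peak areas. For reference, a minimal computation; the peak-area values below are made up for illustration.

```python
import statistics

def percent_rsd(replicates):
    """Percent relative standard deviation of replicate peak areas:
    100 * (sample standard deviation) / mean."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Triplicate peak areas for one pyrolytic compound (illustrative numbers)
peak_areas = [1.02e6, 0.97e6, 1.05e6]
rsd = percent_rsd(peak_areas)
```

    Repeating this for every identified peak and summarizing the distribution of RSD values gives figures like the 5.5 ± 4.3% the abstract reports.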

  4. Manufacturing data analytics using a virtual factory representation.

    Science.gov (United States)

    Jain, Sanjay; Shao, Guodong; Shin, Seung-Jun

    2017-01-01

    Large manufacturers have been using simulation to support decision-making for design and production. However, with the advancement of technologies and the emergence of big data, simulation can be utilised to perform and support data analytics for associated performance gains. This requires not only significant model development expertise, but also huge data collection and analysis efforts. This paper presents an approach, within the frameworks of Design Science Research Methodology and prototyping, to address the challenge of increasing the use of modelling, simulation and data analytics in manufacturing via reduction of the development effort. Manufacturing simulation models are presented both as data analytics applications in their own right and as support for other data analytics applications, serving as data generators and as a tool for validation. The virtual factory concept is presented as the vehicle for manufacturing modelling and simulation. The virtual factory goes beyond traditional simulation models of factories to include multi-resolution modelling capabilities, thus allowing analysis at varying levels of detail. A path is proposed for implementation of the virtual factory concept that builds on developments in technologies and standards. A virtual machine prototype is provided as a demonstration of the use of a virtual representation for manufacturing data analytics.

  5. Analytic 3D image reconstruction using all detected events

    International Nuclear Information System (INIS)

    Kinahan, P.E.; Rogers, J.G.

    1988-11-01

    We present the results of testing a previously presented algorithm for three-dimensional image reconstruction that uses all gamma-ray coincidence events detected by a PET volume-imaging scanner. By using two iterations of an analytic filter-backprojection method, the algorithm is not constrained by the requirement of a spatially invariant detector point spread function, which limits normal analytic techniques. Removing this constraint allows the incorporation of all detected events, regardless of orientation, which improves the statistical quality of the final reconstructed image

  6. Fuzzy promises

    DEFF Research Database (Denmark)

    Anker, Thomas Boysen; Kappel, Klemens; Eadie, Douglas

    2012-01-01

    ... as narrative material to communicate self-identity. Finally, (c) we propose that brands deliver fuzzy experiential promises through effectively motivating consumers to adopt and play a social role implicitly suggested and facilitated by the brand. A promise is an inherently ethical concept and the article concludes with an in-depth discussion of fuzzy brand promises as two-way ethical commitments that put requirements on both brands and consumers.

  7. 77 FR 56176 - Analytical Methods Used in Periodic Reporting

    Science.gov (United States)

    2012-09-12

    ... informal rulemaking proceeding to consider changes in analytical principles (Proposals Six and Seven) used... (Proposals Six and Seven), September 4, 2012 (Petition). Proposal Six: Use of Foreign Postal Settlement System as Sole Source for Reporting of Inbound International Revenue, Pieces, and Weights. The Postal...

  8. An Analysis of Earth Science Data Analytics Use Cases

    Science.gov (United States)

    Shie, Chung-Lin; Kempler, Steve

    2014-01-01

    The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data Information System (EOSDIS) archives doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; and https://earthdata.nasa.gov/about/system-performance). In addition, other US agencies, international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean unobvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been collected and analyzed for the purpose of extracting the types of Earth science data analytics employed, and requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, ESDA use cases, use case types, and a preliminary use case analysis (a work in progress) will be presented.

  9. Long-Term Prediction of Satellite Orbit Using Analytical Method

    Directory of Open Access Journals (Sweden)

    Jae-Cheol Yoon

    1997-12-01

    A long-term prediction algorithm for geostationary orbit was developed using an analytical method. The perturbation force models include the geopotential up to degree and order five, luni-solar gravitation, and solar radiation pressure. All of the perturbation effects were analyzed via secular variations, short-period variations, and long-period variations of the equinoctial elements: the semi-major axis, eccentricity vector, inclination vector, and mean longitude of the satellite. Results from the analytical orbit propagator were compared with those of a Cowell orbit propagator for KOREASAT. The comparison indicated that the analytical solution could predict the semi-major axis with an accuracy of better than ~35 meters over a period of 3 months.
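
    The analytical approach rests on propagating mean elements with closed-form rates rather than numerically integrating accelerations. As a minimal unperturbed sketch (two-body secular rate only; the geopotential, luni-solar, and radiation-pressure terms from the abstract are deliberately omitted):

```python
import math

MU_EARTH = 398600.4418  # Earth gravitational parameter, km^3/s^2

def mean_longitude(a_km, l0_rad, t_s):
    """Advance the mean longitude analytically with the two-body
    secular rate n = sqrt(mu / a^3). Perturbation terms are omitted
    in this sketch, so this is only the leading-order behaviour."""
    n = math.sqrt(MU_EARTH / a_km ** 3)  # mean motion, rad/s
    return (l0_rad + n * t_s) % (2.0 * math.pi)

# A geostationary satellite (a ~ 42164 km) returns near its starting
# mean longitude after one sidereal day (~86164 s)
l_final = mean_longitude(42164.0, 0.0, 86164.1)
```

    The full method layers secular, short-period, and long-period corrections for each perturbation onto rates of this kind for all the equinoctial elements.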

  10. The Analytic Information Warehouse (AIW): a platform for analytics using electronic health record data.

    Science.gov (United States)

    Post, Andrew R; Kurc, Tahsin; Cholleti, Sharath; Gao, Jingjing; Lin, Xia; Bornstein, William; Cantrell, Dedra; Levine, David; Hohmann, Sam; Saltz, Joel H

    2013-06-01

    To create an analytics platform for specifying and detecting clinical phenotypes and other derived variables in electronic health record (EHR) data for quality improvement investigations. We have developed an architecture for an Analytic Information Warehouse (AIW). It supports transforming data represented in different physical schemas into a common data model, specifying derived variables in terms of the common model to enable their reuse, computing derived variables while enforcing invariants and ensuring correctness and consistency of data transformations, long-term curation of derived data, and export of derived data into standard analysis tools. It includes software that implements these features and a computing environment that enables secure high-performance access to and processing of large datasets extracted from EHRs. We have implemented and deployed the architecture in production locally. The software is available as open source. We have used it as part of hospital operations in a project to reduce rates of hospital readmission within 30 days. The project examined the association of over 100 derived variables representing disease and co-morbidity phenotypes with readmissions in 5 years of data from our institution's clinical data warehouse and the UHC Clinical Database (CDB). The CDB contains administrative data from over 200 hospitals that are in academic medical centers or affiliated with such centers. A widely available platform for managing and detecting phenotypes in EHR data could accelerate the use of such data in quality improvement and comparative effectiveness studies. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. The Analytic Information Warehouse (AIW): a Platform for Analytics using Electronic Health Record Data

    Science.gov (United States)

    Post, Andrew R.; Kurc, Tahsin; Cholleti, Sharath; Gao, Jingjing; Lin, Xia; Bornstein, William; Cantrell, Dedra; Levine, David; Hohmann, Sam; Saltz, Joel H.

    2013-01-01

    Objective To create an analytics platform for specifying and detecting clinical phenotypes and other derived variables in electronic health record (EHR) data for quality improvement investigations. Materials and Methods We have developed an architecture for an Analytic Information Warehouse (AIW). It supports transforming data represented in different physical schemas into a common data model, specifying derived variables in terms of the common model to enable their reuse, computing derived variables while enforcing invariants and ensuring correctness and consistency of data transformations, long-term curation of derived data, and export of derived data into standard analysis tools. It includes software that implements these features and a computing environment that enables secure high-performance access to and processing of large datasets extracted from EHRs. Results We have implemented and deployed the architecture in production locally. The software is available as open source. We have used it as part of hospital operations in a project to reduce rates of hospital readmission within 30 days. The project examined the association of over 100 derived variables representing disease and co-morbidity phenotypes with readmissions in five years of data from our institution’s clinical data warehouse and the UHC Clinical Database (CDB). The CDB contains administrative data from over 200 hospitals that are in academic medical centers or affiliated with such centers. Discussion and Conclusion A widely available platform for managing and detecting phenotypes in EHR data could accelerate the use of such data in quality improvement and comparative effectiveness studies. PMID:23402960

  12. Using the Analytic Hierarchy Process to Analyze Multiattribute Decisions.

    Science.gov (United States)

    Spires, Eric E.

    1991-01-01

    The use of the Analytic Hierarchy Process (AHP) in assisting researchers to analyze decisions is discussed. The AHP is compared with other decision-analysis techniques, including multiattribute utility measurement, conjoint analysis, and general linear models. Insights that AHP can provide are illustrated with data gathered in an auditing context.…
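
    For readers unfamiliar with AHP mechanics: attribute priorities come from the principal eigenvector of a reciprocal pairwise comparison matrix. A minimal power-iteration sketch; the 3 × 3 judgement matrix below is hypothetical, not from the article.

```python
def ahp_weights(matrix, iterations=100):
    """Approximate the principal right eigenvector of a pairwise
    comparison matrix by power iteration, normalized to sum to 1.
    This is the priority-weight step of Saaty's AHP."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iterations):
        v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(v)
        w = [x / total for x in v]
    return w

# Hypothetical judgements on the 1-9 scale: attribute A moderately
# preferred to B (3), strongly to C (5); B slightly preferred to C (2)
M = [[1.0, 3.0, 5.0],
     [1.0 / 3.0, 1.0, 2.0],
     [1.0 / 5.0, 1.0 / 2.0, 1.0]]
weights = ahp_weights(M)
```

    For this matrix the weights come out near 0.65, 0.23, and 0.12, giving the decision-maker a ratio-scale ranking of the three attributes.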

  13. Optimizing an Immersion ESL Curriculum Using Analytic Hierarchy Process

    Science.gov (United States)

    Tang, Hui-Wen Vivian

    2011-01-01

    The main purpose of this study is to fill a substantial knowledge gap regarding reaching a uniform group decision in English curriculum design and planning. A comprehensive content-based course criterion model extracted from existing literature and expert opinions was developed. Analytical hierarchy process (AHP) was used to identify the relative…

  14. Assessing Adult Learning Preferences Using the Analytic Hierarchy Process.

    Science.gov (United States)

    Lee, Doris; McCool, John; Napieralski, Laura

    2000-01-01

    Graduate students (n=134) used the analytic hierarchy process, which weights expressed preferences, to rate four learning activities: lectures, discussion/reflection, individual projects, and group projects. Their preferences for discussion/reflection and individual projects were independent of auditory, visual, and kinesthetic learning styles.…

  15. Lyophilization: a useful approach to the automation of analytical processes?

    OpenAIRE

    de Castro, M. D. Luque; Izquierdo, A.

    1990-01-01

    An overview of the state-of-the-art in the use of lyophilization for the pretreatment of samples and standards prior to their storage and/or preconcentration is presented. The different analytical applications of this process are dealt with according to the type of material (reagent, standard, samples) and matrix involved.

  16. Mapping debris flow susceptibility using analytical network process ...

    Indian Academy of Sciences (India)

    Evangelin Ramani Sujatha

    2017-11-23

    Nov 23, 2017 ... methods known as the analytical network process (ANP) is used to map the ..... ciated in any prospective way, through feedbacks ..... slide susceptibility by means of multivariate statistical .... and bivariate statistics: A case study in southern Italy;. Nat. ... combination applied to Tevankarai Stream Watershed,.

  17. Challenges of Using Learning Analytics Techniques to Support Mobile Learning

    Science.gov (United States)

    Arrigo, Marco; Fulantelli, Giovanni; Taibi, Davide

    2015-01-01

    Evaluation of Mobile Learning remains an open research issue, especially as regards the activities that take place outside the classroom. In this context, Learning Analytics can provide answers, and offer the appropriate tools to enhance Mobile Learning experiences. In this poster we introduce a task-interaction framework, using learning analytics…

  18. Analytics that Inform the University: Using Data You Already Have

    Science.gov (United States)

    Dziuban, Charles; Moskal, Patsy; Cavanagh, Thomas; Watts, Andre

    2012-01-01

    The authors describe the University of Central Florida's top-down/bottom-up action analytics approach to using data to inform decision-making at the University of Central Florida. The top-down approach utilizes information about programs, modalities, and college implementation of Web initiatives. The bottom-up approach continuously monitors…

  19. Evaluating Modeling Sessions Using the Analytic Hierarchy Process

    NARCIS (Netherlands)

    Ssebuggwawo, D.; Hoppenbrouwers, S.J.B.A.; Proper, H.A.; Persson, A.; Stirna, J.

    2008-01-01

    In this paper, which is methodological in nature, we propose to use an established method from the field of Operations Research, the Analytic Hierarchy Process (AHP), in the integrated, stakeholder-oriented evaluation of enterprise modeling sessions: their language, process, tool (medium), and

  20. Seamless Digital Environment – Data Analytics Use Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-08-01

    Multiple research efforts in the U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program study the need for, and design of, an underlying architecture to support the increased amount and use of data in the nuclear power plant. More specifically, the LWRS research efforts on Digital Architecture for an Automated Plant, Automated Work Packages, Computer-Based Procedures for Field Workers, and Online Monitoring have all identified the need for a digital architecture and, more importantly, for a Seamless Digital Environment (SDE). An SDE provides a means to access multiple applications, gather the data points needed, conduct the requested analysis, and present the result to the user with minimal or no effort by the user. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting, the nuclear utilities identified the need for research focused on data analytics. The effort was to develop and evaluate use cases for data mining and analytics employing information from plant sensors and databases to develop improved business analytics. The goal of the study is to research potential approaches to building an analytics solution for equipment reliability, on a small scale, focusing on either a single piece of equipment or a single system. The analytics solution will likely consist of a data integration layer, a predictive and machine learning layer, and a user interface layer that will display the output of the analysis in a straightforward, easy-to-consume manner. This report describes the use case study initiated by NITSL and conducted in collaboration between Idaho National Laboratory, Arizona Public Service - Palo Verde Nuclear Generating Station, and NextAxiom Inc.

  1. The Promise and Perils of Using Big Data in the Study of Corporate Networks

    DEFF Research Database (Denmark)

    Heemskerk, Eelke; Young, Kevin; Takes, Frank W.

    2018-01-01

    ... challenges associated with the nature of the subject matter, variable data quality and other problems associated with currently available data on this scale, we discuss the promise and perils of using big corporate network data (BCND). We propose a standard procedure for helping researchers deal with BCND problems. While acknowledging that different research questions require different approaches to data quality, we offer a schematic platform that researchers can follow to make informed and intelligent decisions about BCND issues and address these through a specific work-flow procedure. For each step...

  2. Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis

    Science.gov (United States)

    Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay

    2018-01-01

    Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…

  3. Use of scientometrics to assess nuclear and other analytical methods

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1986-01-01

    Scientometrics involves the use of quantitative methods to investigate science viewed as an information process. Scientometric studies can be useful in ascertaining which methods have been most employed for various analytical determinations as well as for predicting which methods will continue to be used in the immediate future and which appear to be losing favor with the analytical community. Published papers in the technical literature are the primary source materials for scientometric studies; statistical methods and computer techniques are the tools. Recent studies have included growth and trends in prompt nuclear analysis, the impact of research published in a technical journal, and institutional and national representation, speakers, and topics at several IAEA conferences, at modern trends in activation analysis conferences, and at other non-nuclear-oriented conferences. Attempts have also been made to predict future growth of various topics and techniques. 13 refs., 4 figs., 17 tabs

  4. Use of scientometrics to assess nuclear and other analytical methods

    Energy Technology Data Exchange (ETDEWEB)

    Lyon, W.S.

    1986-01-01

    Scientometrics involves the use of quantitative methods to investigate science viewed as an information process. Scientometric studies can be useful in ascertaining which methods have been most employed for various analytical determinations as well as for predicting which methods will continue to be used in the immediate future and which appear to be losing favor with the analytical community. Published papers in the technical literature are the primary source materials for scientometric studies; statistical methods and computer techniques are the tools. Recent studies have included growth and trends in prompt nuclear analysis, the impact of research published in a technical journal, and institutional and national representation, speakers, and topics at several IAEA conferences, at modern trends in activation analysis conferences, and at other non-nuclear-oriented conferences. Attempts have also been made to predict future growth of various topics and techniques. 13 refs., 4 figs., 17 tabs.

  5. Applications of Spatial Data Using Business Analytics Tools

    Directory of Open Access Journals (Sweden)

    Anca Ioana ANDREESCU

    2011-12-01

    Full Text Available This paper addresses the possibilities of using spatial data in business analytics tools, with emphasis on SAS software. Various kinds of map data sets containing spatial data are presented and discussed. Examples of map charts illustrating macroeconomic parameters demonstrate the application of spatial data for the creation of map charts in SAS Enterprise Guide. Extended features of map charts are exemplified by producing charts via SAS programming procedures.

  6. Incident detection and isolation in drilling using analytical redundancy relations

    DEFF Research Database (Denmark)

    Willersrud, Anders; Blanke, Mogens; Imsland, Lars

    2015-01-01

    must be avoided. This paper employs model-based diagnosis using analytical redundancy relations to obtain residuals which are affected differently by the different incidents. Residuals are found to be non-Gaussian - they follow a multivariate t-distribution - hence, a dedicated generalized likelihood...... measurements available. In the latter case, isolation capability is shown to be reduced to group-wise isolation, but the method would still detect all serious events with the prescribed false alarm probability...

  7. Transforming Undergraduate Education Through the use of Analytical Reasoning (TUETAR)

    Science.gov (United States)

    Bishop, M. P.; Houser, C.; Lemmons, K.

    2015-12-01

    Traditional learning limits the potential for self-discovery, and the use of data and knowledge to understand Earth system relationships, processes, feedback mechanisms and system coupling. It is extremely difficult for undergraduate students to analyze, synthesize, and integrate quantitative information related to complex systems, as many concepts may not be mathematically tractable or have yet to be formalized. Conceptual models have long served as a means for Earth scientists to organize their understanding of Earth's dynamics, and have served as a basis for human analytical reasoning and landscape interpretation. Consequently, we evaluated the use of conceptual modeling, knowledge representation and analytical reasoning to provide undergraduate students with an opportunity to develop and test geocomputational conceptual models based upon their understanding of Earth science concepts. This study describes the use of geospatial technologies and fuzzy cognitive maps to predict desertification across the South Texas Sand Sheet in an upper-level geomorphology course. Students developed conceptual models based on their understanding of aeolian processes from lectures, and then compared and evaluated their modeling results against an expert conceptual model, its spatial predictions, and the observed distribution of dune activity in 2010. Students perceived that the analytical reasoning approach was significantly better for understanding desertification than traditional lectures, and that it promoted reflective learning, working with data, teamwork, student interaction, innovation, and creative thinking. Student evaluations support the notion that the adoption of knowledge representation and analytical reasoning in the classroom has the potential to transform undergraduate education by enabling students to formalize and test their conceptual understanding of Earth science. A model for developing and utilizing this geospatial technology approach in Earth science is presented.

  8. FORECASTING PILE SETTLEMENT ON CLAYSTONE USING NUMERICAL AND ANALYTICAL METHODS

    Directory of Open Access Journals (Sweden)

    Ponomarev Andrey Budimirovich

    2016-06-01

    Full Text Available In the article the problem of designing pile foundations on claystones is reviewed. The purpose of this paper is a comparative analysis of analytical and numerical methods for forecasting the settlement of piles on claystones. The following tasks were solved during the study: 1) the existing research on pile settlement was analyzed; 2) the characteristics of the experimental studies and the parameters for numerical modeling are presented, and the methods of field research on the operation of single piles are described; 3) the settlement of a single pile was calculated using numerical methods in the software package Plaxis 2D and the analytical method according to the requirements of SP 24.13330.2011; 4) the experimental data were compared with the results of the analytical and numerical calculations; 5) based on these results, recommendations for forecasting pile settlement on claystone are presented. Much attention is paid to the calculation of pile settlement considering the impacted areas in the ground space beside the pile, and to the comparison with the results of field experiments. Based on the obtained results, for the prediction of the settlement of a single pile on claystone the authors recommend using the analytical method given in SP 24.13330.2011, with account for the impacted areas in the ground space beside the driven pile. For forecasting the settlement of a single pile on claystone by numerical methods in Plaxis 2D, the authors recommend using the Hardening Soil model, again considering the impacted areas in the ground space beside the driven pile. The analyses of the results and calculations are presented for examination and verification; it is therefore necessary to continue the research on deep foundations at other experimental sites to improve the reliability of pile foundation settlement calculations. The work is of great interest for geotechnical engineers engaged in the research, design and construction of pile foundations.

  9. Improving acute kidney injury diagnostics using predictive analytics.

    Science.gov (United States)

    Basu, Rajit K; Gist, Katja; Wheeler, Derek S

    2015-12-01

    Acute kidney injury (AKI) is a multifactorial syndrome affecting an alarming proportion of hospitalized patients. Although early recognition may expedite management, the ability to identify at-risk patients and those suffering injury in real time is inconsistent. This review summarizes recent reports describing advances in AKI epidemiology, specifically focusing on risk scoring and predictive analytics. In the critical care population, the primary factors limiting prediction models are an inability to properly account for patient heterogeneity and underperforming metrics used to assess kidney function. Severity-of-illness scores demonstrate limited AKI predictive performance. Recent evidence suggests traditional methods for detecting AKI may be leveraged and ultimately replaced by newer, more sophisticated analytical tools capable of prediction and identification: risk stratification, novel AKI biomarkers, and clinical information systems. Additionally, the utility of novel biomarkers may be optimized through targeting using patient context, and may provide more granular information about the injury phenotype. Finally, manipulation of the electronic health record allows for real-time recognition of injury. Integrating a high-functioning clinical information system with risk stratification methodology and novel biomarkers yields a predictive analytic model for AKI diagnostics.

  10. Seamless Digital Environment - Plan for Data Analytics Use Case Study

    International Nuclear Information System (INIS)

    Oxstrand, Johanna Helene; Bly, Aaron Douglas

    2016-01-01

    The U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program initiated research into what is needed to provide a roadmap or model for nuclear power plants to reference when building an architecture that can support the growing data supply and demand flowing through their networks. The Digital Architecture project's published report, Digital Architecture Planning Model (Oxstrand et al., 2016), discusses considerations for building an architecture that supports the increasing needs and demands for data throughout the plant. Once the plant is able to support the data demands, it still needs to be able to provide the data in an easy, quick and reliable manner. A common approach is to create a "one stop shop" application that a user can go to for all the data they need, which leads to the need for a Seamless Digital Environment (SDE) to integrate all the "siloed" data. An SDE is the desired perception that should be presented to users by gathering the data from any data source (e.g., legacy applications and work management systems) without effort by the user. The goal for FY16 was to complete a feasibility study of data mining and analytics employing information from computer-based-procedure-enabled technologies for use in developing improved business analytics. The research team collaborated with multiple organizations to identify use cases or scenarios that could be beneficial to investigate in a feasibility study. Many interesting potential use cases were identified throughout the FY16 activity. Unfortunately, due to factors outside the research team's control, none of the studies were initiated this year. However, the insights gained and the relationships built with both PVNGS and NextAxiom will be valuable when moving forward with future research. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting it was identified that it would be very beneficial to the industry to

  11. Determination of uranium in ground water using different analytical techniques

    International Nuclear Information System (INIS)

    Sahu, S.K.; Maity, Sukanta; Bhangare, R.C.; Pandit, G.G.; Sharma, D.N.

    2014-10-01

    The concern over the presence of natural radionuclides like uranium in drinking water has been growing recently. The contamination of aquifers with radionuclides depends on a number of factors. The geology of an area is the most important factor, along with anthropogenic activities like mining, coal ash disposal from thermal power plants, the use of phosphate fertilizers, etc. Whatever the source, the presence of uranium in drinking water is a matter of great concern for public health. Studies show that uranium is a chemo-toxic and nephrotoxic heavy metal; this chemotoxicity affects the kidneys and bones in particular. Given the potential health hazards from natural radionuclides in drinking water, many countries worldwide have adopted the guideline activity concentrations for drinking water quality recommended by the WHO (2011). For uranium, the WHO has set a limit of 30 μg/L in drinking water. The geological distribution of uranium and its migration in the environment are of interest because the element raises environmental and exposure concerns. It is therefore desirable to use an analytical technique for uranium analysis in water that is highly sensitive, especially at trace levels, that is specific and precise in the presence of other naturally occurring major and trace metals, and that needs only a small amount of sample. Various analytical methods based on different techniques have been developed in the past for the determination of uranium in geological samples. The determination of uranium requires high selectivity due to its strong association with other elements. Several trace-level wet-chemistry analytical techniques have been reported for uranium determination, but most of these involve tedious and painstaking procedures, high detection limits, interferences, etc. Each analytical technique has its own merits and demerits, and comparative assessment by different techniques can provide better quality control and assurance.
In the present study, uranium was analysed in ground water samples

  12. Spin-Stabilized Spacecrafts: Analytical Attitude Propagation Using Magnetic Torques

    Directory of Open Access Journals (Sweden)

    Roberta Veloso Garcia

    2009-01-01

    Full Text Available An analytical approach to attitude propagation for spin-stabilized satellites is presented, considering the influence of the residual magnetic torque and the eddy-currents torque. Two approaches are used to examine the influence of external torques acting during the motion of the satellite, with the Earth's magnetic field described by the quadrupole model. In the first approach only the residual magnetic torque is included in the equations of motion, with the satellite in a circular or elliptical orbit. In the second approach only the eddy-currents torque is analyzed, with the satellite in a circular orbit. The inclusion of these torques in the dynamic equations of spin-stabilized satellites yields the conditions needed to derive an analytical solution. The solutions show that the residual torque does not affect the spin velocity magnitude, contributing only to the precession and drift of the spacecraft's spin axis, while the eddy-currents torque causes an exponential decay of the angular velocity magnitude. Numerical simulations performed with data from the Brazilian satellites SCD1 and SCD2 show the period over which the analytical solution can be used for attitude propagation, within the dispersion range of the performance of the attitude determination system of the Satellite Control Center of the Brazil National Research Institute.
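    The qualitative behaviour described above (eddy currents decay the spin magnitude exponentially, while the residual magnetic torque leaves it unchanged) can be sketched as follows. The values of omega0 and tau are hypothetical, not parameters of SCD1 or SCD2.

```python
import math

# Minimal sketch of the eddy-current decay behaviour: the angular
# velocity magnitude decays exponentially, omega(t) = omega0 * exp(-t/tau).
omega0 = 10.0   # initial spin rate, rpm (illustrative)
tau = 200.0     # decay time constant, days (illustrative)

def spin_rate(t_days):
    """Spin magnitude after t_days under the eddy-currents torque alone."""
    return omega0 * math.exp(-t_days / tau)

# Time for the spin rate to halve under this model.
half_life = tau * math.log(2)
```

    In this model the residual torque would change only the spin-axis direction (precession and drift), so the scalar spin_rate above is unaffected by it.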

  13. Promising designs of compact heat exchangers for modular HTRs using the Brayton cycle

    International Nuclear Information System (INIS)

    Pra, Franck; Tochon, Patrice; Mauget, Christian; Fokkens, Jan; Willemsen, Sander

    2008-01-01

    The presented study was carried out within Work Package 2, 'Recuperator', of the High Temperature Reactor-E European program. High Temperature gas cooled Reactor concepts with a direct cycle have become potentially interesting for the future, since theoretically these concepts provide higher efficiency than a classical steam cycle. Within the Brayton cycle, the helium/helium recuperator required to achieve the high efficiency has to work under very harsh conditions (temperature, pressure, and pressure difference between circuits). Within the project the most promising technologies for the compact recuperator were investigated. First, the requirements for the recuperator to operate under the direct Brayton cycle were defined. Based on these requirements, the various potential technologies available on the market were investigated, and two technologies (the HEATRIC Printed Circuit Heat Exchanger and the NORDON plate-fin concept) were selected as most promising. For the former, a precise description has been given and a mock-up has been fabricated and tested in the Claire loop at CEA, where the Printed Circuit Heat Exchanger mock-up was subjected to thermal shocks considered representative for a recuperator. Prior to the experimental testing, coupled Computational Fluid Dynamics (CFD) and Finite Element analyses were performed to give insight into the thermal and mechanical behaviour of the mock-up during the thermal shock; based on these results, the experimental measuring program was optimized. Upon completion of the tests, the experimental and numerical results were compared. Based on the results of the investigation, recommendations are given for the full-size recuperator using the selected technologies.

  14. Big data analytics : predicting traffic flow regimes from simulated connected vehicle messages using data analytics and machine learning.

    Science.gov (United States)

    2016-12-25

    The key objectives of this study were to: 1. Develop advanced analytical techniques that make use of a dynamically configurable connected vehicle message protocol to predict traffic flow regimes in near-real time in a virtual environment and examine ...

  15. Adolescent Cellphone Use While Driving: An Overview of the Literature and Promising Future Directions for Prevention

    Directory of Open Access Journals (Sweden)

    M. Kit Delgado

    2016-06-01

    Full Text Available Motor vehicle crashes are the leading cause of death in adolescents, and drivers aged 16–19 are the most likely to die in distracted driving crashes. This paper provides an overview of the literature on adolescent cellphone use while driving, focusing on the crash risk, incidence, risk factors for engagement, and the effectiveness of current mitigation strategies. We conclude by discussing promising future approaches to prevent crashes related to cellphone use in adolescents. Handheld manipulation of the phone while driving has been shown to have a 3 to 4-fold increased risk of a near crash or crash, and eye glance duration greater than 2 seconds increases crash risk exponentially. Nearly half of U.S. high school students admit to texting while driving in the last month, but the frequency of use according to vehicle speed and high-risk situations remains unknown. Several risk factors are associated with cell phone use while driving including: parental cellphone use while driving, social norms for quick responses to text messages, and higher levels of temporal discounting. Given the limited effectiveness of current mitigation strategies such as educational campaigns and legal bans, a multi-pronged behavioral and technological approach addressing the above risk factors will be necessary to reduce this dangerous behavior in adolescents.

  16. Adolescent Cellphone Use While Driving: An Overview of the Literature and Promising Future Directions for Prevention

    Science.gov (United States)

    Delgado, M. Kit; Wanner, Kathryn J.; McDonald, Catherine

    2016-01-01

    Motor vehicle crashes are the leading cause of death in adolescents, and drivers aged 16–19 are the most likely to die in distracted driving crashes. This paper provides an overview of the literature on adolescent cellphone use while driving, focusing on the crash risk, incidence, risk factors for engagement, and the effectiveness of current mitigation strategies. We conclude by discussing promising future approaches to prevent crashes related to cellphone use in adolescents. Handheld manipulation of the phone while driving has been shown to have a 3 to 4-fold increased risk of a near crash or crash, and eye glance duration greater than 2 seconds increases crash risk exponentially. Nearly half of U.S. high school students admit to texting while driving in the last month, but the frequency of use according to vehicle speed and high-risk situations remains unknown. Several risk factors are associated with cell phone use while driving including: parental cellphone use while driving, social norms for quick responses to text messages, and higher levels of temporal discounting. Given the limited effectiveness of current mitigation strategies such as educational campaigns and legal bans, a multi-pronged behavioral and technological approach addressing the above risk factors will be necessary to reduce this dangerous behavior in adolescents. PMID:27695663

  17. Analytical Evaluation of Beam Deformation Problem Using Approximate Methods

    DEFF Research Database (Denmark)

    Barari, Amin; Kimiaeifar, A.; Domairry, G.

    2010-01-01

    The beam deformation equation has very wide applications in structural engineering. As a differential equation, it has its own problems concerning existence, uniqueness and methods of solution. Often, the original forms of the governing differential equations used in engineering problems are simplified, and this process produces noise in the obtained answers. This paper deals with the solution of the second-order differential equation governing beam deformation using four approximate analytical methods, namely the Perturbation Method, the Homotopy Perturbation Method (HPM), the Homotopy Analysis Method (HAM) and the Variational Iteration Method (VIM). The comparisons of the results reveal that these methods are very effective, convenient and quite accurate for systems of non-linear differential equations.

  18. The use of decision analytic techniques in energy policy decisions

    International Nuclear Information System (INIS)

    Haemaelaeinen, R.P.; Seppaelaeinen, T.O.

    1986-08-01

    The report reviews decision analytic techniques and their applications to energy policy decision making. Decision analysis consists of techniques for structuring the essential elements of a decision problem and mathematical methods for ranking the alternatives from a set of simple judgments. Because modeling subjective judgments is characteristic of decision analysis, the models can incorporate qualitative factors and values, which escape traditional energy modeling. Decision analysis has been applied to choices among energy supply alternatives, siting energy facilities, selecting nuclear waste repositories, selecting research and development projects, risk analysis, and prioritizing alternative energy futures. Many applications have been carried out in universities and research institutions, but during the 1970s the use of decision analysis spread to both the public and the private sector. The settings in which decision analysis has been applied range from aiding a single decision maker to clarifying opposing points of view. Decision analytic methods have also been linked with energy models. The most valuable result of decision analysis is the clarification of the problem at hand. Political decisions cannot be made solely on the basis of models, but models can be used to gain insight into the decision situation. Models inevitably simplify reality, so they must be regarded only as aids to judgment. So far there has been only one decision analysis of energy policy issues in Finland with actual political decision makers as participants. The experience of this project and numerous foreign applications do, however, suggest that the decision analytic approach is useful in energy policy questions. The report presents a number of Finnish energy policy decisions where decision analysis might prove useful. However, the applicability of the methods depends crucially on the actual circumstances at hand.

  19. Environmental vulnerability assessment using Grey Analytic Hierarchy Process based model

    International Nuclear Information System (INIS)

    Sahoo, Satiprasad; Dhar, Anirban; Kar, Amlanjyoti

    2016-01-01

    Environmental management of an area describes a policy for its systematic and sustainable environmental protection. In the present study, a regional environmental vulnerability assessment of the Hirakud command area of Odisha, India is carried out based on the Grey Analytic Hierarchy Process method (Grey–AHP) using integrated remote sensing (RS) and geographic information system (GIS) techniques. Grey–AHP combines the advantages of the classical analytic hierarchy process (AHP) and the grey clustering method for accurate estimation of weight coefficients, and is a new method for environmental vulnerability assessment. The environmental vulnerability index (EVI) uses natural, environmental and human-impact-related factors, e.g., soil, geology, elevation, slope, rainfall, temperature, wind speed, normalized difference vegetation index, drainage density, crop intensity, agricultural DRASTIC value, population density and road density. The EVI map has been classified into four environmental vulnerability zones (EVZs), namely ‘low’, ‘moderate’, ‘high’, and ‘extreme’, encompassing 17.87%, 44.44%, 27.81% and 9.88% of the study area, respectively. The EVI map indicates that the northern part of the study area is more vulnerable from an environmental point of view, and shows a close correlation with elevation. The effectiveness of the zone classification is evaluated using the grey clustering method; general effectiveness lies between the “better” and “common” classes. This analysis demonstrates the potential applicability of the methodology. - Highlights: • Environmental vulnerability zone identification based on the Grey Analytic Hierarchy Process (AHP) • Effectiveness evaluation by means of a grey clustering method with support from AHP • Use of the grey approach eliminates excessive dependency on the experience of experts.
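    The classical-AHP step inside Grey–AHP derives the weight coefficients from the principal eigenvector of a pairwise comparison matrix. A minimal sketch of that step, with an illustrative 3×3 comparison matrix (say, slope vs. rainfall vs. drainage density) that is not taken from the study:

```python
import numpy as np

# Illustrative pairwise comparison matrix on Saaty's 1-9 scale:
# A[i, j] is how much more important criterion i is than criterion j,
# with A[j, i] = 1 / A[i, j] (reciprocal matrix).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

# Classical AHP: weights come from the principal eigenvector.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # index of the principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # normalised criterion weights

# Saaty's consistency index CI = (lambda_max - n) / (n - 1);
# small CI means the judgments are nearly consistent.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
```

    Grey–AHP then refines these judgment-derived weights with grey clustering, which is the part the abstract credits with reducing dependence on expert experience.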

  20. Environmental vulnerability assessment using Grey Analytic Hierarchy Process based model

    Energy Technology Data Exchange (ETDEWEB)

    Sahoo, Satiprasad [School of Water Resources, Indian Institute of Technology Kharagpur (India); Dhar, Anirban, E-mail: anirban.dhar@gmail.com [Department of Civil Engineering, Indian Institute of Technology Kharagpur (India); Kar, Amlanjyoti [Central Ground Water Board, Bhujal Bhawan, Faridabad, Haryana (India)

    2016-01-15

    Environmental management of an area describes a policy for its systematic and sustainable environmental protection. In the present study, a regional environmental vulnerability assessment of the Hirakud command area of Odisha, India is carried out based on the Grey Analytic Hierarchy Process method (Grey–AHP) using integrated remote sensing (RS) and geographic information system (GIS) techniques. Grey–AHP combines the advantages of the classical analytic hierarchy process (AHP) and the grey clustering method for accurate estimation of weight coefficients, and is a new method for environmental vulnerability assessment. The environmental vulnerability index (EVI) uses natural, environmental and human-impact-related factors, e.g., soil, geology, elevation, slope, rainfall, temperature, wind speed, normalized difference vegetation index, drainage density, crop intensity, agricultural DRASTIC value, population density and road density. The EVI map has been classified into four environmental vulnerability zones (EVZs), namely ‘low’, ‘moderate’, ‘high’, and ‘extreme’, encompassing 17.87%, 44.44%, 27.81% and 9.88% of the study area, respectively. The EVI map indicates that the northern part of the study area is more vulnerable from an environmental point of view, and shows a close correlation with elevation. The effectiveness of the zone classification is evaluated using the grey clustering method; general effectiveness lies between the “better” and “common” classes. This analysis demonstrates the potential applicability of the methodology. - Highlights: • Environmental vulnerability zone identification based on the Grey Analytic Hierarchy Process (AHP) • Effectiveness evaluation by means of a grey clustering method with support from AHP • Use of the grey approach eliminates excessive dependency on the experience of experts.

  1. Soil Scientific Research Methods Used in Archaeology – Promising Soil Biochemistry: a Mini-review

    Directory of Open Access Journals (Sweden)

    Valerie Vranová

    2015-01-01

    Full Text Available This work seeks to review soil scientific methods that have been and are still being used in archaeology. This review paper aims at emphasising the importance of soil science practice to archaeology, thus adding a scientific analytical nature to the cultural nature of archaeology. Common methods (physical, chemical and biochemical) used to analyse archaeological soils and artefacts are touched on, and their strengths and shortcomings are duly noted to become the base for future research. Furthermore, the authors emphasise distinctive excavating/sampling methods; biochemical analyses focused on distinctive features of plough-land and on soil organic matter mineralization; the Counter-Immunoelectrophoresis (CIEP) method for testing the presence of proteins; carbon analyses such as carbon-14 dating techniques; soil phosphorus studies; and geochemical analyses of hematite (Fe2O3) and cinnabar (HgS) contents. It is obvious that the future of archaeology is in the soil, because the soil harbours information about the past; hence the synergy between soil and archaeological research has to be strengthened, and archaeology should be made a prime agenda by soil scientists by expanding the scope of analyses of total phosphorus extraction and giving attention to soil magnetism.

  2. Empowering Personalized Medicine with Big Data and Semantic Web Technology: Promises, Challenges, and Use Cases.

    Science.gov (United States)

    Panahiazar, Maryam; Taslimitehrani, Vahid; Jadhav, Ashutosh; Pathak, Jyotishman

    2014-10-01

    In healthcare, big data tools and technologies have the potential to create significant value by improving outcomes while lowering costs for each individual patient. Diagnostic images, genetic test results and biometric information are increasingly generated and stored in electronic health records, presenting us with data that are by nature high in volume, variety and velocity, thereby necessitating novel ways to store, manage and process big data. This presents an urgent need to develop new, scalable and expandable big data infrastructure and analytical methods that can enable healthcare providers to access knowledge for the individual patient, yielding better decisions and outcomes. In this paper, we briefly discuss the nature of big data and the role of the semantic web and data analysis in generating "smart data", which offer actionable information that supports better decisions for personalized medicine. In our view, the biggest challenge is to create a system that makes big data robust and smart for healthcare providers and patients, one that can lead to more effective clinical decision-making, improved health outcomes and, ultimately, better management of healthcare costs. We highlight some of the challenges in using big data and propose the need for a semantic data-driven environment to address them. We illustrate our vision with practical use cases, and discuss a path for empowering personalized medicine using big data and semantic web technology.

  3. Identifying promising accessions of cherry tomato: a sensory strategy using consumers and chefs.

    Science.gov (United States)

    Rocha, Mariella C; Deliza, Rosires; Ares, Gastón; Freitas, Daniela De G C; Silva, Aline L S; Carmo, Margarida G F; Abboud, Antonio C S

    2013-06-01

    An increased production of cherry and gourmet tomato cultivars that are harvested at advanced colour stages and sold at a higher price has been observed in the last 10 years. In this context, producers need information on the sensory characteristics of new cultivars and their perception by potential consumers. The aim of the present work was to obtain a sensory characterisation of nine cherry tomato cultivars produced under Brazilian organic cultivation conditions from a chef and consumer perspective. Nine organic cherry tomato genotypes were evaluated by ten chefs using an open-ended question and by 110 consumers using a check-all-that-apply question. Both methodologies provided similar information on the sensory characteristics of the cherry tomato accessions. The superimposed representation of the samples in a multiple factor analysis was similar for consumers' and chefs' descriptions (RV coefficient 0.728), although they used different methodologies. According to both panels, cherry tomatoes were sorted into five groups of samples with similar sensory characteristics. Results from the present work may provide information to help organic producers in the selection of the most promising cultivars for cultivation, taking into account consumers' and chefs' perceptions, as well as in the design of communication and marketing strategies. © 2012 Society of Chemical Industry.

  4. Heat Conduction Analysis Using Semi Analytical Finite Element Method

    International Nuclear Information System (INIS)

    Wargadipura, A. H. S.

    1997-01-01

    Heat conduction problems are very often found in science and engineering fields. It is of crucial importance to obtain quantitative descriptions of this important physical phenomenon. This paper discusses the development and application of a numerical formulation and computation that can be used to analyze heat conduction problems. The mathematical equation governing the physical behaviour of heat conduction is a second-order partial differential equation. The numerical solution in this paper is obtained using the finite element method combined with Fourier series, an approach known as the semi-analytical finite element method. The discretization results in a system of simultaneous algebraic equations, which is solved using Gaussian elimination. The computer implementation is carried out in FORTRAN. In the final part of the paper, a heat conduction problem in a rectangular plate domain with isothermal boundary conditions on its edges is solved to demonstrate the application of the computer program developed, and a comparison with the analytical solution is discussed to assess the accuracy of the numerical solution obtained.
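    A minimal sketch of the overall procedure, using a 1-D steady heat conduction problem with isothermal (zero-temperature) boundaries in place of the paper's rectangular plate: assemble linear finite elements and solve the resulting system of algebraic equations by Gaussian elimination (here via NumPy's dense solver). The values of k and q are illustrative.

```python
import numpy as np

# Steady 1-D heat conduction: -k*T'' = q on [0, 1], T(0) = T(1) = 0.
k, q, n = 1.0, 1.0, 20          # conductivity, heat source, element count
h = 1.0 / n
nodes = np.linspace(0.0, 1.0, n + 1)

K = np.zeros((n + 1, n + 1))    # global conductivity (stiffness) matrix
f = np.zeros(n + 1)             # global load vector
for e in range(n):              # assemble linear two-node elements
    K[e:e+2, e:e+2] += (k / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    f[e:e+2] += q * h / 2.0

# Apply the isothermal (Dirichlet) boundary conditions T = 0.
K[0, :], K[-1, :] = 0.0, 0.0
K[0, 0] = K[-1, -1] = 1.0
f[0] = f[-1] = 0.0

T = np.linalg.solve(K, f)       # Gaussian elimination step
T_exact = q * nodes * (1.0 - nodes) / (2.0 * k)   # analytical solution
```

    For this constant-source 1-D problem the nodal finite element values coincide with the analytical solution, so the comparison step the paper describes reduces to checking T against T_exact.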

  5. Selection of power market structure using the analytic hierarchy process

    International Nuclear Information System (INIS)

    Subhes Bhattacharyya; Prasanta Kumar Dey

    2003-01-01

Selection of a power market structure from the available alternatives is an important activity within an overall power sector reform program. The evaluation criteria for selection are both subjective and objective in nature, and the alternatives are characterised by their conflicting nature. This study demonstrates a methodology for power market structure selection using the analytic hierarchy process, a multiple attribute decision-making technique, to model the selection with the active participation of relevant stakeholders in a workshop environment. The methodology is applied to a hypothetical case of a State Electricity Board reform in India. (author)
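The core AHP computation such a study relies on, deriving priority weights from a pairwise-comparison matrix and checking its consistency, can be sketched as follows; the matrix entries and the random-index table are illustrative values, not from the workshop:

```python
import numpy as np

def ahp_weights(A):
    """Priority weights from a pairwise-comparison matrix via the
    principal eigenvector, plus Saaty's consistency ratio (CR)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                   # normalize to sum to 1
    ci = (eigvals[k].real - n) / (n - 1)           # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.24)  # random index (Saaty)
    return w, ci / ri

# three hypothetical market structures compared on one criterion
A = [[1, 3, 5],
     [1/3, 1, 3],
     [1/5, 1/3, 1]]
w, cr = ahp_weights(A)
print(w.round(3), round(cr, 3))
```

A consistency ratio below 0.1 is conventionally taken to mean the stakeholders' pairwise judgments are acceptably consistent.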

  6. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  7. PROMISING ACCESSIONS OF CHAENOMELES AND THEIR USE IN THE FUNCTIONAL FOOD

    Directory of Open Access Journals (Sweden)

    V. N. Sorokopudov

    2017-01-01

Full Text Available A complex analysis is crucial for obtaining new resistant varieties and developing recommendations for the use of the fruit of Chaenomeles. The task of this study was to assess the productivity and fruit quality of selected forms of Chaenomeles in Central Russia, to determine the feasibility of a no-waste technology for fruit processing, and to evaluate the appropriateness of using the fruit in functional food products. The studies were conducted in 2012-2016 in the Botanical Garden of the National Research University "BelGU" (Belgorod), at the FGBNU VSTISP and at the GBS N.V. Tsitsina. Six selected forms of Chaenomeles, obtained from free pollination of the ‘Calif’ variety (used as a control), served as the study material. The study was carried out according to the generally accepted methodology of varietal studies, along with the authors' own methodical developments. A sufficiently high nutritional and biological value of the Chaenomeles fruit was observed. At the same time, the mineral, carbohydrate and vitamin content of whole fruits exceeds that of the juice squeezed from the pulp. The results obtained allow us to conclude that it is advisable to organize a no-waste technology for processing Chaenomeles fruit, which can serve as one of the components for the enrichment of food products. Thus, a comprehensive assessment of the biological properties and productivity of the breeding forms of Chaenomeles has shown that they exceed the parent variety in stability and can be regarded as a promising vitamin-rich fruit crop, suitable for various processing methods and for use in functional and therapeutic-prophylactic nutrition, especially in obtaining natural low-calorie foods.

  8. Optimizing multi-pinhole SPECT geometries using an analytical model

    International Nuclear Information System (INIS)

    Rentmeester, M C M; Have, F van der; Beekman, F J

    2007-01-01

State-of-the-art multi-pinhole SPECT devices allow for sub-mm resolution imaging of radiotracer distributions in small laboratory animals. The optimization of multi-pinhole and detector geometries using simulations based on ray-tracing or Monte Carlo algorithms is time-consuming, particularly because many system parameters need to be varied. As an efficient alternative we develop a continuous analytical model of a pinhole SPECT system with a stationary detector set-up, which we apply to focused imaging of a mouse. The model assumes that the multi-pinhole collimator and the detector both have the shape of a spherical layer, and uses analytical expressions for effective pinhole diameters, sensitivity and spatial resolution. For fixed fields-of-view, a pinhole-diameter-adapting feedback loop allows for the comparison of the system resolution of different systems at equal system sensitivity, and vice versa. The model predicts that (i) for optimal resolution or sensitivity the collimator layer with pinholes should be placed as closely as possible around the animal given a fixed detector layer, (ii) with high-resolution detectors a resolution improvement of up to 31% can be achieved compared to optimized systems, (iii) high-resolution detectors can be placed close to the collimator without significant resolution losses, and (iv) interestingly, systems with a physical pinhole diameter of 0 mm can have an excellent resolution when high-resolution detectors are used.

  9. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals.

    Science.gov (United States)

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G; Sandberg, Sverre; Sölétormos, György

    2018-01-01

Background: Many clinical decisions are based on comparison of patient results with reference intervals. Therefore, an estimation of the analytical performance specifications for the quality that would be required to allow sharing common reference intervals is needed. The International Federation of Clinical Chemistry (IFCC) recommended a minimum of 120 reference individuals to establish reference intervals. This number implies a certain level of quality, which could then be used for defining analytical performance specifications as the maximum combination of analytical bias and imprecision required for sharing common reference intervals, the aim of this investigation. Methods: Two methods were investigated for defining the maximum combination of analytical bias and imprecision that would give the same quality of common reference intervals as the IFCC recommendation. Method 1 is based on a formula for the combination of analytical bias and imprecision, and Method 2 is based on the Microsoft Excel formula NORMINV, including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results: Method 2 gives the correct results, with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion: The Microsoft Excel formula NORMINV is useful for the estimation of analytical performance specifications for both Gaussian and log-Gaussian distributions of reference intervals.
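The underlying idea, computing the fraction of a reference population flagged outside shared limits once analytical bias and imprecision are added, can be sketched with Python's standard-library NormalDist in place of Excel's NORMINV. The ±1.96 SD limits and the bias/imprecision model below are illustrative assumptions, not the paper's exact formula:

```python
from math import sqrt
from statistics import NormalDist

Z = NormalDist()  # standard Gaussian

def fraction_outside(bias, imprecision, z=1.96):
    """Fraction of a Gaussian reference population falling outside common
    reference limits (±z SD) when results carry an added analytical bias
    and imprecision, both normalized to the reference SD."""
    sd = sqrt(1.0 + imprecision ** 2)       # combined SD of measured results
    lower = Z.cdf((-z - bias) / sd)         # fraction below the lower limit
    upper = 1.0 - Z.cdf((z - bias) / sd)    # fraction above the upper limit
    return lower + upper

print(round(fraction_outside(0.0, 0.0), 4))   # 0.05 with perfect analytics
print(round(fraction_outside(0.25, 0.5), 4))  # grows with bias/imprecision
```

Holding this fraction at a fixed target (the paper's 4.4%) traces out the maximum allowable combinations of bias and imprecision.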

  10. Transcutaneous Measurement of Blood Analyte Concentration Using Raman Spectroscopy

    Science.gov (United States)

    Barman, Ishan; Singh, Gajendra P.; Dasari, Ramachandra R.; Feld, Michael S.

    2008-11-01

Diabetes mellitus is a chronic disorder, affecting nearly 200 million people worldwide. Acute complications, such as hypoglycemia, cardiovascular disease and retinal damage, may occur if the disease is not adequately controlled. As diabetes has no known cure, tight control of glucose levels is critical for the prevention of such complications. Given the necessity of regular monitoring of blood glucose, development of non-invasive glucose detection devices is essential to improve the quality of life of diabetic patients. The commercially available glucose sensors measure interstitial fluid glucose by electrochemical detection. However, these sensors have severe limitations, primarily related to their invasive nature and lack of stability. This necessitates the development of a truly non-invasive glucose detection technique. NIR Raman spectroscopy, which combines the substantial penetration depth of NIR light with the excellent chemical specificity of Raman spectroscopy, provides an excellent tool to meet the challenges involved. Additionally, it enables simultaneous determination of multiple blood analytes. Our laboratory has pioneered the use of Raman spectroscopy for blood analyte detection in biological media. The preliminary success of our non-invasive glucose measurements both in vitro (such as in serum and blood) and in vivo has provided the foundation for the development of feasible clinical systems. However, successful application of this technology still faces a few hurdles, highlighted by the problems of tissue luminescence and selection of an appropriate reference concentration. In this article we explore possible avenues to overcome these challenges so that prospective prediction accuracy of blood analytes can be brought to clinically acceptable levels.

  11. Consistency of FMEA used in the validation of analytical procedures.

    Science.gov (United States)

    Oldenhof, M T; van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Vredenbregt, M J; Weda, M; Barends, D M

    2011-02-20

In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection-Mass Spectrometry (HPLC-DAD-MS) analytical procedure used in the quality control of medicines. Each team was free to define its own ranking scales for the severity (S), occurrence (O), and detection (D) of failure modes. We calculated Risk Priority Numbers (RPNs) and identified failure modes above the 90th percentile of RPN values as needing urgent corrective action, and failure modes falling between the 75th and 90th percentiles as needing necessary corrective action. Team 1 and Team 2 identified five and six failure modes needing urgent corrective action, respectively, with two commonly identified. Of the failure modes needing necessary corrective action, about a third were commonly identified by both teams. These results show inconsistency in the outcome of the FMEA. To improve consistency, we recommend that an FMEA is always carried out under the supervision of an experienced FMEA facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating that this inconsistency is not always a drawback. Copyright © 2010 Elsevier B.V. All rights reserved.
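The RPN ranking and percentile cut-offs described above can be sketched as follows; the (S, O, D) scores are hypothetical, since each team defined its own scales:

```python
import numpy as np

def classify_failure_modes(sod):
    """Rank failure modes by Risk Priority Number (RPN = S x O x D) and
    flag those above the 90th (urgent) and 75th (necessary) percentiles."""
    rpn = np.array([s * o * d for s, o, d in sod])
    p90, p75 = np.percentile(rpn, [90, 75])
    urgent = np.where(rpn > p90)[0]
    necessary = np.where((rpn > p75) & (rpn <= p90))[0]
    return rpn, urgent, necessary

# hypothetical (S, O, D) scores on a 1-10 scale for ten failure modes
modes = [(9, 6, 7), (3, 2, 4), (8, 8, 6), (2, 2, 2), (5, 4, 6),
         (7, 3, 5), (4, 4, 4), (6, 7, 8), (3, 3, 3), (5, 5, 5)]
rpn, urgent, necessary = classify_failure_modes(modes)
print(rpn, urgent, necessary)
```

Because each team chooses its own scales, the same failure mode can land in different percentile bands for different teams, which is exactly the inconsistency the study observed.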

  12. Surgical trauma induces overgrowth in lower limb gigantism: regulation with use of rapamycin is promising.

    Science.gov (United States)

    Pinto, Rohan Sebastian; Harrison, William David; Graham, Kenneth; Nayagam, Durai

    2018-01-04

We describe an unclassified overgrowth syndrome characterised by unregulated growth of dermal fibroblasts in the lower limbs of a 35-year-old woman. A PIK3CA gene mutation resulted in lower limb gigantism. Below the waist she weighed 117 kg, with each leg measuring over 100 cm in circumference; her legs accounted for roughly 50% of her total adiposity. Liposuction and surgical debulking were performed to reduce the size of the limbs but exacerbated the overgrowth. Systemic sepsis from an infected foot ulcer necessitated treatment by an above-knee amputation. Postoperatively, the stump increased in size by 19 kg. A trial of rapamycin to reverse the growth of the stump has shown promise. We discuss the clinical and genetic features of this previously unclassified disorder and the orthopaedic considerations involved. © BMJ Publishing Group Ltd (unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  13. Multiattribute Supplier Selection Using Fuzzy Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Serhat Aydin

    2010-11-01

Full Text Available Supplier selection is a multiattribute decision making (MADM) problem which contains both qualitative and quantitative factors, and it has vital importance for most companies. The aim of this paper is to provide an AHP-based analytical tool for decision support, enabling an effective multicriteria supplier selection process in an air conditioner seller firm under fuzziness. The Analytic Hierarchy Process (AHP) under fuzziness is employed because it permits an evaluation scale that includes linguistic expressions, crisp numerical values, fuzzy numbers and range numerical values. This scale provides a more flexible evaluation than other fuzzy AHP methods. In this study, the modified AHP was used in supplier selection in an air conditioner firm: three experts evaluated the suppliers according to the proposed model and the most appropriate supplier was selected. The proposed model enables decision makers to select the best supplier among supplier firms effectively. We confirm that the modified fuzzy AHP is appropriate for group decision making in supplier selection problems.

  14. Using Big Data Analytics to Advance Precision Radiation Oncology.

    Science.gov (United States)

    McNutt, Todd R; Benedict, Stanley H; Low, Daniel A; Moore, Kevin; Shpitser, Ilya; Jiang, Wei; Lakshminarayanan, Pranav; Cheng, Zhi; Han, Peijin; Hui, Xuan; Nakatsugawa, Minoru; Lee, Junghoon; Moore, Joseph A; Robertson, Scott P; Shah, Veeraj; Taylor, Russ; Quon, Harry; Wong, John; DeWeese, Theodore

    2018-06-01

    Big clinical data analytics as a primary component of precision medicine is discussed, identifying where these emerging tools fit in the spectrum of genomics and radiomics research. A learning health system (LHS) is conceptualized that uses clinically acquired data with machine learning to advance the initiatives of precision medicine. The LHS is comprehensive and can be used for clinical decision support, discovery, and hypothesis derivation. These developing uses can positively impact the ultimate management and therapeutic course for patients. The conceptual model for each use of clinical data, however, is different, and an overview of the implications is discussed. With advancements in technologies and culture to improve the efficiency, accuracy, and breadth of measurements of the patient condition, the concept of an LHS may be realized in precision radiation therapy. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. Transition Analysis of Promising U.S. Future Fuel Cycles Using ORION

    International Nuclear Information System (INIS)

    Sunny, Eva E.; Worrall, Andrew; Peterson, Joshua L.; Powers, Jeffrey J.; Gehin, Jess C.; Gregg, Robert

    2015-01-01

    The US Department of Energy Office of Fuel Cycle Technologies performed an evaluation and screening (E&S) study of nuclear fuel cycle options to help prioritize future research and development decisions. Previous work for this E&S study focused on establishing equilibrium conditions for analysis examples of 40 nuclear fuel cycle evaluation groups (EGs) and evaluating their performance according to a set of 22 standardized metrics. Following the E&S study, additional studies are being conducted to assess transitioning from the current US fuel cycle to future fuel cycle options identified by the E&S study as being most promising. These studies help inform decisions on how to effectively achieve full transition, estimate the length of time needed to undergo transition from the current fuel cycle, and evaluate performance of nuclear systems and facilities in place during the transition. These studies also help identify any barriers to achieve transition. Oak Ridge National Laboratory (ORNL) Fuel Cycle Options Campaign team used ORION to analyze the transition pathway from the existing US nuclear fuel cycle—the once-through use of low-enriched-uranium (LEU) fuel in thermal-spectrum light water reactors (LWRs)—to a new fuel cycle with continuous recycling of plutonium and uranium in sodium fast reactors (SFRs). This paper discusses the analysis of the transition from an LWR to an SFR fleet using ORION, highlights the role of lifetime extensions of existing LWRs to aid transition, and discusses how a slight delay in SFR deployment can actually reduce the time to achieve an equilibrium fuel cycle.

  16. Transition analysis of promising U.S. future fuel cycles using ORION - 5114

    International Nuclear Information System (INIS)

    Sunny, E.; Worrall, A.; Peterson, J.; Powers, J.; Gehin, J.

    2015-01-01

The US Department of Energy Office of Fuel Cycle Technologies performed an evaluation and screening (E/S) study of nuclear fuel cycle options to help prioritize future research and development decisions. Previous work for this E/S study focused on establishing equilibrium conditions for analysis examples of 40 nuclear fuel cycle evaluation groups and evaluating their performance according to a set of 22 standardized metrics. Following the E/S study, additional studies are being conducted to assess the transition from the current US fuel cycle to future fuel cycle options identified by the E/S study as being most promising. These studies help inform decisions on how to effectively achieve full transition, estimate the length of time needed to undergo transition from the current fuel cycle, and evaluate performance of nuclear systems and facilities in place during the transition. These studies also help identify any barriers to achieve transition. Oak Ridge National Laboratory (ORNL) Fuel Cycle Options Campaign team used ORION to analyze the transition pathway from the existing US nuclear fuel cycle - the once-through use of low-enriched-uranium (LEU) fuel in thermal-spectrum light water reactors (LWRs) - to a new fuel cycle with continuous recycling of plutonium and uranium in sodium fast reactors (SFRs). This paper discusses the analysis of the transition from an LWR to an SFR fleet using ORION, highlights the role of lifetime extensions of existing LWRs to aid transition, and discusses how a slight delay in SFR deployment can actually reduce the time to achieve an equilibrium fuel cycle. (authors)

  17. Many-core graph analytics using accelerated sparse linear algebra routines

    Science.gov (United States)

    Kozacik, Stephen; Paolini, Aaron L.; Fox, Paul; Kelmelis, Eric

    2016-05-01

Graph analytics is a key component in identifying emerging trends and threats in many real-world applications. Large-scale graph analytics frameworks provide a convenient and highly scalable platform for developing algorithms to analyze large datasets. Although conceptually scalable, these techniques exhibit poor performance on modern computational hardware. Another model of graph computation has emerged that promises improved performance and scalability by using abstract linear algebra operations as the basis for graph analysis, as laid out by the GraphBLAS standard. By using sparse linear algebra as the basis, existing highly efficient algorithms can be adapted to perform computations on the graph. This approach, however, is often less intuitive to graph analytics experts, who are accustomed to vertex-centric APIs such as Giraph, GraphX, and Tinkerpop. We are developing an implementation of the high-level operations supported by these APIs in terms of linear algebra operations. This implementation is backed by many-core implementations of the fundamental GraphBLAS operations required, and offers the advantages of both the intuitive programming model of a vertex-centric API and the performance of a sparse linear algebra implementation. This technology can reduce the number of nodes required, as well as the run-time for a graph analysis problem, enabling customers to perform more complex analysis with less hardware at lower cost. All of this can be accomplished without the requirement for the customer to make any changes to their analytics code, thanks to the compatibility with existing graph APIs.
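The linear-algebra formulation can be illustrated with breadth-first search, the canonical GraphBLAS example: each step multiplies the (transposed) adjacency matrix by the current frontier vector. This sketch uses scipy.sparse rather than a GraphBLAS library, and the small graph is made up:

```python
import numpy as np
from scipy.sparse import csr_matrix

def bfs_levels(adj, source):
    """Breadth-first search expressed as repeated sparse matrix-vector
    products over a Boolean semiring (the GraphBLAS-style formulation)."""
    n = adj.shape[0]
    level = np.full(n, -1)               # -1 marks unvisited vertices
    frontier = np.zeros(n, dtype=bool)
    frontier[source] = True
    depth = 0
    while frontier.any():
        level[frontier] = depth
        # next frontier = neighbours of current frontier, minus visited
        frontier = (adj.T @ frontier).astype(bool) & (level == -1)
        depth += 1
    return level

# small directed graph: 0->1, 0->2, 1->3, 2->3, 3->4
rows, cols = [0, 0, 1, 2, 3], [1, 2, 3, 3, 4]
A = csr_matrix((np.ones(5), (rows, cols)), shape=(5, 5))
print(bfs_levels(A, 0))  # [0 1 1 2 3]
```

The vertex-centric view ("each frontier vertex sends to its neighbours") and the matrix-vector product compute the same thing, which is what lets one API be implemented in terms of the other.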

  18. Structural Analysis of Composite Laminates using Analytical and Numerical Techniques

    Directory of Open Access Journals (Sweden)

    Sanghi Divya

    2016-01-01

Full Text Available A laminated composite material consists of different layers of matrix and fibres. Its properties can vary considerably with each layer's or ply's orientation, material properties and the number of layers itself. The present paper focuses on a novel approach of incorporating an analytical method to arrive at a preliminary ply layup order of a composite laminate, which acts as feeder data for the further detailed analysis done with FEA tools. The equations used in our MATLAB code are based on analytical study and supply results that are, with a high degree of probability, remarkably close to the final optimized layup found through extensive FEA. This reduces significant computing time and saves considerable FEA processing to obtain efficient results quickly. The result output by our method also provides the user with conditions that predict the successive failure sequence of the composite plies, a result option which is not available even in popular FEM tools. The predicted results are further verified by testing the laminates in the laboratory, and the results are found to be in good agreement.

  19. Evaluating supplier quality performance using fuzzy analytical hierarchy process

    Science.gov (United States)

    Ahmad, Nazihah; Kasim, Maznah Mat; Rajoo, Shanmugam Sundram Kalimuthu

    2014-12-01

Evaluating supplier quality performance is vital in ensuring continuous supply chain improvement, reducing operational costs and risks, and meeting customers' expectations. This paper aims to illustrate an application of the Fuzzy Analytical Hierarchy Process to prioritize the evaluation criteria in the context of automotive manufacturing in Malaysia. Five main criteria were identified: quality, cost, delivery, customer service and technology support. These criteria were arranged into a hierarchical structure and evaluated by an expert. The relative importance of each criterion was determined by using linguistic variables represented as triangular fuzzy numbers. The Center of Gravity defuzzification method was used to convert the fuzzy evaluations into their corresponding crisp values. Such fuzzy evaluation can be used as a systematic tool to overcome the uncertainty in evaluating suppliers' performance, which is usually associated with subjective human judgments.
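The Center of Gravity step is the simplest part to show concretely: for a triangular fuzzy number the centroid is just the mean of its three vertices. The linguistic scale below is a hypothetical example, not the one used in the study:

```python
def centroid_tfn(l, m, u):
    """Center-of-gravity defuzzification of a triangular fuzzy number
    (l, m, u): the centroid of a triangle is the mean of its vertices."""
    return (l + m + u) / 3.0

# hypothetical fuzzy linguistic scale for criterion importance
scale = {
    "moderately important": (1.0, 3.0, 5.0),
    "important":            (3.0, 5.0, 7.0),
    "very important":       (5.0, 7.0, 9.0),
}
crisp = {term: centroid_tfn(*tfn) for term, tfn in scale.items()}
print(crisp)  # {'moderately important': 3.0, 'important': 5.0, 'very important': 7.0}
```

The crisp values obtained this way can then feed a conventional AHP weighting of the five criteria.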

  20. Spectral interference of zirconium on 24 analyte elements using CCD based ICP-AES technique

    International Nuclear Information System (INIS)

    Adya, V.C.; Sengupta, Arijit; Godbole, S.V.

    2014-01-01

In the present studies, the spectral interference of zirconium on different analytical lines of 24 critical analytes using a CCD-based ICP-AES technique is described. Suitable analytical lines for zirconium were identified along with their detection limits. The sensitivity and detection limits of the analytical channels for different elements in the presence of a Zr matrix were calculated. Subsequently, analytical lines with the least interference from Zr and better detection limits were selected for the determinations. (author)
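Detection limits of the kind reported here are commonly estimated from blank replicates and the calibration sensitivity; below is a minimal sketch of the usual 3-sigma convention. The blank intensities and slope are invented, and the paper does not state which convention it used:

```python
import statistics

def detection_limit(blank_signals, slope):
    """IUPAC-style detection limit estimate: 3 x the standard deviation of
    blank measurements divided by the calibration sensitivity (slope)."""
    return 3.0 * statistics.stdev(blank_signals) / slope

# hypothetical blank intensities (counts) and sensitivity (counts per mg/L)
blanks = [102.0, 98.0, 101.0, 99.0, 100.0]
print(round(detection_limit(blanks, slope=250.0), 5))  # → 0.01897
```

In an interference study, the same line's detection limit is recomputed in the presence of the matrix (here Zr), and lines whose limits degrade least are preferred.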

  1. Promising Practices in Higher Education: Art Education and Human Rights Using Information, Communication Technologies (ICT)

    Science.gov (United States)

    Black, Joanna; Cap, Orest

    2014-01-01

Promising pedagogical practices are described in relation to incorporating ICT (Information and Communication Technologies) into the study of Human Rights issues in Visual Arts Education for teacher candidates. As part of a course, "Senior Years Art," students at the Faculty of Education, University of Manitoba during 2013-2014…

  2. Pellet manufacturing by extrusion-spheronization using process analytical technology

    DEFF Research Database (Denmark)

    Sandler, Niklas; Rantanen, Jukka; Heinämäki, Jyrki

    2005-01-01

The aim of this study was to investigate the phase transitions occurring in nitrofurantoin and theophylline formulations during pelletization by extrusion-spheronization. An at-line process analytical technology (PAT) approach was used to increase the understanding of the solid-state behavior of the active pharmaceutical ingredients (APIs) during pelletization. Raman spectroscopy, near-infrared (NIR) spectroscopy, and X-ray powder diffraction (XRPD) were used in the characterization of polymorphic changes during the process. Samples were collected at the end of each processing stage (blending, granulation, extrusion, spheronization, and drying). Batches were dried at 3 temperature levels (60 degrees C, 100 degrees C, and 135 degrees C). Water induced a hydrate formation in both model formulations during processing. NIR spectroscopy gave valuable real-time data about the state of water in the system...

  3. Optimizing an immersion ESL curriculum using analytic hierarchy process.

    Science.gov (United States)

    Tang, Hui-Wen Vivian

    2011-11-01

    The main purpose of this study is to fill a substantial knowledge gap regarding reaching a uniform group decision in English curriculum design and planning. A comprehensive content-based course criterion model extracted from existing literature and expert opinions was developed. Analytical hierarchy process (AHP) was used to identify the relative importance of course criteria for the purpose of tailoring an optimal one-week immersion English as a second language (ESL) curriculum for elementary school students in a suburban county of Taiwan. The hierarchy model and AHP analysis utilized in the present study will be useful for resolving several important multi-criteria decision-making issues in planning and evaluating ESL programs. This study also offers valuable insights and provides a basis for further research in customizing ESL curriculum models for different student populations with distinct learning needs, goals, and socioeconomic backgrounds. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. Optimal design of supply chain network under uncertainty environment using hybrid analytical and simulation modeling approach

    Science.gov (United States)

    Chiadamrong, N.; Piyathanavong, V.

    2017-12-01

Models that aim to optimize the design of supply chain networks have gained more interest in the supply chain literature. Mixed-integer linear programming and discrete-event simulation are widely used for such optimization problems. We present a hybrid approach to support decisions for supply chain network design using a combination of analytical and discrete-event simulation models. The proposed approach is based on iterative procedures that continue until the difference between subsequent solutions satisfies a pre-determined termination criterion. The effectiveness of the proposed approach is illustrated by an example, which yields results closer to optimal, with much shorter solving time, than those obtained from a conventional simulation-based optimization model. The efficacy of this hybrid approach is promising, and it can be applied as a powerful tool in designing a real supply chain network. It also provides the possibility to model and solve more realistic problems, which incorporate dynamism and uncertainty.

  5. Using Learning Analytics to Understand Scientific Modeling in the Classroom

    Directory of Open Access Journals (Sweden)

    David Quigley

    2017-11-01

Full Text Available Scientific models represent ideas, processes, and phenomena by describing important components, characteristics, and interactions. Models are constructed across various scientific disciplines, such as the food web in biology, the water cycle in Earth science, or the structure of the solar system in astronomy. Models are central for scientists to understand phenomena, construct explanations, and communicate theories. Constructing and using models to explain scientific phenomena is also an essential practice in contemporary science classrooms. Our research explores new techniques for understanding scientific modeling and engagement with modeling practices. We work with students in secondary biology classrooms as they use a web-based software tool, EcoSurvey, to characterize organisms and their interrelationships found in their local ecosystem. We use learning analytics and machine learning techniques to answer the following questions: (1) How can we automatically measure the extent to which students' scientific models support complete explanations of phenomena? (2) How does the design of student modeling tools influence the complexity and completeness of students' models? (3) How do clickstreams reflect and differentiate student engagement with modeling practices? We analyzed EcoSurvey usage data collected from two different deployments with over 1,000 secondary students across a large urban school district. We observe large variations in the completeness and complexity of student models, and large variations in their iterative refinement processes. These differences reveal that certain key model features are highly predictive of other aspects of the model. We also observe large differences in student modeling practices across different classrooms and teachers. We can predict a student's teacher based on the observed modeling practices with a high degree of accuracy without significant tuning of the predictive model. These results highlight

  6. Special concrete shield selection using the analytic hierarchy process

    International Nuclear Information System (INIS)

    Abulfaraj, W.H.

    1994-01-01

Special types of concrete radiation shields that depend on locally available materials and have improved properties for both neutron and gamma-ray attenuation were developed by using plastic materials and heavy ores. The analytic hierarchy process (AHP) is implemented to evaluate these types and select the best biological radiation shield for nuclear reactors. Factors affecting the selection decision are the degree of protection against neutrons, the degree of protection against gamma rays, the suitability of the concrete as a building material, and economic considerations. The seven concrete alternatives are barite-polyethylene concrete, barite-polyvinyl chloride (PVC) concrete, barite-portland cement concrete, pyrite-polyethylene concrete, pyrite-PVC concrete, pyrite-portland cement concrete, and ordinary concrete. The AHP analysis shows the superiority of pyrite-polyethylene concrete over the others.

  7. Using Visual Analytics to Maintain Situation Awareness in Astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Aragon, Cecilia R.; Poon, Sarah S.; Aldering, Gregory S.; Thomas, Rollin C.; Quimby, Robert

    2008-07-01

We present a novel collaborative visual analytics application for cognitively overloaded users in the astrophysics domain. The system was developed for scientists needing to analyze heterogeneous, complex data under time pressure, and then make predictions and time-critical decisions rapidly and correctly under a constant influx of changing data. The Sunfall Data Taking system utilizes several novel visualization and analysis techniques to enable a team of geographically distributed domain specialists to effectively and remotely maneuver a custom-built instrument under challenging operational conditions. Sunfall Data Taking has been in use for over eighteen months by a major international astrophysics collaboration (the largest data volume supernova search currently in operation), and has substantially improved the operational efficiency of its users. We describe the system design process by an interdisciplinary team, the system architecture, and the results of an informal usability evaluation of the production system by domain experts in the context of Endsley's three levels of situation awareness.

  8. Developing an Emergency Physician Productivity Index Using Descriptive Health Analytics.

    Science.gov (United States)

    Khalifa, Mohamed

    2015-01-01

    Emergency department (ED) crowding has become a major barrier to receiving timely emergency care. At King Faisal Specialist Hospital and Research Center, Saudi Arabia, we identified variables and factors affecting crowding and performance in order to develop indicators that support evaluation and improvement. To measure the efficiency of work and the activity of throughput processes, it was important to develop an ED physician productivity index. Data on all ED patient encounters over the last six months of 2014 were retrieved, and descriptive health analytics methods were used. Three variables were identified for their influence on productivity and performance: Number of Treated Patients per Physician, Patient Acuity Level, and Treatment Time. The study suggested a formula that calculates each physician's productivity index by dividing the Number of Treated Patients by the square of the Patient Acuity Level and the Treatment Time, in order to identify physicians with a low productivity index and investigate the underlying causes and factors.
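    The suggested formula can be sketched as follows; the function and variable names, units (hours), and sample values are illustrative assumptions, not the study's data:

```python
# Hedged sketch of the productivity index described above: Number of
# Treated Patients divided by the square of the Patient Acuity Level
# and by the Treatment Time. All names and values are hypothetical.

def productivity_index(treated_patients, acuity_level, treatment_hours):
    """Return a unitless productivity score for one physician."""
    return treated_patients / (acuity_level ** 2 * treatment_hours)

physicians = [
    {"name": "A", "patients": 120, "acuity": 2.0, "hours": 160},
    {"name": "B", "patients": 90,  "acuity": 3.0, "hours": 160},
]
for p in physicians:
    p["index"] = productivity_index(p["patients"], p["acuity"], p["hours"])

# Sort ascending so low-index physicians surface first for investigation.
ranked = sorted(physicians, key=lambda p: p["index"])
```

Note that with triage scales where level 1 is the most acute, squaring the acuity level weights the index toward physicians treating sicker patients.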

  9. Analytical methods used in plutonium purification cycles by trilaurylamine

    International Nuclear Information System (INIS)

    Perez, J.J.

    1965-01-01

    The use of trilaurylamine as a solvent extractant for the purification of plutonium has required the development of a set of analytical methods involving various techniques. The organic impurities of the solvent can be titrated by gas-liquid chromatography. The titration of the main degradation product, dilaurylamine, can also be accomplished by spectro-colorimetry. Potentiometry is used for the analysis of the different amine salts (nitrate, sulfate, bisulfate) as well as the extracted nitric acid. The determination of nitrate in the aqueous phase is carried out by constant-current potentiometry. The range of application, the accuracy and the procedure of these analyses are described in the present report. (author) [fr

  10. Analytical Tools for the Routine Use of the UMAE

    International Nuclear Information System (INIS)

    D'Auria, F.; Eramo, A.; Gianotti, W.

    1998-01-01

    UMAE (Uncertainty Methodology based on Accuracy Extrapolation) is a methodology developed to calculate the uncertainties related to thermal-hydraulic code results in nuclear power plant transient analysis. Use of the methodology has shown the need for specific analytical tools that simplify some steps of its application and clarify the individual procedures adopted in its development. Three of these tools have recently been completed and are illustrated in this paper. The first makes it possible to attribute ''weight factors'' to the experimental Integral Test Facilities; results are also shown. The second deals with the calculation of the accuracy of code results: the associated computer program compares experimental and calculated trends of any quantity and gives the accuracy of the calculation as output. The third is a computer program suitable for obtaining continuous uncertainty bands from single-valued points. (author)

  11. Using analytic continuation for the hadronic vacuum polarization computation

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Xu; Hashimoto, Shoji; Hotzel, Grit; Jansen, Karl; Petschlies, Marcus; B, Renner Dru

    2014-11-01

    We present two examples of applications of the analytic continuation method for computing the hadronic vacuum polarization function in space- and time-like momentum regions. These examples are the Adler function and the leading order hadronic contribution to the muon anomalous magnetic moment. We comment on the feasibility of the analytic continuation method and provide an outlook for possible further applications.

  12. Broadening the Scope and Increasing the Usefulness of Learning Analytics:

    Science.gov (United States)

    Ellis, Cath

    2013-01-01

    Learning analytics is a relatively new field of inquiry and its precise meaning is both contested and fluid (Johnson, Smith, Willis, Levine & Haywood, 2011; LAK, n.d.). Ferguson (2012) suggests that the best working definition is that offered by the first Learning Analytics and Knowledge (LAK) conference: "the measurement, collection,…

  13. Use of telemedicine-based care for the aging and elderly: promises and pitfalls

    Directory of Open Access Journals (Sweden)

    Bujnowska-Fedak MM

    2015-05-01

    Full Text Available Maria Magdalena Bujnowska-Fedak, Urszula Grata-Borkowska Department of Family Medicine, Wroclaw Medical University, Wroclaw, Poland Abstract: Telemedicine-based care provides remote health and social care to maintain people's autonomy and increase their quality of life. The rapidly aging population has come with a significant increase in the prevalence of chronic diseases and their effects, and thus the need for increased care and welfare. The elderly have become one of the main target groups for telecare technologies. Smart home systems allow older adults to live in the environment of their choice and protect them against institutionalization or placement in a nursing home. It gives the elderly person a feeling of reassurance and safety, and appears to be one of the most promising approaches to facilitate independent living in a community-dwelling situation. Telecare solutions give a new opportunity for diagnosis, treatment, education, and rehabilitation, and make it possible to monitor patients with a number of chronic diseases. It also reduces socioeconomic disparity with regard to access to care and gives equal chances to patients from urban and rural areas. However, although telecare has undisputed benefits, it also has some limitations. Older people are often resistant to use of new technology, in particular acquiring the knowledge and skills necessary for use of electronic devices and computer systems. Further, privacy and security are important elements when building confidence in telemedicine systems. Leaking of sensitive information, such as health or test results, may have a negative and far-reaching impact on the personal and professional life of the patient. Telemedicine-based care should now be personalized for the needs, capabilities, and preferences of the elderly, with adaptation over time as care needs evolve. If technologies are introduced that are familiar, usable, desirable, and cost-effective, and able to be adapted to

  14. Using complaints to enhance quality improvement: developing an analytical tool.

    Science.gov (United States)

    Hsieh, Sophie Yahui

    2012-01-01

    This study aims to construct an instrument for identifying certain attributes or capabilities that might enable healthcare staff to use complaints to improve service quality. PubMed and ProQuest were searched, which in turn expanded access to other literature. Three paramount dimensions emerged for healthcare quality management systems: managerial, operational, and technical (MOT). The paper reveals that the managerial dimension relates to quality improvement program infrastructure. It contains strategy, structure, leadership, people and culture. The operational dimension relates to implementation processes: organizational changes and barriers when using complaints to enhance quality. The technical dimension emphasizes the skills, techniques or information systems required to successfully achieve continuous quality improvement. The MOT model was developed by drawing from the relevant literature. However, individuals have different training, interests and experiences and, therefore, there will be variance between researchers when generating the MOT model. The MOT components can be the guidelines for examining whether patient complaints are used to improve service quality. However, the model needs testing and validating by conducting further research before becoming a theory. Empirical studies on patient complaints did not identify any analytical tool that could be used to explore how complaints can drive quality improvement. This study developed an instrument for identifying certain attributes or capabilities that might enable healthcare professionals to use complaints and improve service quality.

  15. Trace detection of analytes using portable raman systems

    Science.gov (United States)

    Alam, M. Kathleen; Hotchkiss, Peter J.; Martin, Laura E.; Jones, David Alexander

    2015-11-24

    Apparatuses and methods for in situ detection of a trace amount of an analyte are disclosed herein. In a general embodiment, the present disclosure provides a surface-enhanced Raman spectroscopy (SERS) insert including a passageway therethrough, where the passageway has a SERS surface positioned therein. The SERS surface is configured to adsorb molecules of an analyte of interest. A concentrated sample is caused to flow over the SERS surface. The SERS insert is then provided to a portable Raman spectroscopy system, where it is analyzed for the analyte of interest.

  16. Analytic treatment of nonlinear evolution equations using first ...

    Indian Academy of Sciences (India)

    — journal of physics, July 2012, pp. 3–17. Analytic treatment of nonlinear evolution ... Eskisehir Osmangazi University, Art-Science Faculty, Department of Mathematics ... Eq. (2.2) is integrated, where the integration constants are taken to be zero.

  17. Structural level characterization of base oils using advanced analytical techniques

    KAUST Repository

    Hourani, Nadim; Muller, Hendrik; Adam, Frederick M.; Panda, Saroj K.; Witt, Matthias; Al-Hajji, Adnan A.; Sarathy, Mani

    2015-01-01

    cyclotron resonance mass spectrometry (FT-ICR MS) equipped with atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) sources. First, the capabilities and limitations of each analytical technique were evaluated

  18. An analytical study of various telecommunication networks using Markov models

    International Nuclear Information System (INIS)

    Ramakrishnan, M; Jayamani, E; Ezhumalai, P

    2015-01-01

    The main aim of this paper is to examine issues relating to the performance of various telecommunication networks, applying queuing theory for better design and improved efficiency. First, an analytical study of queues quantifies the phenomenon of waiting lines using representative measures of performance, such as average queue length (the average number of customers in the queue), average waiting time in the queue, and average facility utilization (the proportion of time the service facility is in use). Second, using the Matlab simulator, the paper summarizes the findings of the investigations and describes a methodology to (a) compare the waiting time and average number of messages in the queue for M/M/1 and M/M/2 queues, (b) compare the performance of M/M/1 and M/D/1 queues, and (c) study the effect of increasing the number of servers on the blocking probability of the M/M/k/k queue model. (paper)
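    The measures compared in such studies follow from standard queuing formulas. A minimal sketch with illustrative arrival and service rates (not the paper's data), covering M/M/1 averages and the Erlang-B blocking probability of an M/M/k/k loss system:

```python
# Textbook M/M/1 formulas: utilization rho = lam/mu, mean number in
# system L = rho/(1-rho), mean time in system W = 1/(mu-lam).

def mm1_metrics(lam, mu):
    """Average number in system (L) and average time in system (W)."""
    rho = lam / mu            # server utilization, must be < 1
    L = rho / (1 - rho)       # mean number of customers in the system
    W = 1 / (mu - lam)        # mean time a customer spends in the system
    return L, W

def erlang_b(a, k):
    """Blocking probability of an M/M/k/k loss system with offered load a.

    Uses the numerically stable recurrence
    B(n) = a*B(n-1) / (n + a*B(n-1)), B(0) = 1.
    """
    b = 1.0
    for n in range(1, k + 1):
        b = a * b / (n + a * b)
    return b

L, W = mm1_metrics(lam=4.0, mu=5.0)   # rho = 0.8
```

For offered load a = 2 and k = 2 servers, the direct Erlang-B formula (a²/2!)/(1 + a + a²/2) gives 2/5, which the recurrence reproduces.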

  19. Data analytics using canonical correlation analysis and Monte Carlo simulation

    Science.gov (United States)

    Rickman, Jeffrey M.; Wang, Yan; Rollett, Anthony D.; Harmer, Martin P.; Compson, Charles

    2017-07-01

    A canonical correlation analysis is a generic parametric model used in the statistical analysis of data involving interrelated or interdependent input and output variables. It is especially useful in data analytics as a dimensional reduction strategy that simplifies a complex, multidimensional parameter space by identifying relatively few combinations of variables that are maximally correlated. One shortcoming of the canonical correlation analysis, however, is that it provides only a linear combination of variables that maximizes these correlations. With this in mind, we describe here a versatile, Monte-Carlo based methodology that is useful in identifying non-linear functions of the variables that lead to strong input/output correlations. We demonstrate that our approach leads to a substantial enhancement of correlations, as illustrated by two experimental applications of substantial interest to the materials science community, namely: (1) determining the interdependence of processing and microstructural variables associated with doped polycrystalline aluminas, and (2) relating microstructural descriptors to the electrical and optoelectronic properties of thin-film solar cells based on CuInSe2 absorbers. Finally, we describe how this approach facilitates experimental planning and process control.
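    The Monte-Carlo idea can be illustrated with a minimal stand-in (not the authors' implementation): randomly sample candidate power transforms of an input variable and keep whichever maximizes the absolute Pearson correlation with the output. The data and transform family here are invented for illustration:

```python
import math
import random

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def mc_best_transform(xs, ys, trials=200, seed=0):
    """Monte-Carlo search over power transforms x -> x**p."""
    rng = random.Random(seed)
    best = (abs(pearson(xs, ys)), "identity")  # linear baseline
    for _ in range(trials):
        p = rng.uniform(0.5, 3.0)              # random candidate exponent
        r = abs(pearson([x ** p for x in xs], ys))
        if r > best[0]:
            best = (r, f"x**{p:.2f}")
    return best

xs = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
ys = [x ** 2 for x in xs]                      # y depends on x quadratically
r_best, label = mc_best_transform(xs, ys)
```

The search can only improve on the linear baseline; with the quadratic toy data, sampled exponents near 2 push the correlation toward 1.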

  20. Supercritical boiler material selection using fuzzy analytic network process

    Directory of Open Access Journals (Sweden)

    Saikat Ranjan Maity

    2012-08-01

    Full Text Available The recent development of the world is being adversely affected by the scarcity of power and energy. To survive in the next generation, it is thus necessary to explore non-conventional energy sources and efficiently consume the available sources. For efficient exploitation of the existing energy sources, great scope lies in the use of Rankine cycle-based thermal power plants. Today, the gross efficiency of Rankine cycle-based thermal power plants is less than 28%, which has been increased up to 40% with reheating and regenerative cycles. But it can be further improved up to 47% by using supercritical power plant technology. Supercritical power plants use supercritical boilers which are able to withstand a very high temperature (650-720˚C) and pressure (22.1 MPa) while producing superheated steam. The thermal efficiency of a supercritical boiler greatly depends on the material of its different components. The supercritical boiler material should possess high creep rupture strength, high thermal conductivity, low thermal expansion, high specific heat and the ability to withstand very high temperatures. This paper considers a list of seven supercritical boiler materials whose performance is evaluated based on seven pivotal criteria. Given the intricacy and difficulty of this supercritical boiler material selection problem, which has interactions and interdependencies between different criteria, this paper applies the fuzzy analytic network process to select the most appropriate material for a supercritical boiler. Rene 41 is found to be the best supercritical boiler material, whereas Haynes 230 is the least preferred choice.

  1. Evaluating supplier quality performance using analytical hierarchy process

    Science.gov (United States)

    Kalimuthu Rajoo, Shanmugam Sundram; Kasim, Maznah Mat; Ahmad, Nazihah

    2013-09-01

    This paper elaborates the importance of evaluating supplier quality performance to an organization. Supplier quality performance evaluation reflects the actual performance of the supplier exhibited at the customer's end. It is critical in enabling the organization to determine the areas of improvement and thereafter work with the supplier to close the gaps. The success of the customer partly depends on the supplier's quality performance. Key criteria such as quality, cost, delivery, technology support and customer service are categorized as the main factors contributing to a supplier's quality performance. Eighteen suppliers manufacturing automotive application parts were evaluated in 2010 using a weighted-point system. Several suppliers received identical ratings, which led to tied rankings. The Analytical Hierarchy Process (AHP), a user-friendly decision-making tool for complex and multi-criteria problems, was used to evaluate the suppliers' quality performance, challenging the weighted-point system used for the 18 suppliers. The consistency ratio was checked for criteria and sub-criteria. The final AHP results contained no overlapping ratings and therefore yielded a better decision-making methodology than the weighted-point rating system.
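    The core AHP computation, including the consistency check mentioned above, can be sketched as follows. The criteria and pairwise judgments here are invented for illustration, not the paper's data; the priorities use the common geometric-mean approximation rather than the exact principal eigenvector:

```python
import math

# Hypothetical 3-criterion pairwise comparison matrix on Saaty's 1-9
# scale; A[i][j] states how strongly criterion i is preferred over j.
criteria = ["quality", "cost", "delivery"]
A = [
    [1.0,   3.0,   5.0],
    [1 / 3, 1.0,   3.0],
    [1 / 5, 1 / 3, 1.0],
]

n = len(A)
gm = [math.prod(row) ** (1 / n) for row in A]      # row geometric means
total = sum(gm)
weights = [g / total for g in gm]                  # normalized priorities

# Approximate the principal eigenvalue to compute the consistency ratio.
Aw = [sum(A[i][j] * weights[j] for j in range(n)) for i in range(n)]
lambda_max = sum(Aw[i] / weights[i] for i in range(n)) / n
CI = (lambda_max - n) / (n - 1)                    # consistency index
RI = 0.58                                          # Saaty's random index for n = 3
CR = CI / RI                                       # judgments acceptable when CR < 0.10
```

For this matrix the priorities come out roughly (0.64, 0.26, 0.10) with CR well below 0.10, i.e. the judgments are consistent enough to use.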

  2. Using online analytical processing to manage emergency department operations.

    Science.gov (United States)

    Gordon, Bradley D; Asplin, Brent R

    2004-11-01

    The emergency department (ED) is a unique setting in which to explore and evaluate the utility of information technology to improve health care operations. A potentially useful software tool in managing this complex environment is online analytical processing (OLAP). An OLAP system has the ability to provide managers, providers, and researchers with the necessary information to make decisions quickly and effectively by allowing them to examine patterns and trends in operations and patient flow. OLAP software quickly summarizes and processes data acquired from a variety of data sources, including computerized ED tracking systems. It allows the user to form a comprehensive picture of the ED from both system-wide and patient-specific perspectives and to interactively view the data using an approach that meets his or her needs. This article describes OLAP software tools and provides examples of potential OLAP applications for care improvement projects, primarily from the perspective of the ED. While OLAP is clearly a helpful tool in the ED, it is far more useful when integrated into the larger continuum of health information systems across a hospital or health care delivery system.
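    As a toy illustration of the summarize-and-drill-down behavior the article attributes to OLAP, the sketch below aggregates hypothetical ED tracking records along chosen dimensions; the record fields and values are invented, and a real OLAP system would do this over a cube, not a Python list:

```python
from collections import defaultdict

# Hypothetical ED tracking records; field names are illustrative.
visits = [
    {"hour": 9,  "acuity": "high", "wait_min": 12},
    {"hour": 9,  "acuity": "low",  "wait_min": 45},
    {"hour": 10, "acuity": "high", "wait_min": 8},
    {"hour": 10, "acuity": "high", "wait_min": 15},
]

def roll_up(records, dims, measure):
    """Group records by the given dimensions and average the measure."""
    sums, counts = defaultdict(float), defaultdict(int)
    for r in records:
        key = tuple(r[d] for d in dims)
        sums[key] += r[measure]
        counts[key] += 1
    return {k: sums[k] / counts[k] for k in sums}

# System-wide view: average wait by acuity alone.
by_acuity = roll_up(visits, ["acuity"], "wait_min")
# Drill down by adding a dimension: average wait by hour and acuity.
by_hour_acuity = roll_up(visits, ["hour", "acuity"], "wait_min")
```

Adding or removing dimensions in `dims` mirrors the interactive drill-down and roll-up that lets a manager move between system-wide and patient-flow views.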

  3. USING ANALYTIC HIERARCHY PROCESS (AHP METHOD IN RURAL DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Tülay Cengiz

    2003-04-01

    Full Text Available Rural development is a body of economic and social policies aimed at improving living conditions in rural areas by enabling the rural population to enjoy the economic, social, cultural and technological blessings of city life in place, without migrating. As this description indicates, rural development is a very broad concept. Therefore, in development efforts the problem should be stated clearly and analyzed, and many criteria should be evaluated by experts. The Analytic Hierarchy Process (AHP) can be utilized at these stages of development efforts. AHP is a multi-criteria decision-making method. After decomposing a problem into smaller pieces, the relative importance and level of importance of two compared elements are determined. It allows the evaluation of both qualitative and quantitative factors. At the same time, it permits the ideas of many experts to be gathered and used in the decision process. Because of these features, the AHP method can be used in rural development work. In this article, cultural factors, an important component of rural development that is often ignored in many studies, were evaluated as an example. As a result of these applications and evaluations, it is concluded that the AHP method can be helpful in rural development efforts.

  4. Measurement of company effectiveness using analytic network process method

    Directory of Open Access Journals (Sweden)

    Goran Janjić

    2017-07-01

    Full Text Available The sustainable development of an organisation is monitored through the organisation’s performance, which beforehand incorporates all stakeholders’ requirements in its strategy. The strategic management concept enables organisations to monitor and evaluate their effectiveness along with efficiency by monitoring of the implementation of set strategic goals. In the process of monitoring and measuring effectiveness, an organisation can use multiple-criteria decision-making methods as help. This study uses the method of analytic network process (ANP to define the weight factors of the mutual influences of all the important elements of an organisation’s strategy. The calculation of an organisation’s effectiveness is based on the weight factors and the degree of fulfilment of the goal values of the strategic map measures. New business conditions influence the changes in the importance of certain elements of an organisation’s business in relation to competitive advantage on the market, and on the market, increasing emphasis is given to non-material resources in the process of selection of the organisation’s most important measures.

  5. Measurement of company effectiveness using analytic network process method

    Science.gov (United States)

    Goran, Janjić; Zorana, Tanasić; Borut, Kosec

    2017-07-01

    The sustainable development of an organisation is monitored through the organisation's performance, which beforehand incorporates all stakeholders' requirements in its strategy. The strategic management concept enables organisations to monitor and evaluate their effectiveness along with efficiency by monitoring of the implementation of set strategic goals. In the process of monitoring and measuring effectiveness, an organisation can use multiple-criteria decision-making methods as help. This study uses the method of analytic network process (ANP) to define the weight factors of the mutual influences of all the important elements of an organisation's strategy. The calculation of an organisation's effectiveness is based on the weight factors and the degree of fulfilment of the goal values of the strategic map measures. New business conditions influence the changes in the importance of certain elements of an organisation's business in relation to competitive advantage on the market, and on the market, increasing emphasis is given to non-material resources in the process of selection of the organisation's most important measures.

  6. Using Google Analytics to evaluate the impact of the CyberTraining project.

    Science.gov (United States)

    McGuckin, Conor; Crowley, Niall

    2012-11-01

    A focus on results and impact should be at the heart of every project's approach to research and dissemination. This article discusses the potential of Google Analytics (GA: http://google.com/analytics ) as an effective resource for measuring the impact of academic research output and understanding the geodemographics of users of specific Web 2.0 content (e.g., intervention and prevention materials, health promotion and advice). This article presents the results of GA analyses as a resource used in measuring the impact of the EU-funded CyberTraining project, which provided a well-grounded, research-based training manual on cyberbullying for trainers through the medium of a Web-based eBook ( www.cybertraining-project.org ). The training manual includes review information on cyberbullying, its nature and extent across Europe, analyses of current projects, and provides resources for trainers working with the target groups of pupils, parents, teachers, and other professionals. Results illustrate the promise of GA as an effective tool for measuring the impact of academic research and project output with real potential for tracking and understanding intra- and intercountry regional variations in the uptake of prevention and intervention materials, thus enabling precision focusing of attention to those regions.

  7. Bio-analytical applications of mid-infrared spectroscopy using silver halide fiber-optic probes

    International Nuclear Information System (INIS)

    Heise, H.M.; Kuepper, L.; Butvina, L.N.

    2002-01-01

    Infrared spectroscopy has proved to be a powerful method for the study of various biomedical samples, in particular for in-vitro analysis in the clinical laboratory and for non-invasive diagnostics. In general, the analysis of biofluids such as whole blood, urine, microdialysates and bioreactor broth media takes advantage of the fact that a multitude of analytes can be quantified simultaneously and rapidly without the need for reagents. Progress in the quality of infrared silver halide fibers enabled us to construct several flexible fiber-optic probes of different geometries, which are particularly suitable for the measurement of small biosamples. Recent trends show that dry film measurements by mid-infrared spectroscopy could revolutionize analytical tools in the clinical chemistry laboratory, and an example is given. Infrared diagnostic tools show promising potential for patients; in particular, minimally invasive blood glucose assays and skin tissue pathology should not be overlooked as applications of mid-infrared fiber-based probes. Other applications include the measurement of skin samples, including penetration studies of vitamins and constituents of cosmetic cream formulations. A further field is the micro-domain analysis of biopsy samples from bog-mummified corpses, and recent results on the chemistry of dermis and hair samples are reported. Another field of application, for which results are reported, is food analysis and bioreactor monitoring

  8. Behavior-Based Budget Management Using Predictive Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Troy Hiltbrand

    2013-03-01

    Historically, the mechanisms used for forecasting have relied primarily on two factors as a basis for future predictions: time and money. While time and money are very important aspects of determining future budgetary spend patterns, organizations represent a complex system of unique individuals with a myriad of associated behaviors, and all of these behaviors have a bearing on how budget is utilized. When looking at forecasted budgets, it becomes a guessing game how budget managers will behave under a given set of conditions. This becomes relatively messy when human nature is introduced, as different managers will react very differently under similar circumstances. While one manager becomes ultra-conservative during periods of financial austerity, another might be unfazed and continue to spend as they have in the past. Both might revert to a state of budgetary protectionism, masking what is truly happening at the budget-holder level, in order to keep as much budget and influence as possible while sacrificing the greater good of the organization. To predict future outcomes more accurately, models should consider not only time and money but also the behavioral patterns that have been observed across the organization. The field of predictive analytics is poised to provide the tools and methodologies organizations need to do just this: capture and leverage behaviors of the past to predict the future.

  9. Real-time analysis of healthcare using big data analytics

    Science.gov (United States)

    Basco, J. Antony; Senthilkumar, N. C.

    2017-11-01

    Big Data Analytics (BDA) provides a tremendous advantage where revolutionary performance is needed in handling large amounts of data covering the four characteristics of volume, velocity, variety, and veracity. BDA has the ability to handle such dynamic data, providing operational effectiveness and exceptionally beneficial output in several day-to-day applications for various organizations. Healthcare is one of the sectors that generates data constantly, covering all four characteristics with outstanding growth. There are several challenges in processing patient records, which involve a variety of structured and unstructured formats. Inducing BDA into Healthcare (HBDA) will deal with sensitive patient-driven information, mostly in unstructured format, comprising prescriptions, reports, data from imaging systems, etc.; the challenges will be overcome by big data with enhanced efficiency in fetching and storing data. In this project, datasets similar to Electronic Medical Records (EMR) produced by numerous medical devices and mobile applications will be ingested into MongoDB using the Hadoop framework, with an improved processing technique to improve the outcome of processing patient records.

  10. Podium: Ranking Data Using Mixed-Initiative Visual Analytics.

    Science.gov (United States)

    Wall, Emily; Das, Subhajit; Chawla, Ravish; Kalidindi, Bharath; Brown, Eli T; Endert, Alex

    2018-01-01

    People often rank and order data points as a vital part of making decisions. Multi-attribute ranking systems are a common tool used to make these data-driven decisions. Such systems often take the form of a table-based visualization in which users assign weights to the attributes representing the quantifiable importance of each attribute to a decision, which the system then uses to compute a ranking of the data. However, these systems assume that users are able to quantify their conceptual understanding of how important particular attributes are to a decision. This is not always easy or even possible for users to do. Rather, people often have a more holistic understanding of the data. They form opinions that data point A is better than data point B but do not necessarily know which attributes are important. To address these challenges, we present a visual analytic application to help people rank multi-variate data points. We developed a prototype system, Podium, that allows users to drag rows in the table to rank order data points based on their perception of the relative value of the data. Podium then infers a weighting model using Ranking SVM that satisfies the user's data preferences as closely as possible. Whereas past systems help users understand the relationships between data points based on changes to attribute weights, our approach helps users to understand the attributes that might inform their understanding of the data. We present two usage scenarios to describe some of the potential uses of our proposed technique: (1) understanding which attributes contribute to a user's subjective preferences for data, and (2) deconstructing attributes of importance for existing rankings. Our proposed approach makes powerful machine learning techniques more usable to those who may not have expertise in these areas.
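    The inference step described above can be sketched with a minimal stand-in; this is not Podium's code, but a perceptron-style approximation of the Ranking-SVM idea of learning attribute weights from the pairwise ordering constraints implied by the user's drag-to-rank interaction (rows and attributes are invented):

```python
# Each user-supplied ordering implies pairwise constraints: every row
# must score higher than every row ranked below it under the learned
# linear weights. A perceptron update enforces violated constraints.

def infer_weights(ranked_rows, epochs=200, lr=0.1):
    """ranked_rows: attribute vectors ordered best-first by the user."""
    d = len(ranked_rows[0])
    w = [0.0] * d
    for _ in range(epochs):
        for hi in range(len(ranked_rows)):
            for lo in range(hi + 1, len(ranked_rows)):
                diff = [a - b for a, b in zip(ranked_rows[hi], ranked_rows[lo])]
                score = sum(wi * di for wi, di in zip(w, diff))
                if score <= 0:  # preferred row not scored higher: update
                    w = [wi + lr * di for wi, di in zip(w, diff)]
    return w

# User drags three rows into a best-to-worst order.
rows = [[3.0, 1.0], [2.0, 5.0], [1.0, 2.0]]
w = infer_weights(rows)
scores = [sum(wi * x for wi, x in zip(w, r)) for r in rows]
```

The learned weights reveal which attributes explain the user's holistic ordering; here the first attribute dominates, matching the intended ranking.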

  11. A Framework for Learning Analytics Using Commodity Wearable Devices.

    Science.gov (United States)

    Lu, Yu; Zhang, Sen; Zhang, Zhiqiang; Xiao, Wendong; Yu, Shengquan

    2017-06-14

    We advocate for and introduce LEARNSense, a framework for learning analytics using commodity wearable devices to capture the learner's physical actions and accordingly infer learner context (e.g., student activities and engagement status in class). Our work is motivated by the observations that: (a) fine-grained, individual-specific learner actions are crucial to understanding learners and their context information; (b) sensor data available on the latest wearable devices (e.g., wrist-worn and eyewear devices) can effectively recognize learner actions and help to infer learner context information; (c) the commodity wearable devices that are widely available on the market can provide a hassle-free and non-intrusive solution. Following the above observations and under the proposed framework, we design and implement a sensor-based learner context collector running on the wearable devices. The latest data mining and sensor data processing techniques are employed to detect different types of learner actions and context information. Furthermore, we detail all of the above efforts by offering a novel and exemplary use case: it successfully provides accurate detection of student actions and infers student engagement states in class. The specifically designed learner context collector has been implemented on a commodity wrist-worn device. Based on the collected and inferred learner information, novel intervention and incentivizing feedback are introduced into the system service. Finally, a comprehensive evaluation with real-world experiments, surveys and interviews demonstrates the effectiveness and impact of the proposed framework and this use case. The F1 score for the student action classification tasks achieves 0.9, and the system can effectively differentiate the three defined learner states. In addition, the survey results show that the learners are satisfied with the use of our system (mean score of 3.7 with a standard deviation of 0.55).

  12. Structural level characterization of base oils using advanced analytical techniques

    KAUST Repository

    Hourani, Nadim

    2015-05-21

    Base oils, blended for finished lubricant formulations, are classified by the American Petroleum Institute into five groups, viz., groups I-V. Groups I-III consist of petroleum-based hydrocarbons whereas groups IV and V are made of synthetic polymers. In the present study, five base oil samples belonging to groups I and III were extensively characterized using high performance liquid chromatography (HPLC), comprehensive two-dimensional gas chromatography (GC×GC), and Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) equipped with atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) sources. First, the capabilities and limitations of each analytical technique were evaluated, and then the information obtained was combined to reveal compositional details on the base oil samples studied. HPLC showed the overwhelming presence of saturated over aromatic compounds in all five base oils. A similar trend was further corroborated using GC×GC, which yielded semiquantitative information on the compound classes present in the samples and provided further details on the carbon number distributions within these classes. In addition to the chromatography methods, FT-ICR MS supplemented the compositional information on the base oil samples by resolving the aromatic compounds into alkyl- and naphtheno-substituted families. APCI proved more effective than APPI for the ionization of the highly saturated base oil components. Beyond detailed information on hydrocarbon molecules, FT-ICR MS also revealed the presence of saturated and aromatic sulfur species in all base oil samples. The results presented herein offer a unique perspective into the detailed molecular structure of base oils typically used to formulate lubricants. © 2015 American Chemical Society.
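    The class sorting that FT-ICR MS enables (alkyl- vs. naphtheno-substituted aromatic families) is conventionally organized by double-bond equivalents (DBE) computed from each assigned elemental formula. A minimal sketch of the standard DBE formula, not code from this study:

```python
def double_bond_equivalents(c, h, n=0):
    """Rings plus double bonds for a formula CcHhNn(OoSs).
    Divalent O and S do not affect the count, so they are omitted."""
    return c - h / 2.0 + n / 2.0 + 1.0

# Benzene C6H6 -> 4 (one ring + three double bonds)
# Hexane C6H14 -> 0 (fully saturated, acyclic)
```

    Plotting DBE against carbon number per heteroatom class is the usual way such FT-ICR MS assignments are visualized.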

  13. Consistency of FMEA used in the validation of analytical procedures

    DEFF Research Database (Denmark)

    Oldenhof, M.T.; van Leeuwen, J.F.; Nauta, Maarten

    2011-01-01

    In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection… is always carried out under the supervision of an experienced FMEA-facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating…

  14. Detection of sensor failures in nuclear plants using analytic redundancy

    International Nuclear Information System (INIS)

    Kitamura, M.

    1980-01-01

    A method for on-line, nonperturbative detection and identification of sensor failures in nuclear power plants was studied to determine its feasibility. This method is called analytic redundancy, or functional redundancy. Sensor failure has traditionally been detected by comparing multiple signals from redundant sensors, such as in two-out-of-three logic. In analytic redundancy, with the help of an assumed model of the physical system, the signals from a set of sensors are processed to reproduce the signals from all system sensors.
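    The analytic-redundancy idea described here can be sketched as residual checking: reconstruct each sensor's signal from the remaining sensors via a plant model and flag large disagreements. The toy plant below (flow × temperature rise = thermal power) is purely illustrative, not the model used in the study:

```python
def detect_failed_sensors(readings, estimators, threshold):
    """Flag sensors whose reading disagrees with the value reconstructed
    from the remaining sensors by more than `threshold`.
    estimators[i] maps the other readings (original order, sensor i
    removed) to an estimate of sensor i."""
    failed = []
    for i, measured in enumerate(readings):
        others = [r for j, r in enumerate(readings) if j != i]
        if abs(measured - estimators[i](others)) > threshold:
            failed.append(i)
    return failed

# Toy plant: sensors (flow, temperature rise dT, thermal power), power = flow * dT
estimators = [
    lambda o: o[1] / o[0],  # flow  = power / dT
    lambda o: o[1] / o[0],  # dT    = power / flow
    lambda o: o[0] * o[1],  # power = flow * dT
]
```

    Note that a failed sensor also corrupts the reconstructions of the healthy ones, so a practical scheme adds voting or hypothesis-testing logic on the residual pattern to isolate which sensor actually failed; the simple threshold above is only the first step.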

  15. Using countertransference: analytic contact, projective identification, and transference phantasy states.

    Science.gov (United States)

    Waska, Robert

    2008-01-01

    The influence of projective identification is an integral aspect of most psychoanalytic treatments, not only with patients who are more disturbed, but also with individuals who are higher functioning and have neuroses. Projective identification involves both internal relational phantasies of self and object as well as external interactions with the environment. Both elements shape the transference. Continuous projections distort the ego's image of the object, causing introjections that bring increased guilt, anxiety, and envy onto the ego, creating even more radical projections. Consequently, the countertransference is repeatedly stimulated in an evolving or devolving manner (Clarkin, Yeomans, & Kernberg, 2006). The case material illustrates the constant interplay among projective identification, transference, and countertransference, as well as the utility of countertransference in making the most helpful interpretations. The concept of analytic contact (Waska, 2006, 2007) was noted as the vehicle of optimal psychological transformation. Rather than an emphasis on frequency, diagnosis, use of couch, or mode of termination, the focus is more on the clinical situation and the moment-to-moment work on internal conflict, unconscious phantasy, destructive defenses, analysis of the transference and extratransference anxieties, and the gradual integration of core object relational experiences. Regarding a more clinical rather than theoretical definition of psychoanalysis, Sandler (1988) states that what truly defines a treatment as psychoanalytic is the analyst's attitude towards his patient, his willingness to contain and make the effort to patiently understand the patient's unconscious conflicts and reactions to internal phantasy states, the humane detachment and lack of judgment, and the maintenance of a comfortable and safe setting in which the transference can unfold. This definition is certainly similar to the elements of analytic contact. Use of the…

  16. Promising Biomolecules.

    Science.gov (United States)

    Oliveira, Isabel; Carvalho, Ana L; Radhouani, Hajer; Gonçalves, Cristiana; Oliveira, J Miguel; Reis, Rui L

    2018-01-01

    The osteochondral defect (OD) comprises the articular cartilage and its subchondral bone. The treatment of these lesions remains one of the most problematic clinical issues, since these defects include different tissues, requiring distinct healing approaches. Among the growing applications of regenerative medicine, clinical articular cartilage repair has been used for two decades, and it is an effective example of translational medicine; one of the most used cell-based repair strategies includes implantation of autologous cells in degradable scaffolds such as alginate, agarose, collagen, chitosan, chondroitin sulfate, cellulose, silk fibroin, hyaluronic acid, and gelatin, among others. Concerning the repair of osteochondral defects, tissue engineering and regenerative medicine have started to design single- or bi-phased scaffold constructs, often containing hydroxyapatite-collagen composites, usually used as a bone substitute. Natural and synthetic biomolecules have been explored to recreate the cartilage-bone interface through multilayered biomimetic scaffolds. This chapter gives a succinct description of the most relevant natural and synthetic biomolecules used in cartilage and bone repair, describing the procedures to obtain these biomolecules, their chemical structure, common modifications to improve their characteristics, and their applications in the biomedical field.

  17. Developing Learning Analytics Design Knowledge in the "Middle Space": The Student Tuning Model and Align Design Framework for Learning Analytics Use

    Science.gov (United States)

    Wise, Alyssa Friend; Vytasek, Jovita Maria; Hausknecht, Simone; Zhao, Yuting

    2016-01-01

    This paper addresses a relatively unexplored area in the field of learning analytics: how analytics are taken up and used as part of teaching and learning processes. Initial steps are taken towards developing design knowledge for this "middle space," with a focus on students as analytics users. First, a core set of challenges for…

  18. Using Streaming Analytics for Effective Real Time Network Visibility -

    Science.gov (United States)

    …on in your network right now. Certainly the other thing that we talked about on the big data side was [inaudible] data. So now we'll drill into… so this is all the traffic from the internal network to the… taking a streaming analytics approach to network traffic analysis. So we can go to the next… there we go.

  19. Social Data Analytics Using Tensors and Sparse Techniques

    Science.gov (United States)

    Zhang, Miao

    2014-01-01

    The development of internet and mobile technologies is driving an earthshaking social media revolution. They bring the internet world a huge amount of social media content, such as images, videos, and comments. This massive media content and these complicated social structures require analytic expertise to transform the flood of information into…

  20. Spacecraft formation control using analytical finite-duration approaches

    Science.gov (United States)

    Ben Larbi, Mohamed Khalil; Stoll, Enrico

    2018-03-01

    This paper derives a control concept for formation flight (FF) applications assuming circular reference orbits. The paper focuses on a general impulsive control concept for FF, which is then extended to the more realistic case of non-impulsive thrust maneuvers. The control concept uses a description of the FF in relative orbital elements (ROE) instead of the classical Cartesian description, since the ROE provide direct insight into key aspects of the relative motion and are particularly suitable for relative orbit control purposes and collision avoidance analysis. Although Gauss' variational equations were first derived to offer a mathematical tool for processing orbit perturbations, they are suitable for several different applications. If the perturbation acceleration is due to a control thrust, Gauss' variational equations show the effect of such a control thrust on the Keplerian orbital elements. Integrating Gauss' variational equations offers a direct relation between velocity increments in the local vertical local horizontal frame and the subsequent change of Keplerian orbital elements. For proximity operations, these equations can be generalized from describing the motion of a single spacecraft to describing the relative motion of two spacecraft. This is shown for impulsive and finite-duration maneuvers. Based on that, an analytical tool to estimate the error induced through impulsive maneuver planning is presented. The resulting control schemes are simple and effective and thus also suitable for on-board implementation. Simulations show that the proposed concept improves the timing of the thrust maneuver executions and thus reduces the residual error of the formation control.
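    As a worked instance of the integrated Gauss' variational equations mentioned in this abstract, the impulsive, near-circular-orbit case reduces to a one-line relation between a tangential velocity increment and the change in semi-major axis, delta_a = 2 * dv_t / n. This is only the simplest such relation; the paper's full ROE formulation is more general:

```python
import math

MU_EARTH = 3.986004418e14  # gravitational parameter, m^3/s^2

def delta_sma_from_tangential_impulse(a, dv_t, mu=MU_EARTH):
    """Impulsive limit of Gauss' variational equations for a near-circular
    orbit: a tangential velocity increment dv_t (m/s) changes the
    semi-major axis a (m) by delta_a = 2 * dv_t / n, where
    n = sqrt(mu / a^3) is the mean motion."""
    n = math.sqrt(mu / a**3)
    return 2.0 * dv_t / n
```

    For a 7000 km orbit, a 1 m/s tangential burn raises the semi-major axis by roughly 1.9 km; the linearity in dv_t is what makes impulsive maneuver planning with these equations so convenient.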

  1. Online Continuous Trace Process Analytics Using Multiplexing Gas Chromatography.

    Science.gov (United States)

    Wunsch, Marco R; Lehnig, Rudolf; Trapp, Oliver

    2017-04-04

    The analysis of impurities at a trace level in chemical products, nutrition additives, and drugs is highly important to guarantee safe products suitable for consumption. However, trace analysis in the presence of a dominating component can be a challenging task because of noncompatible linear detection ranges or strong signal overlap that suppresses the signal of interest. Here, we developed a technique for quantitative analysis using multiplexing gas chromatography (mpGC) for continuous and completely automated process trace analytics, exemplified by the analysis of a CO2 stream in a production plant for detection of benzene, toluene, ethylbenzene, and the three structural isomers of xylene (BTEX) in the concentration range of 0-10 ppb. Additional minor components are methane and methanol with concentrations up to 100 ppm. The sample is injected up to 512 times according to a pseudorandom binary sequence (PRBS) with a mean frequency of 0.1 Hz into a gas chromatograph equipped with a flame ionization detector (FID). A superimposed chromatogram is recorded, which is deconvoluted into an averaged chromatogram by Hadamard transformation. Novel algorithms to maintain the data acquisition rate of the detector under the Hadamard transformation and to suppress correlation noise induced by components with much higher concentrations than the target substances are shown. Compared to conventional GC-FID, the signal-to-noise ratio has been increased by a factor of 10 with mpGC-FID. Correspondingly, the detection limits for BTEX in CO2 have been lowered from 10 to 1 ppb each. This has been achieved despite the presence of detectable components (methane and methanol) at concentrations about 1000 times higher than the target substances. The robustness and reliability of mpGC have been proven in a two-month field test in a chemical production plant.
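    The PRBS multiplexing and Hadamard-transform deconvolution can be illustrated at toy scale. For an m-sequence s of length N, the superimposed trace is the circular convolution of s with the single-injection chromatogram, and correlating it with the bipolar sequence 2s - 1 recovers the chromatogram scaled by (N + 1)/2. The sketch uses a length-7 m-sequence, far shorter than the 512-injection sequence in the study:

```python
def circular_convolve(s, c):
    """Superimposed trace: each injection (s[k] = 1) adds a copy of the
    single-injection chromatogram c, circularly shifted by k."""
    n = len(s)
    return [sum(s[k] * c[(t - k) % n] for k in range(n)) for t in range(n)]

def deconvolve_prbs(y, s):
    """Hadamard-style recovery: correlate y with the bipolar sequence
    b = 2s - 1. For an m-sequence of length N, the correlation equals
    (N + 1)/2 times the chromatogram, hence the rescaling."""
    n = len(s)
    b = [2 * v - 1 for v in s]
    return [2.0 * sum(b[(t - j) % n] * y[t] for t in range(n)) / (n + 1)
            for j in range(n)]

# Length-7 m-sequence (from x^3 + x + 1) and a toy chromatogram
s = [1, 1, 1, 0, 1, 0, 0]
chromatogram = [5.0, 0.0, 1.0, 0.0, 0.0, 3.0, 0.0]
```

    The noise-averaging benefit reported in the abstract comes from the same correlation: each chromatogram point is recovered from all N detector samples, so uncorrelated detector noise is attenuated relative to a single injection.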

  2. Large-Scale Image Analytics Using Deep Learning

    Science.gov (United States)

    Ganguly, S.; Nemani, R. R.; Basu, S.; Mukhopadhyay, S.; Michaelis, A.; Votava, P.

    2014-12-01

    High resolution land cover classification maps are needed to increase the accuracy of current land ecosystem and climate model outputs. Few studies demonstrate the state of the art in deriving very high resolution (VHR) land cover products. In addition, most methods rely heavily on commercial software that is difficult to scale given the region of study (e.g., continents to the globe). Complexities in present approaches relate to (a) scalability of the algorithm, (b) large image data processing (compute and memory intensive), (c) computational cost, (d) massively parallel architecture, and (e) machine learning automation. In addition, VHR satellite datasets are of the order of terabytes and features extracted from these datasets are of the order of petabytes. In our present study, we have acquired the National Agricultural Imaging Program (NAIP) dataset for the Continental United States at a spatial resolution of 1 m. This data comes as image tiles (a total of a quarter million image scenes with ~60 million pixels) and has a total size of ~100 terabytes for a single acquisition. Features extracted from the entire dataset would amount to ~8-10 petabytes. In our proposed approach, we have implemented a novel semi-automated machine learning algorithm rooted in the principles of "deep learning" to delineate the percentage of tree cover. In order to perform image analytics in such a granular system, it is mandatory to devise an intelligent archiving and query system for image retrieval, file structuring, metadata processing and filtering of all available image scenes. Using the Open NASA Earth Exchange (NEX) initiative, which is a partnership with Amazon Web Services (AWS), we have developed an end-to-end architecture for designing the database and the deep belief network (following the DistBelief computing model) to solve the grand challenge of scaling this process across the quarter million NAIP tiles that cover the entire Continental United States. The…

  3. Using the Technology of the Confessional as an Analytical Resource: Four Analytical Stances Towards Research Interviews in Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Brendan K. O'Rourke

    2007-05-01

    Full Text Available Among the various approaches that have developed from FOUCAULT's work is an Anglophone discourse analysis that has attempted to combine FOUCAULTian insights with the techniques of Conversation Analysis. An important current methodological issue in this discourse analytical approach is its theoretical preference for "naturally occurring" rather than research interview data. A FOUCAULTian perspective on the interview as a research instrument questions the idea of "naturally occurring" discourse. The "technology of the confessional" operates not only within research interviews, but permeates other interactions as well. Drawing on FOUCAULT does not dismiss the problems of the interview as a research instrument; rather, it shows that they cannot be escaped by simply switching to more "natural" interactions. Combining these insights with recent developments within discourse analysis can provide analytical resources for, rather than barriers to, the discourse analysis of research interviews. To aid such an approach, we develop a four-way categorisation of analytical stances towards the research interview in discourse analysis. A demonstration of how a research interview might be subjected to a discourse analysis using elements of this approach is then provided. URN: urn:nbn:de:0114-fqs070238

  4. Vibration Based Diagnosis for Planetary Gearboxes Using an Analytical Model

    Directory of Open Access Journals (Sweden)

    Liu Hong

    2016-01-01

    Full Text Available The application of conventional vibration based diagnostic techniques to planetary gearboxes is a challenge because of the complexity of frequency components in the measured spectrum, which is the result of relative motions between the rotary planets and the fixed accelerometer. In practice, since the fault signatures are usually contaminated by noises and vibrations from other mechanical components of gearboxes, the diagnostic efficacy may further deteriorate. Thus, it is essential to develop a novel vibration based scheme to diagnose gear failures for planetary gearboxes. Following a brief literature review, the paper begins with the introduction of an analytical model of planetary gear-sets developed by the authors in previous works, which can predict the distinct behaviors of fault introduced sidebands. This analytical model is easy to implement because the only prerequisite information is the basic geometry of the planetary gear-set. Afterwards, an automated diagnostic scheme is proposed to cope with the challenges associated with the characteristic configuration of planetary gearboxes. The proposed vibration based scheme integrates the analytical model, a denoising algorithm, and frequency domain indicators into one synergistic system for the detection and identification of damaged gear teeth in planetary gearboxes. Its performance is validated with the dynamic simulations and the experimental data from a planetary gearbox test rig.

  5. Factors Influencing Attitudes Towards the Use of CRM’s Analytical Tools in Organizations

    Directory of Open Access Journals (Sweden)

    Šebjan Urban

    2016-02-01

    Background and Purpose: Information solutions for analytical customer relationship management (aCRM IS) that include the use of analytical tools are becoming increasingly important, due to organizations' need for knowledge of their customers and the ability to manage big data. The objective of the research is, therefore, to determine how organizations' orientations (process, innovation, and technology), as critical organizational factors, affect attitudes towards the use of the analytical tools of aCRM IS.

  6. Green analytical chemistry - the use of surfactants as a replacement of organic solvents in spectroscopy

    Science.gov (United States)

    Pharr, Daniel Y.

    2017-07-01

    This chapter gives an introduction to the many practical uses of surfactants in analytical chemistry in replacing organic solvents to achieve greener chemistry. Taking a holistic approach, it covers some background of surfactants as chemical solvents, their properties and as green chemicals, including their environmental effects. The achievements of green analytical chemistry with micellar systems are reviewed in all the major areas of analytical chemistry where these reagents have been found to be useful.

  7. Consumer Insight as Competitive Advantage Using Big Data and Analytics

    Directory of Open Access Journals (Sweden)

    Adnan Veysel Ertemel

    2015-12-01

    The digital revolution serves as a competitive advantage to businesses that are able to analyze consumer behavior in order to gain insights for their strategic advantage. Since the advent of the Internet, the past two decades have witnessed the generation of a vast amount of business data. The amount of data is so huge that traditional database management approaches fall short of managing and analyzing it. This paper explores the characteristics of this phenomenon, called Big Data, together with analytics as a tool for marketers to gain insights about consumer behavior and hence provide competitive advantage to businesses. It also discusses some best practices as case studies.

  8. Prediction of polymer flooding performance using an analytical method

    International Nuclear Information System (INIS)

    Tan Czek Hoong; Mariyamni Awang; Foo Kok Wai

    2001-01-01

    The study investigated the applicability of an analytical method developed by El-Khatib to polymer flooding. Results from the simulator UTCHEM and from experiments were compared with the El-Khatib prediction method. In general, assuming constant-viscosity polymer injection, the method gave much higher recovery values than the simulation runs and the experiments. A modification of the method gave better correlation, albeit only for oil production. Investigation is continuing on modifying the method so that a better overall fit can be obtained for polymer flooding. (Author)

  9. The use of analytical procedures in the internal audit of the restaurant business expenses

    Directory of Open Access Journals (Sweden)

    T.Yu. Kopotienko

    2015-06-01

    An important task in carrying out the internal audit of expenses is to obtain sufficient and reliable audit evidence. This can be achieved by using analytical procedures in the audit process. Identifying analytical procedures with the financial analysis of business activities prevents their efficient use in the internal audit of restaurant business expenses, and internal auditors' knowledge of the instructional techniques of analytical procedures, and of their tasks at each verification step, is insufficient. The purpose of the article is to develop methods for the internal audit of restaurant business expenses based on an integrated application of analytical procedures. The nature and purpose of analytical procedures are investigated in the article. The factors influencing an auditor's decision about the choice of a complex of analytical procedures are identified; it is recommended to consider among them the purpose of the analytical procedures, the type and structure of the enterprise, the sources of available information, the existence of financial and non-financial information, and the reliability and comparability of the available information. The tasks of analytical procedures are identified for each verification step, and a complex of analytical procedures is offered as a part of the internal audit of restaurant business expenses. This complex contains a list of the analytical procedures, the instructional techniques of analysis used in each procedure, and a brief overview of the content of each procedure.

  10. Spatial capture-recapture: a promising method for analyzing data collected using artificial cover objects

    Science.gov (United States)

    Sutherland, Chris; Munoz, David; Miller, David A.W.; Grant, Evan H. Campbell

    2016-01-01

    Spatial capture–recapture (SCR) is a relatively recent development in ecological statistics that provides a spatial context for estimating abundance and space use patterns, and improves inference about absolute population density. SCR has been applied to individual encounter data collected noninvasively using methods such as camera traps, hair snares, and scat surveys. Despite the widespread use of capture-based surveys to monitor amphibians and reptiles, there are few applications of SCR in the herpetological literature. We demonstrate the utility of the application of SCR for studies of reptiles and amphibians by analyzing capture–recapture data from Red-Backed Salamanders, Plethodon cinereus, collected using artificial cover boards. Using SCR to analyze spatial encounter histories of marked individuals, we found evidence that density differed little among four sites within the same forest (on average, 1.59 salamanders/m²) and that salamander detection probability peaked in early October (Julian day 278), reflecting expected surface activity patterns of the species. The spatial scale of detectability, a measure of space use, indicates that the home range size for this population of Red-Backed Salamanders in autumn was 16.89 m². Surveying reptiles and amphibians using artificial cover boards regularly generates spatial encounter history data of known individuals, which can readily be analyzed using SCR methods, providing estimates of absolute density and inference about the spatial scale of habitat use.
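    SCR models typically link detection probability to the distance between an individual's activity center and a trap through a half-normal function, and the spatial scale sigma implies a home-range area under a bivariate-normal movement model. A sketch of these two standard relationships; the exact estimator used in the paper may differ:

```python
import math

def half_normal_detection(d, p0, sigma):
    """Detection probability at distance d from the activity center:
    baseline p0 decaying with spatial scale sigma."""
    return p0 * math.exp(-d**2 / (2.0 * sigma**2))

def home_range_area_95(sigma):
    """95% activity area implied by a bivariate-normal movement model:
    pi * sigma^2 * q, where q ~ 5.991 is the 0.95 quantile of a
    chi-square distribution with 2 degrees of freedom."""
    return math.pi * sigma**2 * 5.991
```

    Fitting p0 and sigma to the spatial encounter histories is what lets SCR separate detectability from density, which non-spatial capture-recapture cannot do.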

  11. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    Assessment of the total uncertainty of analytical methods for the measurement of drugs in human hair has mainly been derived from the analytical variation. However, in hair analysis several other sources of uncertainty contribute to the total uncertainty. Particularly in segmental hair analysis, pre-analytical variations associated with the sampling and segmentation may be significant factors in the assessment of the total uncertainty budget. The aim of this study was to develop and validate a method for the analysis of 31 common drugs in hair using ultra-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) with focus on the assessment of both the analytical and pre-analytical sampling variations. The validated method was specific, accurate (80-120%), and precise (CV≤20%) across a wide linear concentration range from 0.025 to 25 ng/mg for most compounds. The analytical variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from genuine duplicate measurements of two bundles of hair collected from each subject, after subtraction of the analytical component. For the most frequently detected analytes, the pre-analytical variation was estimated to be 26-69%. Thus, the pre-analytical variation was three- to sevenfold larger than the analytical variation (7-13%) and hence the dominant component in the total variation (29-70%). The present study demonstrates the importance of including the pre-analytical variation in the assessment of the total uncertainty budget and in the setting of the 95%-uncertainty interval (±2CVT). Excluding the pre-analytical sampling variation could significantly affect the interpretation of results from segmental hair analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
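    The variance bookkeeping behind this uncertainty budget follows the usual rule that independent components add in quadrature: CV_T^2 = CV_A^2 + CV_P^2. A sketch of estimating a CV from duplicate measurements and recovering the pre-analytical component by subtraction; the authors' exact estimator may differ in detail:

```python
import math

def cv_from_duplicates(pairs):
    """CV of a single measurement estimated from duplicate pairs:
    SD = sqrt(sum(d_i^2) / (2 * n)) over the paired differences,
    divided by the grand mean of all measurements."""
    n = len(pairs)
    sd = math.sqrt(sum((a - b) ** 2 for a, b in pairs) / (2.0 * n))
    grand_mean = sum(a + b for a, b in pairs) / (2.0 * n)
    return sd / grand_mean

def preanalytical_cv(cv_total, cv_analytical):
    """Independent variance components add in quadrature:
    CV_T^2 = CV_A^2 + CV_P^2, so CV_P is found by subtracting the
    analytical component from the total in quadrature."""
    return math.sqrt(cv_total**2 - cv_analytical**2)
```

    With CV_A around 7-13% and CV_P around 26-69%, the quadrature sum is dominated by the pre-analytical term, which is exactly the paper's point.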

  12. Promising Practices: A Literature Review of Technology Use by Underserved Students

    Science.gov (United States)

    Zielezinski, Molly B.; Darling-Hammond, Linda

    2016-01-01

    How can technologies and digital learning experiences be used to support underserved, under-resourced, and underprepared students? For many years, educators, researchers, and policy makers looking for strategies to close the achievement gap and improve student learning have sought solutions involving new uses of technology, especially for students…

  13. Texting while driving using Google Glass™: Promising but not distraction-free.

    Science.gov (United States)

    He, Jibo; Choi, William; McCarley, Jason S; Chaparro, Barbara S; Wang, Chun

    2015-08-01

    Texting while driving is risky but common. This study evaluated how texting using a Head-Mounted Display, Google Glass, impacts driving performance. Experienced drivers performed a classic car-following task while using three different interfaces to text: fully manual interaction with a head-down smartphone, vocal interaction with a smartphone, and vocal interaction with Google Glass. Fully manual interaction produced worse driving performance than either of the other interaction methods, leading to more lane excursions and variable vehicle control, and higher workload. Compared to texting vocally with a smartphone, texting using Google Glass produced fewer lane excursions, more braking responses, and lower workload. All forms of texting impaired driving performance compared to undistracted driving. These results imply that the use of Google Glass for texting impairs driving, but its Head-Mounted Display configuration and speech recognition technology may be safer than texting using a smartphone. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals

    DEFF Research Database (Denmark)

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G

    2018-01-01

    … for the combination of analytical bias and imprecision, and Method 2 is based on the Microsoft Excel formula NORMINV, including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision … are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results: Method 2 gives the correct results, with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion: The Microsoft Excel formula NORMINV is useful for the estimation…
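    The NORMINV-based calculation of Method 2 has a direct Python stdlib analogue: statistics.NormalDist supplies both the Gaussian CDF and the inverse CDF (Excel's NORMINV). The model below (unit between-subject variation, fixed limits at ±1.96) is a simplified illustration, not necessarily the exact model of the paper:

```python
import math
from statistics import NormalDist

def fraction_outside(bias, analytical_cv, limit=1.96):
    """Fraction of reference individuals outside fixed common reference
    limits at +/-limit (in units of the between-subject SD), for a method
    that adds a normalized bias and analytical imprecision to unit
    biological variation."""
    nd = NormalDist()
    s = math.sqrt(1.0 + analytical_cv**2)
    return nd.cdf((-limit - bias) / s) + 1.0 - nd.cdf((limit - bias) / s)

# Excel's NORMINV(p, 0, 1) corresponds to NormalDist().inv_cdf(p)
z975 = NormalDist().inv_cdf(0.975)  # ~1.96
```

    With zero bias and no added imprecision, the fraction outside ±1.96 SD is the familiar 5%; bias or imprecision pushes it higher, which is what analytical performance specifications of this kind constrain.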

  15. The Ethics of Using Learning Analytics to Categorize Students on Risk

    Science.gov (United States)

    Scholes, Vanessa

    2016-01-01

    There are good reasons for higher education institutions to use learning analytics to risk-screen students. Institutions can use learning analytics to better predict which students are at greater risk of dropping out or failing, and use the statistics to treat "risky" students differently. This paper analyses this practice using…

  16. Promises and Pitfalls of Using Social Media in Public E-procurement: an Appraisal

    Directory of Open Access Journals (Sweden)

    Sharif As-Saber

    2014-11-01

    Social Media (SM) has in recent years emerged as a common platform for low-cost information exchange and has attracted a critical mass of users at both corporate and retail levels. Theoretically, SM can thus be used as a tool to strengthen e-procurement in the public sector. Towards this end, we have prepared a conceptual model drawing on literature reviews and some examples while identifying a set of expected benefits and challenges within four stages of e-procurement. Using the framework, a case study has been conducted involving Australian public procurement initiatives; 15 federal government senior officials engaged in e-procurement were interviewed to shed light on the possibilities and challenges of using SM in the public e-procurement context. The findings of the study suggest a limited scope for SM usage in the Australian public sector e-procurement process. The implications of the findings are discussed and some recommendations offered.

  17. Sample diagnosis using indicator elements and non-analyte signals for inductively coupled plasma mass spectrometry

    International Nuclear Information System (INIS)

    Antler, Margaret; Ying Hai; Burns, David H.; Salin, Eric D.

    2003-01-01

    A sample diagnosis procedure that uses both non-analyte and analyte signals to estimate matrix effects in inductively coupled plasma-mass spectrometry is presented. Non-analyte signals are those of background species in the plasma (e.g., N+, ArO+), and changes in these signals can indicate changes in plasma conditions. Matrix effects of Al, Ba, Cs, K and Na on 19 non-analyte signals and 15 element signals were monitored. Multiple linear regression was used to build the prediction models, using a genetic algorithm for objective feature selection. Non-analyte elemental signals and non-analyte signals were compared for diagnosing matrix effects, and both were found to be suitable for estimating matrix effects. Individual analyte matrix effect estimation was compared with the overall matrix effect prediction, and models used to diagnose overall matrix effects were more accurate than individual analyte models. In previous work [Spectrochim. Acta Part B 57 (2002) 277], we tested models for analytical decision making. The current models were tested in the same way, and were able to successfully diagnose matrix effects with at least an 80% success rate.

  18. The Concept of Resource Use Efficiency as a Theoretical Basis for Promising Coal Mining Technologies

    Science.gov (United States)

    Mikhalchenko, Vadim

    2017-11-01

    The article is devoted to solving one of the most pressing problems of the coal mining industry: its high resource intensity, which results in high environmental and economic costs for operating enterprises. It is shown that it is the high resource intensity of traditional, historically developed coal production systems that generates a conflict between indicators of economic efficiency and indicators of resistance to uncertainty and variability of market environment parameters. The traditional technological paradigm of exploiting coal deposits also entails high, technology-driven economic risks. An approach to the problem is outlined and illustrated with a real example.

  19. The Promise and Limitations of Using Analogies to Improve Decision-Relevant Understanding of Climate Change.

    Directory of Open Access Journals (Sweden)

    Kaitlin T Raimi

    Full Text Available To make informed choices about how to address climate change, members of the public must develop ways to consider established facts of climate science and the uncertainties about its future trajectories, in addition to the risks attendant to various responses, including non-response, to climate change. One method suggested for educating the public about these issues is the use of simple mental models, or analogies comparing climate change to familiar domains such as medical decision making, disaster preparedness, or courtroom trials. Two studies were conducted using online participants in the U.S.A. to test the use of analogies to highlight seven key decision-relevant elements of climate change, including uncertainties about when and where serious damage may occur, its unprecedented and progressive nature, and tradeoffs in limiting climate change. An internal meta-analysis was then conducted to estimate overall effect sizes across the two studies. Analogies were not found to inform knowledge about climate literacy facts. However, results suggested that people found the medical analogy helpful and that it led people, especially political conservatives, to better recognize several decision-relevant attributes of climate change. These effects were weak, perhaps reflecting a well-documented and overwhelming effect of political ideology on climate change communication and education efforts in the U.S.A. The potential of analogies and similar education tools to improve understanding and communication in a polarized political environment is discussed.

  20. The Promise and Limitations of Using Analogies to Improve Decision-Relevant Understanding of Climate Change.

    Science.gov (United States)

    Raimi, Kaitlin T; Stern, Paul C; Maki, Alexander

    2017-01-01

    To make informed choices about how to address climate change, members of the public must develop ways to consider established facts of climate science and the uncertainties about its future trajectories, in addition to the risks attendant to various responses, including non-response, to climate change. One method suggested for educating the public about these issues is the use of simple mental models, or analogies comparing climate change to familiar domains such as medical decision making, disaster preparedness, or courtroom trials. Two studies were conducted using online participants in the U.S.A. to test the use of analogies to highlight seven key decision-relevant elements of climate change, including uncertainties about when and where serious damage may occur, its unprecedented and progressive nature, and tradeoffs in limiting climate change. An internal meta-analysis was then conducted to estimate overall effect sizes across the two studies. Analogies were not found to inform knowledge about climate literacy facts. However, results suggested that people found the medical analogy helpful and that it led people, especially political conservatives, to better recognize several decision-relevant attributes of climate change. These effects were weak, perhaps reflecting a well-documented and overwhelming effect of political ideology on climate change communication and education efforts in the U.S.A. The potential of analogies and similar education tools to improve understanding and communication in a polarized political environment is discussed.

  1. The Promise and Limitations of Using Analogies to Improve Decision-Relevant Understanding of Climate Change

    Science.gov (United States)

    Stern, Paul C.; Maki, Alexander

    2017-01-01

    To make informed choices about how to address climate change, members of the public must develop ways to consider established facts of climate science and the uncertainties about its future trajectories, in addition to the risks attendant to various responses, including non-response, to climate change. One method suggested for educating the public about these issues is the use of simple mental models, or analogies comparing climate change to familiar domains such as medical decision making, disaster preparedness, or courtroom trials. Two studies were conducted using online participants in the U.S.A. to test the use of analogies to highlight seven key decision-relevant elements of climate change, including uncertainties about when and where serious damage may occur, its unprecedented and progressive nature, and tradeoffs in limiting climate change. An internal meta-analysis was then conducted to estimate overall effect sizes across the two studies. Analogies were not found to inform knowledge about climate literacy facts. However, results suggested that people found the medical analogy helpful and that it led people—especially political conservatives—to better recognize several decision-relevant attributes of climate change. These effects were weak, perhaps reflecting a well-documented and overwhelming effect of political ideology on climate change communication and education efforts in the U.S.A. The potential of analogies and similar education tools to improve understanding and communication in a polarized political environment is discussed. PMID:28135337

  2. Trisoctahedral gold nanocrystal: A promising candidate for the study of plasmonics using cathodoluminescence

    International Nuclear Information System (INIS)

    Maity, Achyut; Maiti, Arpan; Satpati, Biswarup; Chini, Tapas Kumar

    2016-01-01

    We study plasmon-assisted luminescence from an isolated single trisoctahedral (TOH) gold (Au) nanocrystal using cathodoluminescence (CL) spectroscopy and imaging in a field emission scanning electron microscope (FESEM). Site-specific e-beam excitation reveals a double-peaked spectrum with localized surface plasmon resonances (LSPR) at 540 nm and 660 nm, and a single-peaked resonant spectrum at 560 nm. The spatial variation of the plasmon-assisted luminescence was strongest at the apex points as well as at the edges and corners.

  3. Diatomite and re-use coal waste as promising alternative for fertilizer to environmental improvement

    OpenAIRE

    Mohammad Hassan Sayyari-Zahan; AbdolHamid Gholami; Somayeh Rezaeepour

    2015-01-01

    Application of conventional fertilizers has contributed substantially to environmental pollution. This study aimed to assess the potential of diatomite and re-used coal waste as non-chemical fertilizers for environmental improvement. The experiments were evaluated in 2 kg pots under greenhouse conditions at 4 levels of diatomite powder (0, 10, 20 and 40 g/kg soil) and 5 levels of coal waste powder (0, 20, 40, 80 and 160 g/kg soil), based on a completely randomized design with three...

  4. The use of nanoparticles as a promising therapeutic approach in cancer immunotherapy.

    Science.gov (United States)

    Hosseini, Maryam; Haji-Fatahaliha, Mostafa; Jadidi-Niaragh, Farhad; Majidi, Jafar; Yousefi, Mehdi

    2016-06-01

    Cancer is one of the leading causes of death worldwide and has not yet been treated efficiently. Although several therapeutic approaches have been used, side effects such as toxicity and drug resistance have been observed in patients, particularly with chemotherapy. Nanoparticle-mediated drug delivery systems (DDS) have great potential to improve cancer treatment by transferring therapeutic factors directly to the tumor site. Such treatment significantly decreases the adverse effects of cancer therapy on healthy tissues. Two main strategies, passive and active targeting, have been considered effective techniques for targeting drugs to tumor sites. The current review sheds some light on the place of nanotechnology in cancer drug delivery, and introduces nanomaterials and their specific characteristics that can be used in tumor therapy. Moreover, the passive and active targeting approaches focus on antibodies, particularly single-chain variable fragments (scFv), as a novel and important ligand in a drug delivery system.

  5. Targeted radiotherapy of osteosarcoma using 153Sm-EDTMP. A new promising approach

    International Nuclear Information System (INIS)

    Bruland, Oe.S.; Skretting, A.; Solheim, Oe.P.; Aas, M.

    1996-01-01

    We report a case where targeted radionuclide therapy using 153Sm-EDTMP gave substantial palliative effect. A 35-year-old male with a primary osteosarcoma located in the first lumbar vertebra relapsed with progressive back pain after conventional treatment modalities had failed. He became bedridden, and developed paraparesis and impaired bladder function. On a diagnostic bone scan, intense radioactivity was localized in the tumor. He was therefore given 153Sm-EDTMP treatment twice, 8 weeks apart, at 35 and 32 MBq/kg body weight, respectively. After a few days the pain was significantly relieved, and by the second radionuclide treatment the pareses subsided. For six months he was able to be up and about without any neurological signs or detectable metastases. Eventually, however, he experienced increasing local pain, developed paraparesis, was re-operated, but died 4 months later. The dramatic transient improvement observed in this case warrants further exploration of 153Sm-EDTMP as a boost technique, supplementary to conventional external radiotherapy. (orig.)

  6. Targeted radiotherapy of osteosarcoma using {sup 153}Sm-EDTMP. A new promising approach

    Energy Technology Data Exchange (ETDEWEB)

    Bruland, Oe.S. [Dept. of Medical Oncology and Radiotherapy, Norwegian Radium Hospital, Oslo (Norway); Skretting, A. [Dept. of Medical Physics and Technology, Norwegian Radium Hospital, Oslo (Norway); Solheim, Oe.P. [Dept. of Medical Oncology and Radiotherapy, Norwegian Radium Hospital, Oslo (Norway); Aas, M. [Dept. of Nuclear Medicine, Norwegian Radium Hospital, Oslo (Norway)

    1996-10-01

    We report a case where targeted radionuclide therapy using {sup 153}Sm-EDTMP gave substantial palliative effect. A 35-year-old male with a primary osteosarcoma located in the first lumbar vertebra relapsed with progressive back pain after conventional treatment modalities had failed. He became bedridden, and developed paraparesis and impaired bladder function. On a diagnostic bone scan, intense radioactivity was localized in the tumor. He was therefore given {sup 153}Sm-EDTMP treatment twice, 8 weeks apart, at 35 and 32 MBq/kg body weight, respectively. After a few days the pain was significantly relieved, and by the second radionuclide treatment the pareses subsided. For six months he was able to be up and about without any neurological signs or detectable metastases. Eventually, however, he experienced increasing local pain, developed paraparesis, was re-operated, but died 4 months later. The dramatic transient improvement observed in this case warrants further exploration of {sup 153}Sm-EDTMP as a boost technique, supplementary to conventional external radiotherapy. (orig.).

  7. Recurrent urinary tract infections in women: How promising is the use of probiotics?

    Directory of Open Access Journals (Sweden)

    Varsha Gupta

    2017-01-01

    Full Text Available Urinary tract infections (UTIs) currently rank amongst the most prevalent bacterial infections, representing a major health hazard. UTIs in females usually start as vaginal infections and ascend to the urethra and bladder. Recurrent UTIs (rUTIs) can be defined as at least three episodes of UTI in 1 year or two episodes in 6 months. Various antibiotics have been the mainstay of therapy in reducing the incidence of UTIs, but recurrent infections continue to afflict many women. This necessitates the exploration of alternative antimicrobial therapies. Probiotics have been shown to be effective in varied clinical trials for long-term prevention of rUTI. Because Escherichia coli is the primary pathogen involved in UTIs, spreading from the rectum to the vagina and then ascending the sterile urinary tract, improving the gut or vaginal flora will thus impact the urinary tract. Since a healthy vaginal microbiota is mainly dominated by Lactobacillus species, exogenously administered probiotics containing Lactobacilli play a pivotal role in reducing the risk of rUTI. The concept of artificially boosting Lactobacilli numbers through probiotic administration has long been conceived, and has recently been shown to be possible. Lactobacilli may be especially useful for women with a history of recurrent, complicated UTIs or prolonged antibiotic use. Probiotics do not cause antibiotic resistance and may offer other health benefits due to vaginal re-colonisation with Lactobacilli. However, more comprehensive research is still needed before probiotics can be recommended as an alternative to antibiotics.

  8. Recurrent urinary tract infections in women: How promising is the use of probiotics?

    Science.gov (United States)

    Gupta, Varsha; Nag, Deepika; Garg, Pratibha

    2017-01-01

    Urinary tract infections (UTIs) currently rank amongst the most prevalent bacterial infections, representing a major health hazard. UTIs in females usually start as vaginal infections and ascend to the urethra and bladder. Recurrent UTIs (rUTIs) can be defined as at least three episodes of UTI in 1 year or two episodes in 6 months. Various antibiotics have been the mainstay of therapy in reducing the incidence of UTIs, but recurrent infections continue to afflict many women. This necessitates the exploration of alternative antimicrobial therapies. Probiotics have been shown to be effective in varied clinical trials for long-term prevention of rUTI. Because Escherichia coli is the primary pathogen involved in UTIs, spreading from the rectum to the vagina and then ascending the sterile urinary tract, improving the gut or vaginal flora will thus impact the urinary tract. Since a healthy vaginal microbiota is mainly dominated by Lactobacillus species, exogenously administered probiotics containing Lactobacilli play a pivotal role in reducing the risk of rUTI. The concept of artificially boosting Lactobacilli numbers through probiotic administration has long been conceived, and has recently been shown to be possible. Lactobacilli may be especially useful for women with a history of recurrent, complicated UTIs or prolonged antibiotic use. Probiotics do not cause antibiotic resistance and may offer other health benefits due to vaginal re-colonisation with Lactobacilli. However, more comprehensive research is still needed before probiotics can be recommended as an alternative to antibiotics.

  9. Analytical Thinking, Analytical Action: Using Prelab Video Demonstrations and e-Quizzes to Improve Undergraduate Preparedness for Analytical Chemistry Practical Classes

    Science.gov (United States)

    Jolley, Dianne F.; Wilson, Stephen R.; Kelso, Celine; O'Brien, Glennys; Mason, Claire E.

    2016-01-01

    This project utilizes visual and critical thinking approaches to develop a higher-education synergistic prelab training program for a large second-year undergraduate analytical chemistry class, directing more of the cognitive learning to the prelab phase. This enabled students to engage in more analytical thinking prior to engaging in the…

  10. THE EDUCATIONAL USE OF SOCIAL NETWORKING WEBSITES: FROM PROMISE TO REALITY

    Directory of Open Access Journals (Sweden)

    Costin Pribeanu

    2016-06-01

    Full Text Available The platforms supporting social networking activities on the Internet are applications for the creation, sharing and exchange of user-generated content in various forms. Users can freely express their ideas and opinions, and have opportunities to launch and participate in collaborative projects and virtual communities. Facebook (FB) is a social networking website that has seen explosive growth in recent years and increased popularity among university students. For example, the number of Facebook users in Romania was 8.5 million in June 2016 (Facebrands.Ro, 2015), of which 33% are young people 15-24 years old. Recent research on Facebook use shows that Romanian university students have large Facebook networks and spend many minutes per day on the site (Pribeanu & Lamanauskas, 2016).

  11. Analysis of protein carbonylation - pitfalls and promise in commonly used methods

    DEFF Research Database (Denmark)

    Rogowska-Wrzesinska, A.; Wojdyla, K.; Nedic, O.

    2014-01-01

    that research scientists are becoming more eager to be able to measure accurately the level of oxidized protein in biological materials, and to determine the precise site of the oxidative attack on the protein, in order to get insights into the molecular mechanisms involved in the progression of diseases....... Several methods for measuring protein carbonylation have been implemented in different laboratories around the world. However, to date no methods prevail as the most accurate, reliable, and robust. The present paper aims at giving an overview of the common methods used to determine protein carbonylation...... in biological material as well as to highlight the limitations and the potential. The ultimate goal is to give quick tips for a rapid decision making when a method has to be selected and taking into consideration the advantage and drawback of the methods....

  12. Analysis of protein carbonylation-pitfalls and promise in commonly used methods

    DEFF Research Database (Denmark)

    Rogowska-Wrzesinska, Adelina; Wojdyla, K; Nedić, O

    2014-01-01

    that research scientists are becoming more eager to be able to measure accurately the level of oxidized protein in biological materials, and to determine the precise site of the oxidative attack on the protein, in order to get insights into the molecular mechanisms involved in the progression of diseases....... Several methods for measuring protein carbonylation have been implemented in different laboratories around the world. However, to date no methods prevail as the most accurate, reliable, and robust. The present paper aims at giving an overview of the common methods used to determine protein carbonylation...... in biological material as well as to highlight the limitations and the potential. The ultimate goal is to give quick tips for a rapid decision making when a method has to be selected and taking into consideration the advantage and drawback of the methods....

  13. Binding assays with streptavidin-functionalized superparamagnetic nanoparticles and biotinylated analytes using fluxgate magnetorelaxometry

    International Nuclear Information System (INIS)

    Heim, Erik; Ludwig, Frank; Schilling, Meinhard

    2009-01-01

    Binding assays based on the magnetorelaxation of superparamagnetic nanoparticles as markers are presented, utilizing a differential fluxgate system. Streptavidin and biotin are used as ligand and receptor, respectively. Superparamagnetic nanoparticles are functionalized with streptavidin and bound to two types of biotinylated analytes: agarose beads and bovine serum albumin (BSA) proteins. The size difference between the two analytes causes the binding reaction to progress differently, so the relaxation signal is analyzed differently for the two analytes. In addition, we studied the reaction kinetics of the two kinds of analytes with the fluxgate system.

  14. Promise and Pitfalls of Using Grain Size Analysis to Identify Glacial Sediments in Alpine Lake Cores.

    Science.gov (United States)

    Clark, D. H.

    2011-12-01

    Lakes fed by glacier outwash should have a clastic particle-size record distinct from non-glacial lakes in the same area, but do they? The unique turquoise color of alpine glacial lakes reflects the flux of suspended clastic glacial rock flour to those lakes; conversely, lakes not fed by outwash are generally clear, with sediments dominated by organics or slope-wash from nearby hillslopes. This contrast in sediment types and sources should produce a distinct and measurable difference in grain sizes between the two settings. Results from a variety of lakes suggest the actual situation is often more subtle and complex. I compare grain size results to other proxies to assess the value of grain size analysis for paleoglacier studies. Over the past 10 years, my colleagues and I have collected and analyzed sediment cores from a wide variety of lakes below small alpine glaciers in an attempt to constrain the timing and magnitude of alpine glaciation in those basins. The basic concept is that these lakes act as continuous catchments for any rock flour produced upstream by glacier abrasion; as a glacier grows, the flux of rock flour to the lake will also increase. If the glacier disappears entirely, rock flour deposition will also cease in short order. We have focused our research in basins with simple sedimentologic settings: mostly small, high-altitude, stripped granitic or metamorphic cirques in which the cirque glaciers are the primary source of clastic sediments. In most cases, the lakes are fed by meltwater from a modern glacier, but were ice free during the earlier Holocene. In such cases, the lake cores should record formation of and changes in activity of the glacier upstream. We used a Malvern Mastersizer 2000 laser particle size analyzer for our grain size analyses, as well as recording magnetic susceptibility, color, and organics for the same cores. The results indicate that although lakes often experience increases in silt and clay-size (<0.063 mm) clastic
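The screening idea described above, flagging rock-flour-rich core intervals by their excess fine (silt plus clay) fraction, can be sketched as a small calculation. The size bins, the 0.063 mm silt/sand cutoff convention, and the two distributions below are hypothetical, not data from the cores:

```python
import numpy as np

# Hypothetical grain-size distributions (volume % per size bin) for two
# core intervals; bin upper edges in mm. Glacial rock flour shows up as
# an excess of material finer than the silt/sand cutoff.
bins_mm    = np.array([0.002, 0.0063, 0.02, 0.063, 0.2, 0.63, 2.0])
glacial    = np.array([18, 22, 25, 20, 10,  4,  1], dtype=float)
nonglacial = np.array([ 4,  6, 10, 15, 30, 25, 10], dtype=float)

def fine_fraction(dist, bins, cutoff=0.063):
    """Percent of sediment finer than the silt/sand cutoff (0.063 mm)."""
    dist = 100 * dist / dist.sum()          # normalize to 100 %
    return dist[bins <= cutoff].sum()

print(round(fine_fraction(glacial, bins_mm), 1),
      round(fine_fraction(nonglacial, bins_mm), 1))
```

The large gap between the two fine fractions is what a rock-flour proxy hopes to see; the abstract's point is that real cores often blur this contrast.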

  15. Trimetazidine: Is it a promising drug for use in steatotic grafts?

    Institute of Scientific and Technical Information of China (English)

    Ismail Ben Mosbah; Araní Casillas-Ramírez; Carme Xaus; Anna Serafín; Joan Roselló-Catafau; Carmen Peralta

    2006-01-01

    AIM: Chronic organ-donor shortage has led to the acceptance of steatotic livers for transplantation, despite the higher risk of graft dysfunction or nonfunction associated with the ischemic preservation period of these organs. The present study evaluates the effects of trimetazidine (TMZ) in an isolated perfused liver model. METHODS: Steatotic and non-steatotic livers were preserved for 24 h in the University of Wisconsin (UW) solution with or without TMZ. Hepatic injury and function (transaminases, bile production and sulfobromophthalein (BSP) clearance) and factors potentially involved in the susceptibility of steatotic livers to ischemia-reperfusion (I/R) injury, including oxidative stress, mitochondrial damage, microcirculatory disturbances, and ATP depletion, were evaluated. RESULTS: Steatotic livers preserved in UW solution showed higher transaminase levels, lower bile production and lower BSP clearance compared with non-steatotic livers. Alterations in perfusion flow rate and vascular resistance, mitochondrial damage, and reduced ATP content were more evident in steatotic livers. TMZ addition to UW solution reduced hepatic injury and improved hepatic function in both types of liver, and protected against the mechanisms potentially responsible for the poor tolerance of steatotic livers to I/R. CONCLUSION: TMZ may constitute a useful approach in fatty liver surgery, limiting the inherent risk of steatotic liver failure following transplantation.

  16. Quantification of growth, yield and radiation use efficiency of promising cotton cultivars at varying nitrogen levels

    International Nuclear Information System (INIS)

    Wajid, A.; Ahmad, A.; Khaliq, T.; Alam, S.; Hussaun, A.; Hussain, K.; Naseem, W.; Usman, M.; Ahmad, S.

    2010-01-01

    Cotton cultivars' response to different doses of nitrogen in terms of radiation interception, canopy development, growth and seed yield was studied in 2006. The experiment was laid out in a randomized complete block design with split-plot arrangement under the climatic conditions of Bahawalpur. Data on seed yield, total dry matter (TDM), leaf area index (LAI), fraction of intercepted radiation (Fi), accumulated radiation interception during the growth season (Sa) and radiation use efficiency (RUE) were recorded. TDM followed a sigmoid growth curve for both cultivars and all nitrogen levels, and showed a strong relationship (R² = 0.98) with the accumulated intercepted radiation (Sa) for the season. The mean maximum fraction of incident PAR (Fi) reached 90% at the 120 days after sowing (DAS) harvest, due to maximum crop canopy development. Cultivar NIAB-111 produced 0.81 g m⁻² of TDM for each MJ of accumulated PAR, and nitrogen at the rate of 185 kg ha⁻¹ proved statistically better at converting radiation into dry matter production. (author)
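The quantities in this abstract (Fi, Sa, RUE, TDM) are linked by a standard radiation-interception calculation: Fi is commonly modeled with a Beer's-law canopy extinction term, Sa is the running sum of intercepted PAR, and TDM is RUE times Sa. The sketch below uses an invented daily PAR series, a hypothetical logistic LAI curve and an assumed extinction coefficient; only the RUE figure of 0.81 g per MJ is taken from the abstract:

```python
import numpy as np

# Hypothetical 120-day season: constant incident PAR and a logistic
# canopy (LAI) development curve; k is an assumed extinction coefficient.
days = np.arange(120)
par  = np.full(120, 8.0)                      # incident PAR, MJ m^-2 day^-1
lai  = 2.6 / (1 + np.exp(-(days - 60) / 12))  # leaf area index over time
k    = 0.9

fi  = 1 - np.exp(-k * lai)   # fraction of PAR intercepted (Beer's law)
sa  = np.cumsum(fi * par)    # accumulated intercepted PAR (Sa), MJ m^-2
rue = 0.81                   # g dry matter per MJ intercepted PAR (NIAB-111)
tdm = rue * sa               # total dry matter, g m^-2

print(round(float(fi[-1]), 2), round(float(sa[-1]), 1), round(float(tdm[-1]), 1))
```

With these assumed parameters the peak Fi lands near the 90% figure reported at 120 DAS, and TDM tracks Sa linearly by construction, mirroring the strong TDM-Sa relationship the abstract describes.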

  17. Renal Denervation Using an Irrigated Catheter in Patients with Resistant Hypertension: A Promising Strategy?

    International Nuclear Information System (INIS)

    Armaganijan, Luciana; Staico, Rodolfo; Moraes, Aline; Abizaid, Alexandre; Moreira, Dalmo; Amodeo, Celso; Sousa, Márcio; Borelli, Flávio; Armaganijan, Dikran; Sousa, J. Eduardo; Sousa, Amanda

    2014-01-01

    Systemic hypertension is an important public health problem and a significant cause of cardiovascular mortality. Its high prevalence and the low rates of blood pressure control have resulted in the search for alternative therapeutic strategies. Percutaneous renal sympathetic denervation emerged as a perspective in the treatment of patients with resistant hypertension. To evaluate the feasibility and safety of renal denervation using an irrigated catheter. Ten patients with resistant hypertension underwent the procedure. The primary endpoint was safety, as assessed by periprocedural adverse events, renal function and renal vascular abnormalities at 6 months. The secondary endpoints were changes in blood pressure levels (office and ambulatory monitoring) and in the number of antihypertensive drugs at 6 months. The mean age was 47.3 (± 12) years, and 90% of patients were women. In the first case, renal artery dissection occurred as a result of trauma due to the long sheath; no further cases were observed after technical adjustments, thus showing an effect of the learning curve. No cases of thrombosis/renal infarction or death were reported. Elevation of serum creatinine levels was not observed during follow-up. At 6 months, one case of significant renal artery stenosis with no clinical consequences was diagnosed. Renal denervation reduced office blood pressure levels by 14.6/6.6 mmHg, on average (p = 0.4 both for systolic and diastolic blood pressure). Blood pressure levels on ambulatory monitoring decreased by 28/17.6 mmHg (p = 0.02 and p = 0.07 for systolic and diastolic blood pressure, respectively). A mean reduction of 2.1 antihypertensive drugs was observed. Renal denervation is feasible and safe in the treatment of resistant systemic arterial hypertension. Larger studies are required to confirm our findings.

  18. Renal Denervation Using an Irrigated Catheter in Patients with Resistant Hypertension: A Promising Strategy?

    Energy Technology Data Exchange (ETDEWEB)

    Armaganijan, Luciana, E-mail: luciana-va@hotmail.com; Staico, Rodolfo; Moraes, Aline; Abizaid, Alexandre; Moreira, Dalmo; Amodeo, Celso; Sousa, Márcio; Borelli, Flávio; Armaganijan, Dikran; Sousa, J. Eduardo; Sousa, Amanda [Instituto Dante Pazzanese de Cardiologia, São Paulo, SP (Brazil)

    2014-04-15

    Systemic hypertension is an important public health problem and a significant cause of cardiovascular mortality. Its high prevalence and the low rates of blood pressure control have resulted in the search for alternative therapeutic strategies. Percutaneous renal sympathetic denervation emerged as a perspective in the treatment of patients with resistant hypertension. To evaluate the feasibility and safety of renal denervation using an irrigated catheter. Ten patients with resistant hypertension underwent the procedure. The primary endpoint was safety, as assessed by periprocedural adverse events, renal function and renal vascular abnormalities at 6 months. The secondary endpoints were changes in blood pressure levels (office and ambulatory monitoring) and in the number of antihypertensive drugs at 6 months. The mean age was 47.3 (± 12) years, and 90% of patients were women. In the first case, renal artery dissection occurred as a result of trauma due to the long sheath; no further cases were observed after technical adjustments, thus showing an effect of the learning curve. No cases of thrombosis/renal infarction or death were reported. Elevation of serum creatinine levels was not observed during follow-up. At 6 months, one case of significant renal artery stenosis with no clinical consequences was diagnosed. Renal denervation reduced office blood pressure levels by 14.6/6.6 mmHg, on average (p = 0.4 both for systolic and diastolic blood pressure). Blood pressure levels on ambulatory monitoring decreased by 28/17.6 mmHg (p = 0.02 and p = 0.07 for systolic and diastolic blood pressure, respectively). A mean reduction of 2.1 antihypertensive drugs was observed. Renal denervation is feasible and safe in the treatment of resistant systemic arterial hypertension. Larger studies are required to confirm our findings.

  19. PROMISE: parallel-imaging and compressed-sensing reconstruction of multicontrast imaging using SharablE information.

    Science.gov (United States)

    Gong, Enhao; Huang, Feng; Ying, Kui; Wu, Wenchuan; Wang, Shi; Yuan, Chun

    2015-02-01

    A typical clinical MR examination includes multiple scans to acquire images with different contrasts for complementary diagnostic information. The multicontrast scheme requires long scanning time. The combination of partially parallel imaging and compressed sensing (CS-PPI) has been used to reconstruct accelerated scans. However, there are several unsolved problems in existing methods. The target of this work is to improve existing CS-PPI methods for multicontrast imaging, especially for two-dimensional imaging. If the same field of view is scanned in multicontrast imaging, there is a significant amount of sharable information. It is proposed in this study to use manifold sharable information among multicontrast images to enhance CS-PPI in a sequential way. Coil sensitivity information and structure-based adaptive regularization, extracted from previously reconstructed images, were applied to enhance the following reconstructions. The proposed method is called Parallel-imaging and compressed-sensing Reconstruction Of Multicontrast Imaging using SharablE information (PROMISE). Using L1-SPIRiT as a CS-PPI example, results on multicontrast brain and carotid scans demonstrated that lower error levels and better detail preservation can be achieved by exploiting manifold sharable information. Moreover, the advantage of PROMISE persists even in the presence of interscan motion. Using the sharable information among multicontrast images can enhance CS-PPI with tolerance to motion. © 2014 Wiley Periodicals, Inc.
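PROMISE itself builds on L1-SPIRiT, but the compressed-sensing core that such methods rely on can be illustrated with the simplest sparse reconstruction, iterative soft-thresholding (ISTA), recovering a sparse signal from undersampled random measurements. This is a generic CS toy problem with invented sizes, not the PROMISE algorithm or a real MRI encoding:

```python
import numpy as np

rng = np.random.default_rng(1)

# Sparse ground truth (n coefficients, k non-zero) and an undersampled
# random measurement matrix A (m < n), standing in for the undersampled
# encoding of an accelerated scan.
n, m, k = 128, 64, 5
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
A = rng.normal(size=(m, n)) / np.sqrt(m)
b = A @ x_true

# ISTA: iterative soft-thresholding for min ||Ax - b||^2 + lam * ||x||_1
lam = 0.01
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / (largest singular value)^2
x = np.zeros(n)
for _ in range(500):
    grad = A.T @ (A @ x - b)                                 # gradient step
    z = x - step * grad
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0)   # soft threshold

rel_err = float(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
print(round(rel_err, 3))
```

Despite observing only half as many measurements as unknowns, the L1 penalty recovers the sparse signal accurately; CS-PPI methods add coil sensitivity and (in PROMISE) cross-contrast shared structure on top of this principle.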

  20. Analytical Implications of Using Practice Theory in Workplace Information Literacy Research

    Science.gov (United States)

    Moring, Camilla; Lloyd, Annemaree

    2013-01-01

    Introduction: This paper considers practice theory and the analytical implications of using this theoretical approach in information literacy research. More precisely the aim of the paper is to discuss the translation of practice theoretical assumptions into strategies that frame the analytical focus and interest when researching workplace…

  1. Procedure for hazards analysis of plutonium gloveboxes used in analytical chemistry operations

    International Nuclear Information System (INIS)

    Delvin, W.L.

    1977-06-01

    A procedure is presented to identify and assess hazards associated with gloveboxes used for analytical chemistry operations involving plutonium. This procedure is based upon analytic tree methodology and it has been adapted from the US Energy Research and Development Administration's safety program, the Management Oversight and Risk Tree.

  2. From Theory Use to Theory Building in Learning Analytics: A Commentary on "Learning Analytics to Support Teachers during Synchronous CSCL"

    Science.gov (United States)

    Chen, Bodong

    2015-01-01

    In this commentary on Van Leeuwen (2015, this issue), I explore the relation between theory and practice in learning analytics. Specifically, I caution against adhering to one specific theoretical doctrine while ignoring others, suggest deeper applications of cognitive load theory to understanding teaching with analytics tools, and comment on…

  3. Promising More Information

    Science.gov (United States)

    2003-01-01

    When NASA needed a real-time, online database system capable of tracking documentation changes in its propulsion test facilities, engineers at Stennis Space Center joined with ECT International, of Brookfield, Wisconsin, to create a solution. Through NASA's Dual-Use Program, ECT developed Exdata, a software program that works within the company's existing Promise software. Exdata not only satisfied NASA's requirements, but also expanded ECT's commercial product line. Promise, ECT's primary product, is an intelligent software program with specialized functions for designing and documenting electrical control systems. An add-on to AutoCAD software, Promise generates control system schematics, panel layouts, bills of material, wire lists, and terminal plans. The drawing functions include symbol libraries, macros, and automatic line breaking. Primary Promise customers include manufacturing companies, utilities, and other organizations with complex processes to control.

  4. BIG DATA ANALYTICS USE IN CUSTOMER RELATIONSHIP MANAGEMENT: ANTECEDENTS AND PERFORMANCE IMPLICATIONS

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas

    2016-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data analytics use (BD use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study aims to (1) determine whether organizational BD use improves customer-centric and financial outcomes, and (2) identify the factors influencing BD use. Drawing primarily from market...

  5. Identification of potentially safe promising fungal cell factories for the production of polyketide natural food colorants using chemotaxonomic rationale

    DEFF Research Database (Denmark)

    Mapari, Sameer Shamsuddin; Meyer, Anne S.; Thrane, Ulf

    2009-01-01

    Background: Colorants derived from natural sources look set to overtake synthetic colorants in market value as manufacturers continue to meet the rising demand for clean label ingredients, particularly in food applications. Many ascomycetous fungi naturally synthesize and secrete pigments and thus provide readily available additional and/or alternative sources of natural colorants that are independent of agro-climatic conditions. With an appropriately selected fungus, using in particular chemotaxonomy as a guide, the fungal natural colorants could be produced in high yields by using the optimized … has not yet been examined in detail. In addition, 4 out of the 10 chemotaxonomically selected promising Penicillium strains were shown to produce extracellular pigments in the liquid media using a solid support, indicating future cell factory possibilities for polyketide natural food colorants.

  6. USING THE ANALYTICAL HIERARCHY PROCESS TO SUPPORT SUSTAINABLE USE OF GEO-RESOURCES IN METROPOLITAN AREAS

    Institute of Scientific and Technical Information of China (English)

    Oswald MARINONI; Andreas HOPPE

    2006-01-01

    Sand and gravel are important raw materials which are needed for many civil engineering projects. Due to economic reasons, sand and gravel pits are frequently located in the periphery of metropolitan areas, which are often subject to competing land-use interests. As a contribution to land-use conflict solving, the Analytic Hierarchy Process (AHP) is applied within a Geographic Information System (GIS) environment. Two AHP preference matrix scenario constellations are evaluated and their results are used to create a land-use conflict map.

  7. Energy demand analytics using coupled technological and economic models

    Science.gov (United States)

    Impacts of a range of policy scenarios on end-use energy demand are examined using a coupling of MARKAL, an energy system model with extensive supply and end-use technological detail, with Inforum LIFT, a large-scale model of the U.S. economy with inter-industry, government, and c...

  8. Analysis of Decisions Made Using the Analytic Hierarchy Process

    Science.gov (United States)

    2013-09-01

    … country petroleum pipelines (Dey, 2003), deciding how best to manage U.S. watersheds (De Steiguer, Duberstein, and Lopes, 2003), and the U.S. Army … many benefits to its use. Primarily these fall under the heading of managing chaos. Specifically, the AHP is a tool that can be used to simplify and … originally. The commonly used scenario is this: the waiter asks if you want chicken or fish, and you reply fish. The waiter then remembers that steak is

  9. Improving web site performance using commercially available analytical tools.

    Science.gov (United States)

    Ogle, James A

    2010-10-01

    It is easy to accurately measure web site usage and to quantify key parameters such as page views, site visits, and more complex variables using commercially available tools that analyze web site log files and search engine use. This information can be used strategically to guide the design or redesign of a web site (templates, look-and-feel, and navigation infrastructure) to improve overall usability. The data can also be used tactically to assess the popularity and use of new pages and modules that are added and to rectify problems that surface. This paper describes software tools used to: (1) inventory search terms that lead to available content; (2) propose synonyms for commonly used search terms; (3) evaluate the effectiveness of calls to action; (4) conduct path analyses to targeted content. The American Academy of Orthopaedic Surgeons (AAOS) uses SurfRay's Behavior Tracking software (Santa Clara CA, USA, and Copenhagen, Denmark) to capture and archive the search terms that have been entered into the site's Google Mini search engine. The AAOS also uses Unica's NetInsight program to analyze its web site log files. These tools provide the AAOS with information that quantifies how well its web sites are operating and insights for making improvements to them. Although it is easy to quantify many aspects of an association's web presence, it also takes human involvement to analyze the results and then recommend changes. Without a dedicated resource to do this, the work often is accomplished only sporadically and on an ad hoc basis.
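The tactical analyses listed above (inventorying search terms, gauging page popularity) can be approximated with a few lines of scripting. A minimal sketch, assuming common-log-format request lines and a `q=` search parameter — both assumptions for illustration, not details of the AAOS setup:

```python
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Extract the requested URL from a common-log-format line.
LOG_RE = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+"')

def summarize(log_lines):
    """Count page views per path and tally search terms from a 'q=' parameter."""
    page_views, search_terms = Counter(), Counter()
    for line in log_lines:
        m = LOG_RE.search(line)
        if not m:
            continue
        url = urlparse(m.group(1))
        page_views[url.path] += 1
        for term in parse_qs(url.query).get("q", []):
            search_terms[term.lower()] += 1
    return page_views, search_terms

logs = [
    '127.0.0.1 - - [01/Oct/2010] "GET /search?q=Knee+Arthroscopy HTTP/1.1" 200 512',
    '127.0.0.1 - - [01/Oct/2010] "GET /topics/knee HTTP/1.1" 200 2048',
    '127.0.0.1 - - [01/Oct/2010] "GET /search?q=knee+arthroscopy HTTP/1.1" 200 512',
]
views, terms = summarize(logs)
```

Lower-casing the query terms is the kind of normalization that makes synonym proposals (point 2 above) possible.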

  10. Design Evaluation of Wind Turbine Spline Couplings Using an Analytical Model: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; Keller, J.; Wallen, R.; Errichello, R.; Halse, C.; Lambert, S.

    2015-02-01

    Articulated splines are commonly used in the planetary stage of wind turbine gearboxes for transmitting the driving torque and improving load sharing. Direct measurement of spline loads and performance is extremely challenging because of limited accessibility. This paper presents an analytical model for the analysis of articulated spline coupling designs. For a given torque and shaft misalignment, this analytical model quickly yields insights into relationships between the spline design parameters and resulting loads; bending, contact, and shear stresses; and safety factors considering various heat treatment methods. Comparisons of this analytical model against previously published computational approaches are also presented.

  11. An Evaluative Methodology for Virtual Communities Using Web Analytics

    Science.gov (United States)

    Phippen, A. D.

    2004-01-01

    The evaluation of virtual community usage and user behaviour has its roots in social science approaches such as interview, document analysis and survey. Little evaluation is carried out using traffic or protocol analysis. Business approaches to evaluating customer/business web site usage are more advanced, in particular using advanced web…

  12. Using an analytical geometry method to improve tiltmeter data presentation

    Science.gov (United States)

    Su, W.-J.

    2000-01-01

    The tiltmeter is a useful tool for geologic and geotechnical applications. To obtain full benefit from the tiltmeter, easy and accurate data presentations should be used. Unfortunately, the most commonly used method for tilt data reduction now may yield inaccurate and low-resolution results. This article describes a simple, accurate, and high-resolution approach developed at the Illinois State Geological Survey for data reduction and presentation. The orientation of tiltplates is determined first by using a trigonometric relationship, followed by a matrix transformation, to obtain the true amount of rotation change of the tiltplate at any given time. The mathematical derivations used for the determination and transformation are then coded into an integrated PC application by adapting the capabilities of commercial spreadsheet, database, and graphics software. Examples of data presentation from tiltmeter applications in studies of landfill covers, characterizations of mine subsidence, and investigations of slope stability are also discussed.
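The trigonometric-plus-matrix step described above can be illustrated with a small sketch. Assuming two orthogonal tilt components and a known plate azimuth (my notation; the article's actual derivation is more involved), a 2-D rotation matrix maps raw plate readings onto geographic axes:

```python
import math

def rotate_tilt(tilt_x, tilt_y, azimuth_deg):
    """Rotate orthogonal tilt components from the plate frame into
    geographic north/east axes using a standard 2-D rotation matrix."""
    th = math.radians(azimuth_deg)
    tilt_north = tilt_x * math.cos(th) - tilt_y * math.sin(th)
    tilt_east = tilt_x * math.sin(th) + tilt_y * math.cos(th)
    return tilt_north, tilt_east

north, east = rotate_tilt(0.30, 0.10, 45.0)   # illustrative readings, microradians
```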

  13. GIS-based Geotechnical Microzonation Mapping using Analytic ...

    African Journals Online (AJOL)

    Bheema

    … based … of parameters that are important for different multivariate decision making … The methodology used in this study is shown in the flow chart (Fig 4) … layer was prepared by considering the highest elevations of the static water levels.

  14. Supplier Selection Using Analytical Hierarchy Process at PT. Indolakto

    OpenAIRE

    Anggani, Putri Candra; Baihaqi, Imam

    2017-01-01

    The dairy supply chain is one of the food supply chains that has its own uncertainty in both upstream and downstream processes due to the durability of the product. The dairy market has a good demand trend, because supply is still below the consumption level. Indonesia uses imported dairy products rather than domestic ones, because the supply of domestic dairy is still below demand. So, there are opportunities for dairy companies to compete in this industry and reach competitive advantage by solving the ...
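As background to how AHP derives supplier or criterion weights: judgments are entered as a pairwise comparison matrix, the principal eigenvector gives the weights, and Saaty's consistency ratio checks the judgments. A minimal sketch with illustrative criteria and values, not PT. Indolakto's actual data:

```python
# Minimal AHP sketch: weights from a pairwise comparison matrix via power
# iteration, plus Saaty's consistency ratio. Judgments are illustrative.

def ahp_weights(matrix, iters=100):
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):                    # power iteration
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    aw = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam_max = sum(aw[i] / w[i] for i in range(n)) / n   # principal eigenvalue
    return w, lam_max

# Pairwise judgments (Saaty's 1-9 scale) for price, quality, delivery.
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 3.0],
     [1/5, 1/3, 1.0]]
weights, lam_max = ahp_weights(A)
ci = (lam_max - 3) / (3 - 1)   # consistency index
cr = ci / 0.58                 # random index RI = 0.58 for n = 3; CR < 0.1 is acceptable
```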

  15. Using a Merit-Based Scholarship Program to Increase Rates of College Enrollment in an Urban School District: The Case of the Pittsburgh Promise

    Science.gov (United States)

    Bozick, Robert; Gonzalez, Gabriella; Engberg, John

    2015-01-01

    The Pittsburgh Promise is a scholarship program that provides $5,000 per year toward college tuition for public high school graduates in Pittsburgh, Pennsylvania who earned a 2.5 GPA and a 90% attendance record. This study used a difference-in-difference design to assess whether the introduction of the Promise scholarship program directly…
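For readers unfamiliar with the design: a difference-in-difference estimate subtracts the change observed in a comparison group from the change observed in the treated group. A toy sketch with fabricated enrollment rates, not the study's data:

```python
# Difference-in-difference sketch: the program effect is the change in the
# treated district's college enrollment rate minus the change in a
# comparison district's rate over the same window. Rates are fabricated.

def diff_in_diff(treat_pre, treat_post, control_pre, control_post):
    return (treat_post - treat_pre) - (control_post - control_pre)

# Treated district rises 10 points, comparison rises 3: estimated effect 7 points.
effect = diff_in_diff(treat_pre=0.50, treat_post=0.60,
                      control_pre=0.52, control_post=0.55)
```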

  16. Use of tetracycline as complexing agent in analytical chemistry

    International Nuclear Information System (INIS)

    Nastasi, M.J.C.; Saiki, M.; Lima, F.W.

    1977-01-01

    The behavior of tetracycline as a complexing agent in solvent extraction studies is presented. The extraction curves for the lanthanide elements, scandium, thorium, uranium and neptunium are determined for the extraction system benzyl alcohol-tetracycline, as well as the acid and extractant dependences of extraction of the lanthanide elements. Separation of neptunium from uranium is performed by carrying out the extraction experiment at a proper pH value. Use is made of masking agents, namely ethylenediaminetetraacetic acid (EDTA) and diethylenetriaminepentaacetic acid (DTPA), in order to obtain separations of uranium from scandium and the lanthanides, as well as of uranium and thorium, respectively. The extraction experiments are carried out by using radioisotopes of each element, except for uranium, in which case the determinations are made by using epithermal neutron activation analysis

  17. Use of crown compounds and cryptands in analytical chemistry

    International Nuclear Information System (INIS)

    Blazius, Eh.; Yansen, K.P.

    1988-01-01

    Possibilities for the application of crown compounds and cryptands in analytical chemistry for the separation (extraction, chromatography) and determination of different cations and anions are considered. It is noted that monomeric cyclic polyethers are mainly used for the separation and determination of alkali and alkaline earth metals. Linear polymers of cyclic polyethers are used exclusively for the extraction of their salts. Cross-linked polymeric cyclic polyethers permit the separation and determination of most cations (including transition elements, rare earth elements and actinides), anions and organic compounds. 99 refs.; 10 figs.; 8 tabs

  18. A manual of analytical methods used at MINTEK

    International Nuclear Information System (INIS)

    Stoch, H.; Dixon, K.

    1983-01-01

    The manual deals with various methods for a wide range of elemental analyses. Some of the methods that are used include atomic absorption spectroscopy, optical emission spectroscopy and X-ray fluorescence spectroscopy. The basic characteristics of each method are given and the procedures are recorded step by step. One of the sections deals with methods associated with the recovery of uranium

  19. An Analytics Approach to Adaptive Maturity Models using Organizational Characteristics

    NARCIS (Netherlands)

    Baars, T.; Mijnhardt, F.; Vlaanderen, K.; Spruit, M.

    2016-01-01

    Ever since the first incarnations of maturity models, critics have voiced several concerns with these frameworks. Indeed, a lack of model fit and oversimplification of the real world can be attributed to the rigidity of these models, which assumes that each organization that uses the framework is

  1. Air pollution in Thailand using nuclear-related analytical techniques

    International Nuclear Information System (INIS)

    Leelhaphunt, N.; Chueinta, W.

    1994-01-01

    The methods of neutron activation, both instrumental and radiochemical, and atomic absorption spectrophotometry are used in a study of the concentrations of Al, As, Br, Cd, Cl, Co, Cr, Cu, Fe, Hg, Mn, Sb, Sc, Se, Si, V, Zn and Pb in airborne particulate matter collected from 7 permanent and 9 temporary air quality monitoring stations. The stations are located in urban residential, suburban residential, mixed (commercial and residential), commercial and industrial areas and near major roads in the Bangkok Metropolitan area. Air sampling is performed once a month for 24 hours continuously using the high volume air sampler (GMW 2000 H) and for 5, 10, and 15 days continuously using an Anderson Air Sampler (SIBATA AN-200). The elements As, Cd and Cu are determined destructively using ion exchange chromatography, while Hg and Se are determined by the dry combustion technique. The determination of Pb was done by atomic absorption spectrophotometry. The results for Pb concentrations in airborne particulate matter, collected during 1987 to 1991, were reported by the Office of the National Environment Board. Levels of Pb content were found to be lower than the National Ambient Air Quality Standards. (author). 3 refs, 4 tabs

  2. 1 Mapping Debris Flow Susceptibility using Analytical Network ...

    Indian Academy of Sciences (India)

    … activities in the past. In this study, one of the less explored heuristic methods known … can be associated in any prospective way, through feedbacks and inter-relationships … means of multivariate statistical techniques … susceptibility assessment using GIS and bivariate statistics: a case study in southern Italy.

  3. Analytical solutions of weakly coupled map lattices using recurrence relations

    Energy Technology Data Exchange (ETDEWEB)

    Sotelo Herrera, Dolores, E-mail: dsh@dfmf.uned.e [Applied Maths, EUITI, UPM, Ronda de Valencia, 3-28012 Madrid (Spain); San Martin, Jesus [Applied Maths, EUITI, UPM, Ronda de Valencia, 3-28012 Madrid (Spain); Dep. Fisica Matematica y de Fluidos, UNED, Senda del Rey 9-28040 Madrid (Spain)

    2009-07-20

    By using asymptotic methods, recurrence relations are found that rule weakly coupled CML evolution, with both global and diffusive coupling. The solutions obtained from these relations are very general because they do not impose restrictions on boundary conditions, initial conditions, or the number of oscillators in the CML. Furthermore, oscillators are ruled by an arbitrary C{sup 2} function.

  4. Assessing Vocal Performances Using Analytical Assessment: A Case Study

    Science.gov (United States)

    Gynnild, Vidar

    2016-01-01

    This study investigated ways to improve the appraisal of vocal performances within a national academy of music. Since a criterion-based assessment framework had already been adopted, the conceptual foundation of an assessment rubric was used as a guide in an action research project. The group of teachers involved wanted to explore thinking…

  5. Examining the Use of a Visual Analytics System for Sensemaking Tasks: Case Studies with Domain Experts.

    Science.gov (United States)

    Kang, Youn-Ah; Stasko, J

    2012-12-01

    While the formal evaluation of systems in visual analytics is still relatively uncommon, particularly rare are case studies of prolonged system use by domain analysts working with their own data. Conducting case studies can be challenging, but it can be a particularly effective way to examine whether visual analytics systems are truly helping expert users to accomplish their goals. We studied the use of a visual analytics system for sensemaking tasks on documents by six analysts from a variety of domains. We describe their application of the system along with the benefits, issues, and problems that we uncovered. Findings from the studies identify features that visual analytics systems should emphasize as well as missing capabilities that should be addressed. These findings inform design implications for future systems.

  6. Query optimization for graph analytics on linked data using SPARQL

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Seokyong [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lee, Sangkeun [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lim, Seung -Hwan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sukumar, Sreenivas R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Vatsavai, Ranga Raju [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-07-01

    Triplestores that support query languages such as SPARQL are emerging as the preferred and scalable solution to represent data and meta-data as massive heterogeneous graphs using Semantic Web standards. With increasing adoption, the desire to conduct graph-theoretic mining and exploratory analysis has also increased. Addressing that desire, this paper presents a solution that is the marriage of Graph Theory and the Semantic Web. We present software that can analyze Linked Data using graph operations such as counting triangles, finding eccentricity, testing connectedness, and computing PageRank directly on triple stores via the SPARQL interface. We describe the process of optimizing performance of the SPARQL-based implementation of such popular graph algorithms by reducing the space-overhead, simplifying iterative complexity and removing redundant computations by understanding query plans. Our optimized approach shows significant performance gains on triplestores hosted on stand-alone workstations as well as hardware-optimized scalable supercomputers such as the Cray XMT.
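To make one of the graph operations named above concrete, here is a toy in-memory triangle counter (my sketch; the paper's contribution is pushing such operations through the SPARQL interface onto triplestores at scale):

```python
# Toy triangle count over an undirected edge list. In the paper's setting
# the same operation is expressed as a SPARQL query against a triplestore.

def count_triangles(edges):
    """Count triangles in an undirected graph given as (u, v) pairs."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    hits = 0
    for u, v in edges:
        hits += len(adj[u] & adj[v])   # each common neighbour closes a triangle
    return hits // 3                   # every triangle is seen once per edge

edges = [("a", "b"), ("b", "c"), ("c", "a"), ("c", "d")]
```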

  7. Cost Optimization of Product Families using Analytic Cost Models

    DEFF Research Database (Denmark)

    Brunø, Thomas Ditlev; Nielsen, Peter

    2012-01-01

    This paper presents a new method for analysing the cost structure of a mass customized product family. The method uses linear regression and backwards selection to reduce the complexity of a data set describing a number of historical product configurations and incurred costs. By reducing the data set, the configuration variables which best describe the variation in product costs are identified. The method is tested using data from a Danish manufacturing company and the results indicate that the method is able to identify the most critical configuration variables. The method can be applied in product family redesign projects focusing on cost reduction to identify which modules contribute the most to cost variation and should thus be optimized.
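A hedged sketch of the regression-plus-backwards-selection idea (my own formulation, with fabricated configuration data, not the Danish company's): repeatedly drop the configuration variable whose removal raises the residual sum of squares the least, until any further drop would cost more than a tolerance.

```python
# Backward selection via ordinary least squares on the normal equations.

def ols_rss(X, y):
    """Residual sum of squares of the least-squares fit."""
    n, k = len(X), len(X[0])
    A = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(k)] for i in range(k)]
    b = [sum(X[r][i] * y[r] for r in range(n)) for i in range(k)]
    for col in range(k):                       # Gaussian elimination with pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for i in reversed(range(k)):               # back substitution
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return sum((y[r] - sum(X[r][j] * beta[j] for j in range(k))) ** 2 for r in range(n))

def backward_select(X, y, names, tol=1.0):
    keep = list(range(len(names)))
    rss = ols_rss([[row[j] for j in keep] for row in X], y)
    while len(keep) > 1:
        best_rss, drop = min((ols_rss([[row[j] for j in keep if j != d] for row in X], y), d)
                             for d in keep)
        if best_rss - rss > tol:               # dropping anything else costs too much
            break
        keep.remove(drop)
        rss = best_rss
    return [names[j] for j in keep]

# Fabricated data: cost = 5*motor + 2*housing; 'colour' carries no cost information.
X = [[1, 2, 1], [2, 0, 0], [3, 1, 0], [4, 2, 1], [5, 0, 1], [6, 1, 0]]
y = [5 * r[0] + 2 * r[1] for r in X]
critical = backward_select(X, y, ["motor", "housing", "colour"])
```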

  8. Using social media to communicate during crises: an analytic methodology

    Science.gov (United States)

    Greene, Marjorie

    2011-06-01

    The Emerging Media Integration Team at the Department of the Navy Office of Information (CHINFO) has recently put together a Navy Command Social Media Handbook designed to provide information needed to safely and effectively use social media. While not intended to be a comprehensive guide on command use of social media or to take the place of official policy, the Handbook provides a useful guide for navigating a dynamic communications environment. Social media are changing the way information is diffused and decisions are made, especially for Humanitarian Assistance missions when there is increased emphasis on Navy commands to share critical information with other Navy command sites, government, and official NGO (nongovernmental organization) sites like the American Red Cross. In order to effectively use social media to support such missions, the Handbook suggests creating a centralized location to funnel information. This suggests that as the community of interest (COI) grows during a crisis, it will be important to ensure that information is shared with appropriate organizations for different aspects of the mission such as evacuation procedures, hospital sites, location of seaports and airports, and other topics relevant to the mission. For example, in the first 14 days of the U.S. Southern Command's Haiti HA/DR (Humanitarian Assistance/Disaster Relief) mission, the COI grew to over 1,900 users. In addition, operational conditions vary considerably among incidents, and coordination between different groups is often set up in an ad hoc manner. What is needed is a methodology that will help to find appropriate people with whom to share information for particular aspects of a mission during a wide range of events related to the mission. CNA has developed such a methodology and we would like to test it in a small scale lab experiment.

  9. Automated drusen detection in retinal images using analytical modelling algorithms

    Directory of Open Access Journals (Sweden)

    Manivannan Ayyakkannu

    2011-07-01

    Background: Drusen are common features in the ageing macula associated with exudative Age-Related Macular Degeneration (ARMD). They are visible in retinal images and their quantitative analysis is important in the follow-up of ARMD. However, their evaluation is tedious and difficult to reproduce when performed manually. Methods: This article proposes a methodology for Automatic Drusen Deposits Detection and quantification in Retinal Images (AD3RI) by using digital image processing techniques. It includes an image pre-processing method to correct the uneven illumination and to normalize the intensity contrast with smoothing splines. The drusen detection uses a gradient-based segmentation algorithm that isolates drusen and provides basic drusen characterization to the modelling stage. The detected drusen are then fitted by modified Gaussian functions, producing a model of the image that is used to evaluate the affected area. Twenty-two images were graded by eight experts, with the aid of custom-made software, and compared with AD3RI. This comparison was based both on the total area and on a pixel-to-pixel analysis. The coefficient of variation, the intraclass correlation coefficient, the sensitivity, the specificity and the kappa coefficient were calculated. Results: The ground truth used in this study was the experts' average grading. In order to evaluate the proposed methodology three indicators were defined: AD3RI compared to the ground truth (A2G); each expert compared to the other experts (E2E); and a standard global threshold method compared to the ground truth (T2G). The results obtained for the three indicators, A2G, E2E and T2G, were: coefficient of variation 28.8%, 22.5% and 41.1%; intraclass correlation coefficient 0.92, 0.88 and 0.67; sensitivity 0.68, 0.67 and 0.74; specificity 0.96, 0.97 and 0.94; and kappa coefficient 0.58, 0.60 and 0.49, respectively. Conclusions: The gradings produced by AD3RI obtained an agreement

  10. Selected methods of waste monitoring using modern analytical techniques

    International Nuclear Information System (INIS)

    Hlavacek, I.; Hlavackova, I.

    1993-11-01

    Issues of the inspection and control of bituminized and cemented waste are discussed, and some methods of their nondestructive testing are described. Attention is paid to the inspection techniques, non-nuclear spectral techniques in particular, as employed for quality control of the wastes, waste concentrates, spent waste leaching solutions, as well as for the examination of environmental samples (waters and soils) from the surroundings of nuclear power plants. Some leaching tests used abroad for this purpose and practical analyses by the ICP-AES technique are given by way of example. The ICP-MS technique, which is unavailable in the Czech Republic, is routinely employed abroad for alpha nuclide measurements; examples of such analyses are also given. The next topic discussed includes the monitoring of organic acids and complexants to determine the degree of their thermal decomposition during the bituminization of wastes on an industrial line. All of the methods and procedures highlighted can be used as technical support during the monitoring of radioactive waste properties in industrial conditions, in the chemical and radiochemical analyses of wastes and related matter, in the calibration of nondestructive testing instrumentation, in the monitoring of contamination of the surroundings of nuclear facilities, and in trace analysis. (author). 10 tabs., 1 fig., 14 refs

  11. Quality improvement of biodiesel blends using different promising fuel additives to reduce fuel consumption and NO emission from CI engine

    International Nuclear Information System (INIS)

    Imdadul, H.K.; Rashed, M.M.; Shahin, M.M.; Masjuki, H.H.; Kalam, M.A.; Kamruzzaman, M.; Rashedul, H.K.

    2017-01-01

    Highlights: • Pentanol, EHN and DTBP are promising fuel additives for improving the properties of biodiesel blends. • The utilization of additives improved properties such as the cetane number, viscosity and oxidation stability. • BSFC, NO and smoke of the EHN- and DTBP-treated blends are improved by the addition of fuel additives. • Cylinder pressure and heat release rate are enhanced with EHN and DTBP addition. - Abstract: Considering the low cetane number of biodiesel blends and alcohols, the ignition promoter additives 2-ethylhexyl nitrate (EHN) and di-tertiary-butyl peroxide (DTBP) were used in this study at proportions of 1000 and 2000 ppm in diesel-biodiesel-pentanol blends. Five-carbon pentanol was used at a proportion of 10% with 20% jatropha biodiesel-70% diesel blends, and engine testing was carried out in a single-cylinder DI diesel engine. The fuel properties, engine performance, emissions and combustion were studied, and the effects of the two most widely used ignition promoters on engine behaviour were compared and analyzed. Experimental results indicated that fuel properties such as density (0.36–1.45%), viscosity (0.26–3.77%), oxidation stability (5.5–26.4%) and cetane number (2–14.58%) improved remarkably, with a moderate change in calorific value, for the pentanol and ignition promoter treated biodiesel blends, depending on the proportion used and the benchmark. The brake power (BP) developed very slightly (0.66–1.52%), remaining below that of diesel; however, the brake specific energy consumption (BSEC) decreased significantly (0.92–5.84%). Although mixing of pentanol increased nitric oxide (NO) (2.15% relative to JB20) while reducing hydrocarbon (HC), carbon monoxide (CO) and smoke, the addition of EHN and DTBP reduced the NO (2–4.62%) and smoke (3.45–15.5%) emissions while showing higher CO (1.3–9.15%) and HC (5.1–17.87%) emissions, based on the percentage of ignition promoter used. The NO emission

  12. Design of Magnetic Charged Particle Lens Using Analytical Potential Formula

    Science.gov (United States)

    Al-Batat, A. H.; Yaseen, M. J.; Abbas, S. R.; Al-Amshani, M. S.; Hasan, H. S.

    2018-05-01

    The aim of the current research was to exploit the potential of two cylindrical electric lenses to produce a mathematical model from which one can determine the magnetic field distribution of a charged particle objective lens. With the aid of Simulink in the MATLAB environment, Simulink models were built to determine the distribution of the target function and its related axial functions along the optical axis of the charged particle lens. The present study showed that the physical parameters (i.e., the maximum value, Bmax, and the half width W of the field distribution) and the objective properties of the charged particle lens are affected by varying the main geometrical parameter of the lens, namely the bore radius R.
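The abstract does not state the formula itself; as an illustrative stand-in, Glaser's bell-shaped field, a standard analytical model for the axial field of a magnetic lens, shows how Bmax and the half width W enter such a model (an assumption for illustration; the paper derives its model from two-cylinder electric lens potentials):

```python
# Glaser's bell-shaped axial field model:
#     B(z) = Bmax / (1 + (z/a)^2)
# The field falls to Bmax/2 at z = +/-a, so the half width is W = 2a.

def glaser_field(z, b_max, a):
    return b_max / (1.0 + (z / a) ** 2)

b_max, a = 0.5, 2.0          # illustrative peak field and shape parameter
half_width = 2 * a           # W: full width where B has fallen to half its maximum
```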

  13. Screening for triterpenoid saponins in plants using hyphenated analytical platforms

    DEFF Research Database (Denmark)

    Khakimov, Bekzod; Tseng, Li Hong; Godejohann, Markus

    2016-01-01

    Recently the number of studies investigating triterpenoid saponins has drastically increased due to their diverse and potentially attractive biological activities. Currently the literature contains chemical structures of a few hundred triterpenoid saponins of plant and animal origin. Triterpenoid saponins consist of a triterpene aglycone with one or more sugar moieties attached to it. However, due to similar physico-chemical properties, isolation and identification of a large diversity of triterpenoid saponins remain challenging. This study demonstrates a methodology to screen saponins using … of saponin profiles from intact plant extracts as well as saponin aglycone profiles from hydrolysed samples. Continuously measured 1D proton NMR data during LC separation along with mass spectrometry data revealed significant differences, including contents of saponins, types of aglycones and numbers …

  14. Discovering Hidden Controlling Parameters using Data Analytics and Dimensional Analysis

    Science.gov (United States)

    Del Rosario, Zachary; Lee, Minyong; Iaccarino, Gianluca

    2017-11-01

    Dimensional Analysis is a powerful tool, one which takes a priori information and produces important simplifications. However, if this a priori information - the list of relevant parameters - is missing a relevant quantity, then the conclusions from Dimensional Analysis will be incorrect. In this work, we present novel conclusions in Dimensional Analysis, which provide a means to detect this failure mode of missing or hidden parameters. These results are based on a restated form of the Buckingham Pi theorem that reveals a ridge function structure underlying all dimensionless physical laws. We leverage this structure by constructing a hypothesis test based on sufficient dimension reduction, allowing for an experimental data-driven detection of hidden parameters. Both theory and examples will be presented, using classical turbulent pipe flow as the working example. Keywords: experimental techniques, dimensional analysis, lurking variables, hidden parameters, buckingham pi, data analysis. First author supported by the NSF GRFP under Grant Number DGE-114747.
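The mechanics behind the Buckingham Pi theorem can be made concrete: the exponent vectors of the dimensionless groups span the null space of the dimension matrix. A minimal NumPy sketch (for the pipe-flow variables diameter, velocity, density and viscosity; this is an illustration of the theorem, not the authors' hypothesis-test code) recovers the Reynolds number:

```python
import numpy as np

# Dimension matrix for pipe-flow variables; columns are
# D (diameter), V (velocity), rho (density), mu (dynamic viscosity),
# rows are the exponents of mass M, length L and time T.
A = np.array([
    [0.0,  0.0,  1.0,  1.0],   # M:  rho ~ M L^-3,  mu ~ M L^-1 T^-1
    [1.0,  1.0, -3.0, -1.0],   # L
    [0.0, -1.0,  0.0, -1.0],   # T
])

# Dimensionless groups correspond to the null space of A.
_, s, vt = np.linalg.svd(A)
rank = int((s > 1e-10).sum())
null_vec = vt[rank]                 # here the null space is one-dimensional

pi_group = null_vec / null_vec[0]   # normalize so the exponent of D is 1
# pi_group is [1, 1, 1, -1]: the Reynolds number Re = D*V*rho/mu
```

A missing (hidden) parameter shows up exactly as in the abstract: if a relevant variable is left out of the columns of A, the recovered groups cannot collapse the experimental data onto a single curve.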

  15. The Use and Abuse of Limits of Detection in Environmental Analytical Chemistry

    Directory of Open Access Journals (Sweden)

    Richard J. C. Brown

    2008-01-01

    The limit of detection (LoD) serves as an important method performance measure that is useful for the comparison of measurement techniques and the assessment of likely signal-to-noise performance, especially in environmental analytical chemistry. However, the LoD is only truly related to the precision characteristics of the analytical instrument employed for the analysis and the content of analyte in the blank sample. This article discusses how other criteria, such as sampling volume, can serve to distort the quoted LoD artificially and make comparison between various analytical methods inequitable. In order to compare LoDs between methods properly, it is necessary to state clearly all of the input parameters relating to the measurements that have been used in the calculation of the LoD. Additionally, the article argues that the use of LoDs in contexts other than the comparison of the attributes of analytical methods, in particular when reporting analytical results, may be confusing and less informative than quoting the actual result with an accompanying statement of uncertainty, and may bias descriptive statistics.
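As a concrete illustration of how the quoted number depends on stated inputs, here is the common 3-sigma-of-the-blank LoD estimate in Python, with invented blank readings and calibration slope; the preconcentration factor shows how a sampling choice can shift the "effective" LoD without any change in instrument precision:

```python
import numpy as np

# Replicate blank measurements (invented instrument responses)
blank_signals = np.array([0.021, 0.018, 0.025, 0.020, 0.019,
                          0.023, 0.022, 0.017, 0.024, 0.021])
slope = 0.050                    # calibration sensitivity, signal per (ng/mL)

s_blank = blank_signals.std(ddof=1)
lod = 3.0 * s_blank / slope      # LoD in ng/mL, the common 3-sigma estimate

# A 10-fold preconcentration (a sampling choice, not an instrument property)
# scales the effective LoD down by the same factor:
preconcentration = 10.0
lod_effective = lod / preconcentration
```

Two methods quoting `lod_effective` and `lod` respectively would look an order of magnitude apart even with identical instrument precision, which is exactly the comparability problem the article raises.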

  16. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    Science.gov (United States)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission-critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provides insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor are simulated, and we use Big Data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site, and builds on a big data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytics solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determining practical analytic, visualization, and predictive technologies.
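The alert-on-degradation idea can be reduced to a toy example: a rolling-baseline z-score detector over a simulated sensor stream. This is a deliberately simplified stand-in for the Big Data pipeline described, with an injected step fault at sample 400:

```python
import numpy as np

rng = np.random.default_rng(0)
readings = rng.normal(loc=100.0, scale=2.0, size=500)  # healthy sensor baseline
readings[400:] -= 15.0                                 # injected degradation step

window, threshold = 50, 4.0
alerts = []
for t in range(window, len(readings)):
    baseline = readings[t - window:t]                  # trailing reference window
    z = (readings[t] - baseline.mean()) / baseline.std(ddof=1)
    if abs(z) > threshold:
        alerts.append(t)
```

In a real deployment the same logic runs incrementally over streaming windows, and the threshold crossing feeds the alerting and visualization layers.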

  17. Predicting appointment misses in hospitals using data analytics

    Science.gov (United States)

    Karpagam, Sylvia; Ma, Nang Laik

    2017-01-01

    Background Non-attendance in hospitals and its clinical and economic consequences have attracted growing attention over the last few years, and several studies have documented its various aspects. Project Predicting Appointment Misses (PAM) was started with the intention of predicting which patients would not come for appointments after making bookings. Methods Historic hospital appointment data merged with a "distance from hospital" variable were used to run Logistic Regression, Support Vector Machine and Recursive Partitioning to determine the variables contributing to missed appointments. Results Class-, time- and demographics-related variables have an effect on the target variable; however, prediction models may not perform effectively because the influence on the target variable is very subtle. Previously assumed major contributors such as "age" and "distance" did not have a major effect on the target variable. Conclusions With the given data it is very difficult to make even a moderately strong prediction of appointment misses. That said, with the help of the cut-off we are able to capture all of the appointment misses, albeit at the cost of also capturing actualized appointments. PMID:28567409
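For readers unfamiliar with the modelling step, here is a self-contained logistic-regression sketch on synthetic appointment data using plain NumPy gradient descent; the feature names and effect sizes are invented for illustration and are not taken from the PAM data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400
lead_time = rng.uniform(0, 60, n)     # days between booking and appointment
distance = rng.uniform(0, 30, n)      # km from hospital (invented feature)

# Synthetic ground truth: longer lead times raise the no-show probability
true_logit = -2.0 + 0.08 * lead_time + 0.01 * distance
miss = rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))

X = np.column_stack([np.ones(n), lead_time, distance])
w = np.zeros(3)
for _ in range(5000):                 # plain batch gradient descent
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.001 * (X.T @ (p - miss)) / n

p_hat = 1.0 / (1.0 + np.exp(-X @ w))  # fitted miss probabilities
```

Choosing a cut-off on `p_hat` then trades sensitivity (capturing all misses) against false positives (flagging actualized appointments), which is the trade-off the Conclusions describe.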

  18. Tailoring of analytical performances of urea biosensors using nanomaterials

    International Nuclear Information System (INIS)

    Nouira, W; Barhoumi, H; Maaref, A; Renault, N Jaffrézic; Siadat, M

    2013-01-01

    This paper is a contribution to the study of enzymatic sensors based on nanoparticles of iron oxide (FeNPs). Urease enzyme was immobilized on FeNPs using the layer-by-layer (LbL) deposition method. FeNPs were first coated with polyelectrolytes (PE), poly(allylamine hydrochloride) (PAH) and poly(sodium 4-styrenesulfonate) (PSS), for enzyme immobilization and then with the enzyme. Zeta potential measurements of the FeNPs confirmed that the enzyme is immobilized on the surface. We evaluated the sensitivity of the biosensors for urea by potentiometric and capacitive measurements on silicon/silica/FeNP-LbL-urease structures. The recorded capacitance-potential (C-V) curves show a significant shift of the flat-band potential towards negative potentials in the presence of urea; the observed sensitivity values vary between 30 and 40 mV/p[urea]. It has been shown that the proposed method for the immobilization of urease can increase the dynamic range of urea detection (10^-4 M to 10^-1 M) compared to the immobilization of urease without FeNPs (10^-3.5 M to 10^-2.5 M). When the number of PAH-PSS layers was increased, the detection sensitivity was modified. This effect is due to partial inhibition of the enzyme in the presence of FeNPs, which was shown by measurements in homogeneous phase.

  19. Radioimmunoassay of erythropoietin: analytical performance and clinical use in hematology.

    Science.gov (United States)

    Schlageter, M H; Toubert, M E; Podgorniak, M P; Najean, Y

    1990-10-01

    We report here the performance of a recently commercialized radioimmunoassay kit for determining erythropoietin (EPO) in serum or plasma. The lower detection limit of the method was 3 U/L. Precision, assessed by coefficients of variation between different assay runs and within the same experiment, was always less than 10%; accuracy was assessed by recovery and dilution tests. In anemic patients (hematocrit 18-39%), the concentration of EPO was logarithmically related to hematocrit. A relatively large dispersion of the results was noted, as reported by others with various RIAs. Patients with severe renal failure demonstrated very low EPO values, whatever the degree of their anemia. In some chronic anemias resulting from malignancy, EPO concentrations were also relatively low. In the polycythemia vera group, the mean EPO was below normal for greater than 95% of the patients, whatever their clinical stage (first evaluation, relapse, or remission). In contrast, 91% of the patients with pure erythrocytosis had a normal or increased EPO value, even when the etiology was unknown. Measurement of EPO concentration may be useful for the clinical differentiation of myeloproliferative disorders and, subsequently, for their prognosis and choice of treatment.

  20. Approximate Analytic Solutions for the Two-Phase Stefan Problem Using the Adomian Decomposition Method

    Directory of Open Access Journals (Sweden)

    Xiao-Ying Qin

    2014-01-01

    An Adomian decomposition method (ADM) is applied to solve a two-phase Stefan problem that describes the pure metal solidification process. In contrast to traditional analytical methods, ADM avoids complex mathematical derivations and does not require coordinate transformation for elimination of the unknown moving boundary. Based on polynomial approximations for some known and unknown boundary functions, approximate analytic solutions for the model with undetermined coefficients are obtained using ADM. Substitution of these expressions into the other equations and boundary conditions of the model generates function identities with the undetermined coefficients. By determining these coefficients, approximate analytic solutions for the model are obtained. A concrete example of the solution shows that this method can easily be implemented in MATLAB and has a fast convergence rate. This is an efficient method for finding approximate analytic solutions for the Stefan and the inverse Stefan problems.
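To show the flavour of ADM on a much simpler problem than the Stefan model, the sketch below solves y' = y², y(0) = 1 with exact rational arithmetic: Adomian polynomials A_n for the nonlinearity y², term-by-term integration, and a partial sum that reproduces the Taylor series of the exact solution 1/(1 − t):

```python
from fractions import Fraction

def integrate(poly):
    """Antiderivative with zero constant term; poly[k] is the t^k coefficient."""
    return [Fraction(0)] + [c / (k + 1) for k, c in enumerate(poly)]

def poly_add(a, b):
    n = max(len(a), len(b))
    a = a + [Fraction(0)] * (n - len(a))
    b = b + [Fraction(0)] * (n - len(b))
    return [x + y for x, y in zip(a, b)]

def poly_mul(a, b):
    out = [Fraction(0)] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

# ADM for y' = y^2, y(0) = 1.
# Adomian polynomials of N(y) = y^2:  A_n = sum_{i+j=n} y_i * y_j
# Recursion:                          y_{n+1}(t) = integral_0^t A_n ds
terms = [[Fraction(1)]]                       # y_0 = initial condition
for n in range(5):
    A_n = [Fraction(0)]
    for i in range(n + 1):
        A_n = poly_add(A_n, poly_mul(terms[i], terms[n - i]))
    terms.append(integrate(A_n))

partial_sum = [Fraction(0)]
for y_n in terms:
    partial_sum = poly_add(partial_sum, y_n)
# partial_sum is 1 + t + t^2 + ... + t^5, the truncated series of 1/(1 - t)
```

Each y_n here is simply t^n, so the partial sums converge geometrically on |t| < 1, illustrating the fast convergence the abstract mentions.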

  1. Removal of heavy metals from polluted soil using the citric acid fermentation broth: a promising washing agent.

    Science.gov (United States)

    Zhang, Hongjiao; Gao, Yuntao; Xiong, Huabin

    2017-04-01

    A citric acid fermentation broth was prepared and employed for washing remediation of heavy metal-polluted soil. A well-defined washing effect was obtained: the removal percentages using the citric acid fermentation broth were 48.2% for Pb, 30.6% for Cu, 43.7% for Cr, and 58.4% for Cd, higher than those obtained using a citric acid solution. The kinetics of heavy metal desorption can be described by the double constant equation and the Elovich equation and correspond to a heterogeneous diffusion process. Speciation analysis shows that the citric acid fermentation broth can effectively reduce the bioavailability and environmental risk of heavy metals. Spectroscopic characterization suggests that the washing method has only a small effect on the mineral composition and does not destroy the framework of the soil system. Therefore, the citric acid fermentation broth is a promising washing agent with good washing performance and potential practical application value in the remediation of soils.

  2. Accurate quantification of endogenous androgenic steroids in cattle's meat by gas chromatography mass spectrometry using a surrogate analyte approach

    International Nuclear Information System (INIS)

    Ahmadkhaniha, Reza; Shafiee, Abbas; Rastkari, Noushin; Kobarfard, Farzad

    2009-01-01

    Determination of endogenous steroids in complex matrices such as cattle's meat is a challenging task. Since endogenous steroids always exist in animal tissues, no analyte-free matrices for constructing the standard calibration line are available, which is crucial for accurate quantification, especially at trace level. Although some methods have been proposed to solve the problem, none has offered a complete solution. To this aim, a new quantification strategy was developed in this study, named the 'surrogate analyte approach', based on using isotope-labeled standards instead of the natural form of endogenous steroids for preparing the calibration line. In comparison with the other methods currently in use for the quantitation of endogenous steroids, this approach provides improved simplicity and speed for analysis on a routine basis. The accuracy of this method is better than that of other methods at low concentrations and comparable to standard addition at medium and high concentrations. The method was also found to be valid according to the ICH criteria for bioanalytical methods. The developed method could be a promising approach in the field of residue analysis.
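The arithmetic of the surrogate-analyte idea is just a calibration line built from the isotope-labeled standard plus a response-factor correction for the natural analyte. A hypothetical NumPy sketch, with all concentrations and responses invented:

```python
import numpy as np

# Calibration: isotope-labeled (surrogate) standard spiked into the real
# matrix, since no analyte-free matrix exists for the natural steroid.
conc_labeled = np.array([0.5, 1.0, 2.0, 5.0, 10.0])       # ng/g (invented)
resp_labeled = np.array([0.26, 0.51, 1.03, 2.54, 5.08])   # peak-area ratios

slope, intercept = np.polyfit(conc_labeled, resp_labeled, 1)

# Response factor: measured once in solvent, it corrects for any response
# difference between the labeled surrogate and the natural analyte.
response_factor = 0.98

sample_response = 1.40                                     # unknown sample
conc_sample = (sample_response - intercept) / slope / response_factor
```

The calibration points come from the labeled compound only, so the endogenous background never biases the line; the response factor carries the labeled-to-natural conversion.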

  3. Changes in Visual/Spatial and Analytic Strategy Use in Organic Chemistry with the Development of Expertise

    Science.gov (United States)

    Vlacholia, Maria; Vosniadou, Stella; Roussos, Petros; Salta, Katerina; Kazi, Smaragda; Sigalas, Michael; Tzougraki, Chryssa

    2017-01-01

    We present two studies that investigated the adoption of visual/spatial and analytic strategies by individuals at different levels of expertise in the area of organic chemistry, using the Visual Analytic Chemistry Task (VACT). The VACT allows the direct detection of analytic strategy use without drawing inferences about underlying mental…

  4. USING MOBILE PHONES TO PROMOTE LIFE SKILLS EDUCATION AMONG OPEN SCHOOLING STUDENTS: Promises, Possibilities, and Potential Strategies

    Directory of Open Access Journals (Sweden)

    Pradeep Kumar MISRA

    2013-07-01

    Across the globe, life skills education has usually been developed as part of school initiatives designed to support the healthy psychosocial development of children and adolescents. On the other hand, the formal education system does not always provide young people with good opportunities to become confident and realize their potential. Against this backdrop, the biggest challenge is to identify the best strategies for providing effective life skills education to the many children who never attend secondary school or who reach an age of high vulnerability and risk-taking behaviour in the years immediately before reaching secondary school. Considering that in different parts of the world the majority of young people have, or soon will have, a mobile phone, the researcher is of the view that mobile phones can be a viable option for offering life skills education to open schooling students coming from different cultural and social settings and backgrounds. Following this approach, the present paper discusses: the promises offered by mobile phones for life skills education; the possibilities for using mobile phones as an effective, efficient and economical option for offering life skills education; and potential strategies to offer mobile-phone-supported life skills education to open schooling students.

  5. An approach to estimate spatial distribution of analyte within cells using spectrally-resolved fluorescence microscopy

    Science.gov (United States)

    Sharma, Dharmendar Kumar; Irfanullah, Mir; Basu, Santanu Kumar; Madhu, Sheri; De, Suman; Jadhav, Sameer; Ravikanth, Mangalampalli; Chowdhury, Arindam

    2017-03-01

    While fluorescence microscopy has become an essential tool amongst chemists and biologists for the detection of various analytes within cellular environments, the non-uniform spatial distribution of sensors within cells often restricts extraction of reliable information on the relative abundance of analytes in different subcellular regions. As an alternative to existing sensing methodologies such as ratiometric or FRET imaging, where the relative proportion of analyte with respect to the sensor can be obtained within cells, we propose a methodology using spectrally-resolved fluorescence microscopy, via which both the relative abundance of the sensor as well as its relative proportion with respect to the analyte can be simultaneously extracted for local subcellular regions. This method is exemplified using a BODIPY sensor, capable of detecting mercury ions within cellular environments, characterized by a spectral blue-shift and concurrent enhancement of emission intensity. Spectral emission envelopes collected from sub-microscopic regions allowed us to compare the shift in transition energies as well as integrated emission intensities within various intracellular regions. Construction of a 2D scatter plot using spectral shifts and emission intensities, which depend on the relative amount of analyte with respect to sensor and the approximate local amounts of the probe, respectively, enabled qualitative extraction of the relative abundance of analyte in various local regions within a single cell as well as amongst different cells. Although the comparisons remain semi-quantitative, this approach involving analysis of multiple spectral parameters opens up an alternative way to extract the spatial distribution of analyte in heterogeneous systems. The proposed method would be especially relevant for fluorescent probes that undergo a relatively nominal shift in transition energies compared to their emission bandwidths, which often restricts their usage for quantitative ratiometric imaging in…

  6. Analytical solution using computer algebra of a biosensor for detecting toxic substances in water

    Science.gov (United States)

    Rúa Taborda, María Isabel

    2014-05-01

    In a relatively recent paper, an electrochemical biosensor for water toxicity detection based on a bio-chip as a whole cell was proposed, numerically solved and analyzed. In that paper the kinetic processes in a miniaturized electrochemical biosensor system were described using the equations for a specific enzymatic reaction and the diffusion equation. The numerical solution showed excellent agreement with the measured data, but such a numerical solution is not enough to design the corresponding bio-chip efficiently. For this reason an analytical solution is needed. The object of the present work is to provide such an analytical solution and then to give algebraic guides for designing the biosensor. The analytical solution is obtained using computer algebra software, specifically Maple. The method of solution is the Laplace transform, with the Bromwich integral and the residue theorem. The final solution is given as a series of Bessel functions, and the effective time for the biosensor is computed. It is claimed that the analytical solutions obtained will be very useful for predicting current variations in similar systems with different geometries, materials and biological components. Besides this, the analytical solution we provide is very useful for investigating the relationship between different chamber parameters such as cell radius and height, and electrode radius.

  7. Analytical characterization of high-level mixed wastes using multiple sample preparation treatments

    International Nuclear Information System (INIS)

    King, A.G.; Baldwin, D.L.; Urie, M.W.; McKinley, S.G.

    1994-01-01

    The Analytical Chemistry Laboratory at the Pacific Northwest Laboratory in Richland, Washington, is actively involved in performing analytical characterization of high-level mixed waste from Hanford's single-shell and double-shell tank characterization programs. A full suite of analyses is typically performed on homogenized tank core samples. These analytical techniques include inductively-coupled plasma-atomic emission spectroscopy, total organic carbon methods and radiochemistry methods, as well as many others, all requiring some type of remote sample-preparation treatment to solubilize the tank sludge material for analysis. Most of these analytical methods typically use a single sample-preparation treatment, inherently providing elemental information only. To better understand and interpret tank chemistry and assist in identifying chemical compounds, selected analytical methods are performed using multiple sample-preparation treatments. The sample-preparation treatments used at Pacific Northwest Laboratory for this work with high-level mixed waste include caustic fusion, acid digestion, and water leach. The type of information available by comparing results from different sample-preparation treatments includes evidence for the presence of refractory compounds, acid-soluble compounds, or water-soluble compounds. Problems unique to the analysis of Hanford tank wastes are discussed. Selected results from the Hanford single-shell ferrocyanide tank, 241-C-109, are presented, and the resulting conclusions are discussed.

  8. An analytically resolved model of a potato's thermal processing using Heun functions

    Science.gov (United States)

    Vargas Toro, Agustín

    2014-05-01

    A potato's thermal processing model is solved analytically. The model is formulated using the equation of heat diffusion for a spherical potato processed in a furnace, assuming that the potato's thermal conductivity is radially modulated. The model is solved using the method of the Laplace transform, applying the Bromwich integral and the residue theorem. The temperature profile in the potato is presented as an infinite series of Heun functions. All computations are performed with computer algebra software, specifically Maple. Using the numerical values of the thermal parameters of the potato and the geometric and thermal parameters of the processing furnace, the time evolution of the temperature in different regions inside the potato is presented analytically and graphically. The duration of thermal processing required to achieve a specified effect on the potato is computed. It is expected that the obtained analytical results will be important in food engineering and cooking engineering.

  9. Identification of "At Risk" Students Using Learning Analytics: The Ethical Dilemmas of Intervention Strategies in a Higher Education Institution

    Science.gov (United States)

    Lawson, Celeste; Beer, Colin; Rossi, Dolene; Moore, Teresa; Fleming, Julie

    2016-01-01

    Learning analytics is an emerging field in which sophisticated analytic tools are used to inform and improve learning and teaching. Researchers within a regional university in Australia identified an association between interaction and student success in online courses and subsequently developed a learning analytics system aimed at informing…

  10. Analytical characterization using surface-enhanced Raman scattering (SERS) and microfluidic sampling

    International Nuclear Information System (INIS)

    Wang, Chao; Yu, Chenxu

    2015-01-01

    With the rapid development of analytical techniques, it has become much easier to detect chemical and biological analytes, even at very low detection limits. In recent years, techniques based on vibrational spectroscopy, such as surface-enhanced Raman spectroscopy (SERS), have been developed for non-destructive detection of pathogenic microorganisms. SERS is a highly sensitive analytical tool that can be used to characterize chemical and biological analytes interacting with SERS-active substrates. However, it has always been a challenge to obtain consistent and reproducible SERS spectroscopic results under complicated experimental conditions. Microfluidics, a tool for highly precise manipulation of small-volume liquid samples, can be used to overcome the major drawbacks of SERS-based techniques. High reproducibility of SERS measurements can be obtained in the continuous flow generated inside microfluidic devices. This article provides a thorough review of the principles, concepts and methods of SERS-microfluidic platforms, and the applications of such platforms in trace analysis of chemical and biological analytes. (topical review)

  11. Ebselen, a useful tool for understanding cellular redox biology and a promising drug candidate for use in human diseases.

    Science.gov (United States)

    Noguchi, Noriko

    2016-04-01

    Ebselen is an organoselenium compound with glutathione peroxidase (GPx)-like hydroperoxide reducing activity. Moreover, ebselen has its own unique reactivity, with functions that GPx does not have, since it reacts with many kinds of thiols other than glutathione. Ebselen may affect the thioredoxin systems, through which it may contribute to regulation of cell function. With high reactivity toward thiols, hydroperoxides, and peroxynitrite, ebselen has been used as a useful tool in research on cellular redox mechanisms. Unlike α-tocopherol, ebselen does not scavenge lipid peroxyl radicals, which is another advantage of ebselen for use as a research tool in comparison with radical scavenging antioxidants. Selenium is not released from the ebselen molecule, which explains the low toxicity of ebselen. To further understand the mechanism of cellular redox biology, it should be interesting to compare the effects of ebselen with that of selenoprotein P, which supplies selenium to GPx. New medical applications of ebselen as a drug candidate for human diseases such as cancer and diabetes mellitus as well as brain stroke and ischemia will be expected. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Elicited vs. voluntary promises

    NARCIS (Netherlands)

    Ismayilov, H.; Potters, Jan

    2017-01-01

    We set up an experiment with pre-play communication to study the impact of promise elicitation by trustors from trustees on trust and trustworthiness. When given the opportunity a majority of trustors solicits a promise from the trustee. This drives up the promise making rate by trustees to almost

  13. Analytic confidence level calculations using the likelihood ratio and Fourier transform

    International Nuclear Information System (INIS)

    Hu Hongbo; Nielsen, J.

    2000-01-01

    The interpretation of new particle search results involves a confidence level calculation on either the discovery hypothesis or the background-only ('null') hypothesis. A typical approach uses toy Monte Carlo experiments to build an expected experiment estimator distribution against which an observed experiment's estimator may be compared. In this note, a new approach is presented which calculates analytically the experiment estimator distribution via a Fourier transform, using the likelihood ratio as an ordering estimator. The analytic approach enjoys an enormous speed advantage over the toy Monte Carlo method, making it possible to quickly and precisely calculate confidence level results
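The core of the Fourier trick can be shown for a single counting channel: the total estimator for a Poisson number of events is compound-Poisson distributed, and its distribution follows from one FFT by exponentiating the single-event characteristic function, with no toy Monte Carlo. A simplified NumPy illustration (each event contributing unit weight, so the result must reduce to a plain Poisson distribution, which makes it easy to check; this is not the paper's full procedure):

```python
import numpy as np

# Expected number of (background) events
mu = 2.0

# Per-event estimator distribution on an integer grid, long enough that the
# compound distribution does not wrap around the FFT boundary.
size = 64
per_event = np.zeros(size)
per_event[1] = 1.0            # here every event contributes estimator value 1

# Characteristic function of one event, then of the Poisson-many-event sum:
# E[e^{ikS}] = exp(mu * (phi(k) - 1))
phi = np.fft.fft(per_event)
dist = np.fft.ifft(np.exp(mu * (phi - 1.0))).real
# With unit weights this reduces to a Poisson(mu) distribution,
# e.g. dist[0] = exp(-mu)
```

A confidence level is then just a tail sum of `dist` beyond the observed estimator; for realistic estimators, `per_event` carries the binned single-event likelihood-ratio weights instead of a unit spike.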

  14. Determination of Lineaments of the Sea of Marmara using Normalized Derivatives and Analytic Signals

    International Nuclear Information System (INIS)

    Oruc, B.

    2007-01-01

    The normalized derivatives and analytic signals calculated from a magnetic anomaly map provide useful results for structural interpretation. The effectiveness of the methods in resolving lineaments has been tested on the edges of a thin-plate model. For the field data, the magnetic anomaly map observed in the middle section of the Sea of Marmara has been used. Approximate solutions have been obtained for the lineaments of the area related to the North Anatolian Fault from the characteristic images of the normalized derivatives and horizontal-derivative analytic signals.
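For a single profile, the analytic-signal amplitude is A(x) = sqrt((dT/dx)² + (dT/dz)²), where the vertical derivative can be obtained from the horizontal one via a Hilbert transform; maxima of A mark edges and lineaments. A NumPy sketch on a synthetic symmetric anomaly (the source position and depth are invented, and the Hilbert transform is built directly from the FFT):

```python
import numpy as np

def analytic_via_fft(h):
    """Analytic signal of a real series: zero negative, double positive freqs."""
    n = len(h)
    H = np.fft.fft(h)
    filt = np.zeros(n)
    filt[0] = 1.0
    filt[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        filt[n // 2] = 1.0
    return np.fft.ifft(H * filt)

x = np.linspace(-50.0, 50.0, 1001)      # profile coordinate (km, synthetic)
x0, z0 = 5.0, 3.0                       # invented source position and depth
T = z0 / ((x - x0) ** 2 + z0 ** 2)      # symmetric synthetic anomaly profile

dT_dx = np.gradient(T, x)
dT_dz = np.imag(analytic_via_fft(dT_dx))   # vertical derivative via Hilbert pair
A = np.hypot(dT_dx, dT_dz)                 # analytic-signal amplitude

x_peak = x[np.argmax(A)]                   # amplitude maximum marks the source
```

On a gridded map the same operation is applied profile by profile (or with 2D gradients), and the ridges of A trace the lineaments.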

  15. The use of different analytical techniques as a backup to mineral resources assessment

    International Nuclear Information System (INIS)

    Carvalho Tofani, P. de; Ferreira, M.P.; Gomes, H.; Avelar, M.M.

    1982-01-01

    Empresas Nucleares Brasileiras S.A. (NUCLEBRAS) has implemented and improved, since its foundation in 1974, several laboratories at the Centro de Desenvolvimento da Tecnologia Nuclear (CDTN) in Belo Horizonte (MG, Brazil), in order to develop capabilities in the field of analytical chemistry. Skilled personnel, using a large spectrum of equipment and procedures, are already able to determine, quickly and accurately, almost any chemical element in any matrix. About 340,000 analytical determinations have been performed during the last seven years, concerning mostly chemical elements of great importance in mineral technology programs. This considerable amount of results has been used especially as a backup to assess Brazilian uranium resources. (Author) [pt

  16. Analytic geometry

    CERN Document Server

    Burdette, A C

    1971-01-01

    Analytic Geometry covers several fundamental aspects of analytic geometry needed for advanced subjects, including calculus. This book is composed of 12 chapters that review the principles, concepts, and analytic proofs of geometric theorems, families of lines, the normal equation of the line, and related matters. Other chapters highlight the application of graphing, foci, directrices, eccentricity, and conic-related topics. The remaining chapters deal with the concepts of polar and rectangular coordinates, surfaces and curves, and planes. This book will prove useful to undergraduate trigonometric st…

  17. PROGRESSIVE DATA ANALYTICS IN HEALTH INFORMATICS USING AMAZON ELASTIC MAPREDUCE (EMR)

    Directory of Open Access Journals (Sweden)

    J S Shyam Mohan

    2016-04-01

    Identifying, diagnosing and treating cancer involves thorough investigation and the collection of big data from multiple and different sources, which is helpful for effective and quick decision making. Similarly, data analytics is used to find remedial actions for newly arriving diseases spread across multiple warehouses. Analytics can be performed on collected or available data from various data clusters that contain pieces of data. We provide a framework that enables effective decision making using Amazon EMR. Through various experiments done on different biological datasets, we reveal the advantages of the proposed model and present numerical results. These results indicate that the proposed framework can efficiently perform analytics over any biological dataset and obtain results in optimal time, thereby maintaining the quality of the result.

  18. Library analytics and metrics using data to drive decisions and services

    CERN Document Server

    2015-01-01

    This book will enable libraries to make informed decisions, develop new services and improve user experience by collecting, analysing and utilising data. With the wealth of data available to library and information services, analytics are the key to understanding your users and your field of operations better and improving the services that you offer. This book sets out the opportunities that analytics present to libraries, and provides inspiration for how they can use the data within their systems to help inform decisions and drive services. Using case studies to provide real-life examples of current developments and services, and packed full of practical advice and guidance for libraries looking to realise the value of their data, this will be an essential guide for librarians and information professionals. This volume will bring together a group of internationally recognised experts to explore some of the key issues in the exploitation of data analytics and metrics in the library and cultural heritage sect...

  19. Magnetic anomaly depth and structural index estimation using different height analytic signals data

    Science.gov (United States)

    Zhou, Shuai; Huang, Danian; Su, Chao

    2016-09-01

    This paper proposes a new semi-automatic inversion method for magnetic anomaly data interpretation that uses the combination of analytic signals of the anomaly at different heights to determine the depth and the structural index N of the sources. The new method utilizes analytic signals of the original anomaly at different heights to effectively suppress the noise contained in the anomaly. Compared with other high-order derivative calculation methods based on analytic signals, our method computes only first-order derivatives of the anomaly, which can be used to obtain more stable and accurate results. Tests on synthetic noise-free and noise-corrupted magnetic data indicate that the new method can estimate the depth and N efficiently. The technique is applied to a real measured magnetic anomaly in Southern Illinois caused by a known dike, and the result is in agreement with the drilling information and inversion results within acceptable calculation error.

  20. Cardiac chamber volumes by echocardiography using a new mathematical method: A promising technique for zero-G use

    Science.gov (United States)

    Buckey, J. C.; Beattie, J. M.; Gaffney, F. A.; Nixon, J. V.; Blomqvist, C. G.

    1984-01-01

Accurate, reproducible, and non-invasive means for ventricular volume determination are needed for evaluating cardiovascular function in zero gravity. Current echocardiographic methods, particularly for the right ventricle, suffer from a large standard error. A new mathematical approach, recently described by Watanabe et al., was tested on normal formalin-fixed human hearts suspended in a mineral oil bath. Volumes are estimated from multiple two-dimensional echocardiographic views recorded from a single point at sequential angles. The product of sectional cavity area and center of mass for each view, summed over the range of angles (using a trapezoidal rule), gives volume. Multiple (8-14) short-axis right ventricle and left ventricle views at 5.0 deg intervals were videotaped. The images were digitized by two independent observers (leading-edge to leading-edge technique) and analyzed using a graphics tablet and microcomputer. Actual volumes were determined by filling the chambers with water. These data were compared to the mean of the two echo measurements.
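The volume rule described here (sectional area times centroid distance, summed over view angles) can be checked on a simple shape. A minimal stdlib sketch, assuming equally spaced half-plane views over a full rotation; the unit-sphere test case is ours, not from the study:

```python
import math

def volume_from_angular_views(areas, centroid_dists, dtheta):
    """Watanabe-style estimate: dV = A(theta) * c(theta) * dtheta, summed
    over equally spaced half-plane views (for periodic sampling the
    rectangle and trapezoidal rules coincide)."""
    return dtheta * sum(a * c for a, c in zip(areas, centroid_dists))

# Check on a unit sphere centered on the rotation axis: every half-plane
# view cuts a half-disc of area pi/2 whose centroid lies 4/(3*pi) from
# the axis, so the angular sum should recover the true volume 4/3 * pi.
n_views = 36
areas = [math.pi / 2.0] * n_views
cents = [4.0 / (3.0 * math.pi)] * n_views
v = volume_from_angular_views(areas, cents, dtheta=2.0 * math.pi / n_views)
```

For this symmetric test case the estimate is exact; for real chambers the accuracy depends on how finely the angular range is sampled.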

  1. Identification of potentially safe promising fungal cell factories for the production of polyketide natural food colorants using chemotaxonomic rationale

    Directory of Open Access Journals (Sweden)

    Frisvad Jens C

    2009-04-01

The use of chemotaxonomic tools and a priori knowledge of fungal extrolites is a rational approach to selecting fungal polyketide pigment producers, considering the enormous chemical diversity and biodiversity of ascomycetous fungi. This rationale could be very useful for the selection of potentially safe fungal cell factories, not only for polyketide pigments but also for other industrially important polyketides whose molecular and genetic basis of biosynthesis has not yet been examined in detail. In addition, 4 out of the 10 chemotaxonomically selected promising Penicillium strains were shown to produce extracellular pigments in liquid media using a solid support, indicating future cell-factory possibilities for polyketide natural food colorants.

  2. A Meta-Analytic Review of School-Based Prevention for Cannabis Use

    Science.gov (United States)

    Porath-Waller, Amy J.; Beasley, Erin; Beirness, Douglas J.

    2010-01-01

    This investigation used meta-analytic techniques to evaluate the effectiveness of school-based prevention programming in reducing cannabis use among youth aged 12 to 19. It summarized the results from 15 studies published in peer-reviewed journals since 1999 and identified features that influenced program effectiveness. The results from the set of…

  3. 76 FR 41747 - Protection of Stratospheric Ozone: Extension of Global Laboratory and Analytical Use Exemption...

    Science.gov (United States)

    2011-07-15

    ... these laboratory procedures would be permitted. In the supply chain, ODS distributors would not be able... risks. H. Executive Order 13211: Actions That Significantly Affect Energy Supply, Distribution, or Use... laboratory and analytical uses that have not been already identified by EPA as nonessential. EPA is also...

  4. Understanding Customer Product Choices: A Case Study Using the Analytical Hierarchy Process

    Science.gov (United States)

    Robert L. Smith; Robert J. Bush; Daniel L. Schmoldt

    1996-01-01

    The Analytical Hierarchy Process (AHP) was used to characterize the bridge material selection decisions of highway officials across the United States. Understanding product choices by utilizing the AHP allowed us to develop strategies for increasing the use of timber in bridge construction. State Department of Transportation engineers, private consulting engineers, and...

  5. Using euhalophytes to understand salt tolerance and to develop saline agriculture: Suaeda salsa as a promising model.

    Science.gov (United States)

    Song, Jie; Wang, Baoshan

    2015-02-01

As important components in saline agriculture, halophytes can help to provide food for a growing world population. In addition to being potential crops in their own right, halophytes are also potential sources of salt-resistance genes that might help plant breeders and molecular biologists increase the salt tolerance of conventional crop plants. One especially promising halophyte is Suaeda salsa, a euhalophytic herb that occurs both on inland saline soils and in the intertidal zone. The species produces dimorphic seeds: black seeds are sensitive to salinity and remain dormant in light under high salt concentrations, while brown seeds can germinate under high salinity (e.g. 600 mM NaCl) regardless of light. Consequently, the species is useful for studying the mechanisms by which dimorphic seeds are adapted to saline environments. S. salsa has succulent leaves and is highly salt tolerant (e.g. its optimal NaCl concentration for growth is 200 mM). A series of S. salsa genes related to salt tolerance have been cloned and their functions tested: these include SsNHX1, SsHKT1, SsAPX, SsCAT1, SsP5CS and SsBADH. The species is economically important because its fresh branches have high value as a vegetable, and its seed oil is edible and rich in unsaturated fatty acids. Because it can remove salts and heavy metals from saline soils, S. salsa can also be used in the restoration of salinized or contaminated saline land. Because of its economic and ecological value in saline agriculture, S. salsa is one of the most important halophytes in China. In this review, the value of S. salsa as a source of food, medicine and forage is discussed. Its uses in the restoration of salinized or contaminated land and as a source of salt-resistance genes are also considered. © The Author 2014. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Multifunctional nanoparticles: Analytical prospects

    International Nuclear Information System (INIS)

    Dios, Alejandro Simon de; Diaz-Garcia, Marta Elena

    2010-01-01

Multifunctional nanoparticles are among the most exciting nanomaterials with promising applications in analytical chemistry. These applications include (bio)sensing, (bio)assays, catalysis and separations. Although most of these applications are based on the magnetic, optical and electrochemical properties of multifunctional nanoparticles, other aspects such as the synergistic effect of the functional groups and the amplification effect associated with the nanoscale dimension have also been observed. Considering not only the nature of the raw material but also the shape, there is a huge variety of nanoparticles. In this review, only magnetic nanoparticles, quantum dots, gold nanoparticles, carbon and inorganic nanotubes, as well as silica, titania and gadolinium oxide nanoparticles are addressed. This review presents a narrative summary on the use of multifunctional nanoparticles for analytical applications, along with a discussion on some critical challenges existing in the field and possible solutions that have been or are being developed to overcome these challenges.

  7. Experimental design and multiple response optimization. Using the desirability function in analytical methods development.

    Science.gov (United States)

    Candioti, Luciana Vera; De Zan, María M; Cámara, María S; Goicoechea, Héctor C

    2014-06-01

A review about the application of response surface methodology (RSM) when several responses have to be simultaneously optimized in the field of analytical methods development is presented. Several critical issues like response transformation, multiple response optimization and modeling with least squares and artificial neural networks are discussed. Most recent analytical applications are presented in the context of analytical methods development, especially in multiple response optimization procedures using the desirability function. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Navigating the Benford Labyrinth: A big-data analytic protocol illustrated using the academic library context

    Directory of Open Access Journals (Sweden)

    Michael Halperin

    2016-03-01

Objective: Big Data Analytics is a panoply of techniques whose principal intention is to ferret out dimensions or factors from certain data streamed or available over the WWW. We offer a subset or "second-stage" protocol of Big Data Analytics (BDA) that uses these dimensional datasets as benchmarks for profiling related data. We call this Specific Context Benchmarking (SCB). Method: In effecting this benchmarking objective, we have elected to use a Digital Frequency Profiling (DFP) technique based upon the work of Newcomb and Benford, who developed a profiling benchmark based upon the Log10 function. We illustrate the various stages of the SCB protocol using data produced by the Academic Research Libraries to enhance insights regarding the details of the operational benchmarking context, and so offer generalizations needed to encourage adoption of SCB across other functional domains. Results: An illustration of the SCB protocol is offered using the recently developed Benford Practical Profile as the Conformity Benchmarking Measure. Shareware: We have developed a Decision Support System called SpecificContextAnalytics (SCA:DSS) to create the various information sets presented in this paper. The SCA:DSS, programmed in Excel VBA, is available from the corresponding author as a free download without restriction on its use. Conclusions: We note that SCB effected using the DFPs is an enhancement to, not a replacement for, the usual statistical and analytic techniques, and fits very well in the BDA milieu.
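The Newcomb-Benford benchmark underlying DFP assigns leading digit d the expected share log10(1 + 1/d). A minimal stdlib sketch of a digit-frequency profile plus a simple conformity measure (mean absolute deviation); the test data are hypothetical, not the library dataset used in the paper:

```python
import math
from collections import Counter

def benford_expected(d):
    # Newcomb-Benford expected share of leading digit d (1..9)
    return math.log10(1 + 1 / d)

def leading_digit(x):
    x = abs(x)
    while x >= 10:
        x /= 10
    while x < 1:
        x *= 10
    return int(x)

def digit_profile(values):
    # Empirical leading-digit frequencies (zeros are skipped)
    counts = Counter(leading_digit(v) for v in values if v)
    n = sum(counts.values())
    return {d: counts.get(d, 0) / n for d in range(1, 10)}

def mad(profile):
    # Mean absolute deviation from Benford as a crude conformity measure
    return sum(abs(profile[d] - benford_expected(d)) for d in range(1, 10)) / 9

data = [2.0 ** k for k in range(1, 200)]   # powers of 2 are famously Benford-like
profile = digit_profile(data)
```

A dataset's profile can then be compared against the Benford expectation (or any empirical benchmark profile) digit by digit, which is the essence of the benchmarking step described above.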

  9. Land-use regime shifts: an analytical framework and agenda for future land-use research

    Directory of Open Access Journals (Sweden)

    Navin Ramankutty

    2016-06-01

A key research frontier in global change research lies in understanding processes of land change to inform predictive models of future land states. We believe that significant advances in the field are hampered by limited attention being paid to critical points of change termed land-use regime shifts. We present an analytical framework for understanding land-use regime shifts. We survey historical events of land change and perform in-depth case studies of soy and shrimp development in Latin America to demonstrate the role of preconditions, triggers, and self-reinforcing processes in driving land-use regime shifts. Whereas the land-use literature demonstrates a good understanding of within-regime dynamics, our understanding of the drivers of land-use regime shifts is limited to ex post facto explications. Theoretical and empirical advances are needed to better understand the dynamics and implications of land-use regime shifts. We draw insights from the regime-shifts literature to propose a research agenda for studying land change.

  10. Behavioural effects of advanced cruise control use : a meta-analytic approach.

    NARCIS (Netherlands)

Dragutinovic, N., Brookhuis, K.A., Hagenzieker, M.P., & Marchau, V.A.W.J.

    2006-01-01

    In this study, a meta-analytic approach was used to analyse effects of Advanced Cruise Control (ACC) on driving behaviour reported in seven driving simulator studies. The effects of ACC on three consistent outcome measures, namely, driving speed, headway and driver workload have been analysed. The

  11. Using Photocatalytic Oxidation and Analytic Techniques to Remediate Lab Wastewater Containing Methanol

    Science.gov (United States)

    Xiong, Qing; Luo, Mingliang; Bao, Xiaoming; Deng, Yurong; Qin, Song; Pu, Xuemei

    2018-01-01

This experiment is dedicated to second-year and above undergraduates who are in their experimental session of the analytical chemistry course. Grouped students are required to use a TiO2 photocatalytic oxidation process to treat the methanol-containing wastewater that resulted from their previous HPLC experiments. Students learn to…

  12. Exploring maintenance policy selection using the Analytic Hierarchy Process : an application for naval ships

    NARCIS (Netherlands)

    Goossens, A.J.M.; Basten, R.J.I.

    2015-01-01

    In this paper we investigate maintenance policy selection (MPS) through the use of the Analytic Hierarchy Process (AHP). A maintenance policy is a policy that dictates which parameter triggers a maintenance action. In practice, selecting the right maintenance policy appears to be a difficult

  13. Evaluating the Effectiveness of the Chemistry Education by Using the Analytic Hierarchy Process

    Science.gov (United States)

    Yüksel, Mehmet

    2012-01-01

    In this study, an attempt was made to develop a method of measurement and evaluation aimed at overcoming the difficulties encountered in the determination of the effectiveness of chemistry education based on the goals of chemistry education. An Analytic Hierarchy Process (AHP), which is a multi-criteria decision technique, is used in the present…

  14. Orthogonal Higher Order Structure of the WISC-IV Spanish Using Hierarchical Exploratory Factor Analytic Procedures

    Science.gov (United States)

    McGill, Ryan J.; Canivez, Gary L.

    2016-01-01

    As recommended by Carroll, the present study examined the factor structure of the Wechsler Intelligence Scale for Children-Fourth Edition Spanish (WISC-IV Spanish) normative sample using higher order exploratory factor analytic techniques not included in the WISC-IV Spanish Technical Manual. Results indicated that the WISC-IV Spanish subtests were…

  15. Using the Analytic Hierarchy Process for Decision-Making in Ecosystem Management

    Science.gov (United States)

    Daniel L. Schmoldt; David L. Peterson

    1997-01-01

    Land management activities on public lands combine multiple objectives in order to create a plan of action over a finite time horizon. Because management activities are constrained by time and money, it is critical to make the best use of available agency resources. The Analytic Hierarchy Process (AHP) offers a structure for multi-objective decisionmaking so that...
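As an illustration of how the AHP turns pairwise judgments into priorities, here is a minimal numpy sketch; the three criteria and the comparison values are hypothetical, not taken from the study:

```python
import numpy as np

# Pairwise comparison matrix for three hypothetical management criteria
# (e.g. cost, ecological benefit, public access); a_ij says how much more
# important criterion i is than j on Saaty's 1-9 scale, with a_ji = 1/a_ij.
A = np.array([
    [1.0, 3.0, 5.0],
    [1.0 / 3.0, 1.0, 2.0],
    [1.0 / 5.0, 1.0 / 2.0, 1.0],
])

# Priorities are the normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()

# Consistency check: CI = (lambda_max - n)/(n - 1), and CR = CI/RI,
# where RI = 0.58 is the random index for n = 3 from Saaty's table.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58   # CR < 0.1 is the conventional acceptance threshold
```

The resulting weight vector w ranks the criteria; in a full AHP the same procedure is repeated one level down to score alternatives against each criterion.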

  16. 21 CFR 809.30 - Restrictions on the sale, distribution and use of analyte specific reagents.

    Science.gov (United States)

    2010-04-01

    ...; (2) Clinical laboratories regulated under the Clinical Laboratory Improvement Amendments of 1988 (CLIA), as qualified to perform high complexity testing under 42 CFR part 493 or clinical laboratories... analytical or clinical performance. (e) The laboratory that develops an in-house test using the ASR shall...

  17. Using Learning Analytics to Predict (and Improve) Student Success: A Faculty Perspective

    Science.gov (United States)

    Dietz-Uhler, Beth; Hurn, Janet E.

    2013-01-01

    Learning analytics is receiving increased attention, in part because it offers to assist educational institutions in increasing student retention, improving student success, and easing the burden of accountability. Although these large-scale issues are worthy of consideration, faculty might also be interested in how they can use learning analytics…

  18. The Use of the Analytic Hierarchy Process to Aid Decision Making in Acquired Equinovarus Deformity

    NARCIS (Netherlands)

    van Til, Janine Astrid; Renzenbrink, G.J.; Dolan, J.G.; IJzerman, Maarten Joost

    2008-01-01

    Objective: To increase the transparency of decision making about treatment in patients with equinovarus deformity poststroke. - Design: The analytic hierarchy process (AHP) was used as a structured methodology to study the subjective rationale behind choice of treatment. - Setting: An 8-hour meeting

  19. Developing automated analytical methods for scientific environments using LabVIEW.

    Science.gov (United States)

    Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard

    2010-01-15

The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software. Yet, it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program are demonstrated by using it to control both a sequential injection analysis - capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.

  20. Pre-analytical and analytical variation of drug determination in segmented hair using ultra-performance liquid chromatography-tandem mass spectrometry

    DEFF Research Database (Denmark)

    Nielsen, Marie Katrine Klose; Johansen, Sys Stybe; Linnet, Kristian

    2014-01-01

    variation was estimated to be less than 15% for almost all compounds. The method was successfully applied to 25 segmented hair specimens from deceased drug addicts showing a broad pattern of poly-drug use. The pre-analytical sampling variation was estimated from the genuine duplicate measurements of two...

On the application of analytical transformation systems using a computer for Feynman integral calculation

    International Nuclear Information System (INIS)

    Gerdt, V.P.

    1978-01-01

Various systems of analytic transformations for the calculation of Feynman integrals using computers are discussed. The hyperspheric technique, which is used to calculate Feynman integrals, makes it possible to perform the angular integration for a set of diagrams, thus reducing the multiplicity of the integral. All calculations based on this method are made with the ASHMEDAL program. Feynman integrals are calculated in Euclidean space using integration by parts and some differential identities. Analytic calculation of Feynman integrals is performed by the MACSYMA system. The dispersion method of integral calculation is implemented in the SCHOONSCHIP system, and calculations based on features of the Nielsen function are made using the efficient SINAC and RSIN programs. A table of basic Feynman integral parameters calculated using the above techniques is given.

  2. assessment of concentration of air pollutants using analytical and numerical solution of the atmospheric diffusion equation

    International Nuclear Information System (INIS)

    Esmail, S.F.H.

    2011-01-01

The mathematical formulation of numerous physical problems results in differential equations, either partial or ordinary. In our study we are interested in solutions of partial differential equations. The aim of this work is to calculate the concentrations of pollution by solving the atmospheric diffusion equation (ADE) using different mathematical methods of solution. It is difficult to solve the general form of the ADE analytically, so we use some assumptions to obtain its solution. The solutions depend on the eddy diffusivity profiles (k) and the wind speed (u), and we use some physical assumptions to simplify the equation and solve it. In the present work, we solve the ADE analytically in three dimensions using Green's function method, the Laplace transform method, the normal mode method and the separation of variables method. We also use the ADM as a numerical method. Finally, comparisons are made between the results predicted by these methods and the observed data.
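One classic closed-form solution of the ADE under the usual simplifying assumptions (constant wind speed u, Gaussian dispersion, total reflection at the ground) is the Gaussian plume formula. A minimal stdlib sketch with hypothetical parameter values, not those of the study:

```python
import math

def gaussian_plume(q, u, y, z, sigma_y, sigma_z, h):
    """Steady-state concentration downwind of a continuous point source:
    C = q/(2*pi*u*sy*sz) * exp(-y^2/(2 sy^2))
        * [exp(-(z-h)^2/(2 sz^2)) + exp(-(z+h)^2/(2 sz^2))],
    where the image term models total reflection at the ground z = 0."""
    lateral = math.exp(-y ** 2 / (2.0 * sigma_y ** 2))
    vertical = (math.exp(-(z - h) ** 2 / (2.0 * sigma_z ** 2))
                + math.exp(-(z + h) ** 2 / (2.0 * sigma_z ** 2)))
    return q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Hypothetical values: unit source strength, 2 m/s wind, 20 m stack height,
# dispersion parameters fixed at one downwind distance.
c_center = gaussian_plume(q=1.0, u=2.0, y=0.0, z=0.0, sigma_y=10.0, sigma_z=5.0, h=20.0)
c_offset = gaussian_plume(q=1.0, u=2.0, y=5.0, z=0.0, sigma_y=10.0, sigma_z=5.0, h=20.0)
```

The concentration is symmetric in y and maximal on the plume centerline, which is a quick sanity check on any analytical or numerical ADE solution.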

  3. Configuration and validation of an analytical model predicting secondary neutron radiation in proton therapy using Monte Carlo simulations and experimental measurements.

    Science.gov (United States)

    Farah, J; Bonfrate, A; De Marzi, L; De Oliveira, A; Delacroix, S; Martinetti, F; Trompier, F; Clairand, I

    2015-05-01

    This study focuses on the configuration and validation of an analytical model predicting leakage neutron doses in proton therapy. Using Monte Carlo (MC) calculations, a facility-specific analytical model was built to reproduce out-of-field neutron doses while separately accounting for the contribution of intra-nuclear cascade, evaporation, epithermal and thermal neutrons. This model was first trained to reproduce in-water neutron absorbed doses and in-air neutron ambient dose equivalents, H*(10), calculated using MCNPX. Its capacity in predicting out-of-field doses at any position not involved in the training phase was also checked. The model was next expanded to enable a full 3D mapping of H*(10) inside the treatment room, tested in a clinically relevant configuration and finally consolidated with experimental measurements. Following the literature approach, the work first proved that it is possible to build a facility-specific analytical model that efficiently reproduces in-water neutron doses and in-air H*(10) values with a maximum difference less than 25%. In addition, the analytical model succeeded in predicting out-of-field neutron doses in the lateral and vertical direction. Testing the analytical model in clinical configurations proved the need to separate the contribution of internal and external neutrons. The impact of modulation width on stray neutrons was found to be easily adjustable while beam collimation remains a challenging issue. Finally, the model performance agreed with experimental measurements with satisfactory results considering measurement and simulation uncertainties. Analytical models represent a promising solution that substitutes for time-consuming MC calculations when assessing doses to healthy organs. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  4. USE OF BIG DATA ANALYTICS FOR CUSTOMER RELATIONSHIP MANAGEMENT: POINT OF PARITY OR SOURCE OF COMPETITIVE ADVANTAGE?

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas; Zablah, Alex R.; Straub, Detmar W.

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (CA use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: 1. What are the key antecedents of big data customer analytics use? 2. How, and to what extent, does big data...

  5. Hydraulic modeling of riverbank filtration systems with curved boundaries using analytic elements and series solutions

    Science.gov (United States)

    Bakker, Mark

    2010-08-01

    A new analytic solution approach is presented for the modeling of steady flow to pumping wells near rivers in strip aquifers; all boundaries of the river and strip aquifer may be curved. The river penetrates the aquifer only partially and has a leaky stream bed. The water level in the river may vary spatially. Flow in the aquifer below the river is semi-confined while flow in the aquifer adjacent to the river is confined or unconfined and may be subject to areal recharge. Analytic solutions are obtained through superposition of analytic elements and Fourier series. Boundary conditions are specified at collocation points along the boundaries. The number of collocation points is larger than the number of coefficients in the Fourier series and a solution is obtained in the least squares sense. The solution is analytic while boundary conditions are met approximately. Very accurate solutions are obtained when enough terms are used in the series. Several examples are presented for domains with straight and curved boundaries, including a well pumping near a meandering river with a varying water level. The area of the river bottom where water infiltrates into the aquifer is delineated and the fraction of river water in the well water is computed for several cases.
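The least-squares collocation idea (more boundary points than series coefficients) can be sketched in a few lines of numpy; the prescribed head along the boundary below is a made-up condition, not one of the paper's examples:

```python
import numpy as np

# Prescribed head along a circular boundary (hypothetical boundary condition)
theta = np.linspace(0.0, 2.0 * np.pi, 80, endpoint=False)   # 80 collocation points
h_bc = (1.0 + 0.3 * np.cos(theta)
        - 0.1 * np.sin(2.0 * theta) + 0.05 * np.cos(3.0 * theta))

# Truncated Fourier series: far fewer coefficients (11) than collocation points (80)
N = 5
cols = [np.ones_like(theta)]
for m in range(1, N + 1):
    cols += [np.cos(m * theta), np.sin(m * theta)]
M = np.column_stack(cols)

# Over-determined system solved in the least-squares sense, as in the paper
coef, *_ = np.linalg.lstsq(M, h_bc, rcond=None)
residual = float(np.max(np.abs(M @ coef - h_bc)))   # boundary-condition misfit
```

The solution is analytic everywhere while the boundary condition is met only approximately; adding Fourier terms drives the collocation residual down, exactly the convergence behaviour the paper reports.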

  6. Water Use Efficiency and Water Deficit Tolerance Indices in Terminal Growth Stages in Promising Bread Wheat genotypes

    Directory of Open Access Journals (Sweden)

    M. Nazeri

    2016-02-01

Introduction: During the growth stages of wheat, anthesis and the grain-filling period are the most susceptible to drought. Wheat cultivars that are more tolerant to terminal drought are better suited to Mediterranean conditions. To increase water use efficiency, the target environment should be taken into account, because one trait might be effective in one environment but ineffective in another. In general, traits such as early vigour and root absorption capacity are important under water-deficient conditions. In recent years, increases in grain yield have come from increases in grain number. Although both source and sink are considered limiting factors for grain yield in old cultivars, in new cultivars the sink seems to be more important. In fact, phenological adjustment adapted to the seasonal precipitation pattern can improve water use efficiency under drought conditions. A suitable flowering time is the trait most strongly correlated with increased water use efficiency under drought. Materials and Methods: To evaluate the level of drought tolerance in promising bread wheat lines, a split-plot experiment in a randomized complete block design with three replications was carried out in the 2008-09 and 2009-10 growing seasons at the Torogh Agricultural Research Field Station, Mashhad. Water-limited conditions at three levels were assigned to the main plots: optimum moisture (L1), irrigation withheld under a rain shelter from the milky grain stage to maturity (L2), and irrigation withheld under a rain shelter from anthesis to maturity (L3). Ten bread wheat lines suitable for cold and dry regions (V1: Toos, V2: C-81-10, V3: Pishgam, V4: C-84-4, V5: C-84-8, V6: C-D-85-15, V7: C-D-85-9, V8: C-D-84-5502, V9: C-D-85-5502 and V10: C-85-6) were randomized in the sub-plots.
The stress susceptibility index (SSI), stress tolerance index (STI) and tolerance (TOL) were calculated using the following equations: D = 1
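The record truncates the index equations, but SSI, STI and TOL are commonly defined as below (Fischer and Maurer's SSI with stress intensity D = 1 − Ȳs/Ȳp, which matches the truncated "D = 1" above; Fernandez's STI; and TOL as the yield gap). A stdlib sketch with hypothetical yields, offered as the likely intended formulas rather than a transcription of the paper's:

```python
# Yp, Ys: a genotype's yield under optimal and stress conditions;
# Yp_bar, Ys_bar: mean yields of all genotypes under the two conditions.

def tol(yp, ys):
    """Tolerance: absolute yield loss under stress."""
    return yp - ys

def sti(yp, ys, yp_bar):
    """Stress tolerance index (Fernandez): (Yp * Ys) / Yp_bar^2."""
    return (yp * ys) / yp_bar ** 2

def ssi(yp, ys, yp_bar, ys_bar):
    """Stress susceptibility index (Fischer & Maurer):
    (1 - Ys/Yp) / D, with stress intensity D = 1 - Ys_bar/Yp_bar."""
    d = 1.0 - ys_bar / yp_bar
    return (1.0 - ys / yp) / d

# Hypothetical yields (t/ha) for one line and the trial means
yp, ys, yp_bar, ys_bar = 6.0, 4.0, 5.0, 3.5
```

SSI below 1 flags lines losing proportionally less yield than the trial average; higher STI favours lines yielding well under both regimes.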

Business Strategy Formulation by Shareholders and Company Management Using the Analytical Network Process (ANP)

    Directory of Open Access Journals (Sweden)

    Faizal Faizal

    2016-11-01

This research aimed to identify the business strategy formulation of the shareholders and the management of the company. Ten companies were selected as the objects of this research: information technology, telecommunication, printing, mining, construction and chemical companies in Indonesia. The research was conducted using the Analytical Network Process (ANP), with the chosen respondents serving as the decision makers (experts) of those companies. The respondents were chosen using the non-probability sampling method. The results show that the role of company management is considered more influential (0.57143) than the role of the shareholders (0.28571). From the output of the stakeholders' condition, the highest-priority strategies are differentiation (0.600515), cost leadership (0.230754) and focus (0.168731).

  8. Promising change, delivering continuity

    DEFF Research Database (Denmark)

    Lund, Jens Friis; Sungusia, Eliezeri; Mabele, Mathew Bukhi

    2017-01-01

REDD+ is an ambition to reduce carbon emissions from deforestation and forest degradation in the Global South. This ambition has generated unprecedented commitment of political support and financial funds for the forest-development sector. Many academics and people-centered advocacy organizations have conceptualized REDD+ as an example of "green grabbing" and have voiced fears of a potential global rush for land and trees. In this paper we argue that, in practice and up until now, REDD+ resembles longstanding dynamics of the development and conservation industry, where the promise of change becomes a discursive commodity that is constantly reproduced and used to generate value and appropriate financial resources. We thus argue for a re-conceptualization of REDD+ as a conservation fad within the broader political economy of development and conservation. We derive this argument from a study

  9. Quantification of process induced disorder in milled samples using different analytical techniques

    DEFF Research Database (Denmark)

    Zimper, Ulrike; Aaltonen, Jaakko; McGoverin, Cushla M.

    2012-01-01

The aim of this study was to compare three different analytical methods for detecting and quantifying the amount of crystalline disorder/amorphousness in two milled model drugs. X-ray powder diffraction (XRPD), differential scanning calorimetry (DSC) and Raman spectroscopy were used as the analytical methods, and indomethacin and simvastatin were chosen as the model compounds. These compounds partly converted from crystalline to disordered forms by milling. Partial least squares regression (PLS) was used to create calibration models for the XRPD and Raman data, which were subsequently used to quantify the milling-induced crystalline disorder/amorphousness under different process conditions. In the DSC measurements, the change in heat capacity at the glass transition was used for quantification. Differently prepared amorphous indomethacin standards (prepared by either melt quench cooling or cryo-milling) were compared

  10. Keeping the Promise

    Science.gov (United States)

    Whissemore, Tabitha

    2016-01-01

    Since its launch in September 2015, Heads Up America has collected information on nearly 125 promise programs across the country, many of which were instituted long before President Barack Obama announced the America's College Promise (ACP) plan in 2015. At least 27 new free community college programs have launched in states, communities, and at…

  11. An analytical approach to characterize morbidity profile dissimilarity between distinct cohorts using electronic medical records

    OpenAIRE

    Schildcrout, Jonathan S.; Basford, Melissa A.; Pulley, Jill M.; Masys, Daniel R.; Roden, Dan M.; Wang, Deede; Chute, Christopher G.; Kullo, Iftikhar J.; Carrell, David; Peissig, Peggy; Kho, Abel; Denny, Joshua C.

    2010-01-01

    We describe a two-stage analytical approach for characterizing morbidity profile dissimilarity among patient cohorts using electronic medical records. We capture morbidities using the International Statistical Classification of Diseases and Related Health Problems (ICD-9) codes. In the first stage of the approach separate logistic regression analyses for ICD-9 sections (e.g., “hypertensive disease” or “appendicitis”) are conducted, and the odds ratios that describe adjusted differences in pre...

  12. Big data analytics for the virtual network topology reconfiguration use case

    OpenAIRE

    Gifre Renom, Lluís; Morales Alcaide, Fernando; Velasco Esteban, Luis Domingo; Ruiz Ramírez, Marc

    2016-01-01

    ABNO's OAM Handler is extended with big data analytics capabilities to anticipate traffic changes in volume and direction. Predicted traffic is used to trigger virtual network topology re-optimization. When the virtual topology needs to be reconfigured, predicted and current traffic matrices are used to find the optimal topology. A heuristic algorithm to adapt current virtual topology to meet both actual demands and expected traffic matrix is proposed. Experimental assessment is carried ou...

  13. Some questions of using coding theory and analytical calculation methods on computers

    International Nuclear Information System (INIS)

    Nikityuk, N.M.

    1987-01-01

    Main results of investigations devoted to the application of the theory and practice of correcting codes are presented. These results are used to create very fast units for the selection of events registered in multichannel detectors of nuclear particles. Using this theory and analytical calculations on computers, essentially new combination devices, for example parallel encoders, have been developed. Questions concerning the creation of a new algorithm for the calculation of digital functions by computers and problems of devising universal, dynamically reprogrammable logic modules are discussed

  14. Using Learning Analytics to Understand the Design of an Intelligent Language Tutor – Chatbot Lucy

    OpenAIRE

    Yi Fei Wang; Stephen Petrina

    2013-01-01

    The goal of this article is to explore how learning analytics can be used to predict and advise the design of an intelligent language tutor, chatbot Lucy. With its focus on using student-produced data to understand the design of Lucy to assist English language learning, this research can be a valuable component for language-learning designers to improve second language acquisition. In this article, we present students' learning journey and data trails, the chatting log architecture and result...

  15. An Analytic Glossary to Social Inquiry Using Institutional and Political Activist Ethnography

    Directory of Open Access Journals (Sweden)

    Laura Bisaillon PhD

    2012-12-01

    Full Text Available This analytic glossary, composed of 52 terms, is a practical reference and working tool for persons preparing to conduct theoretically informed qualitative social science research drawing from institutional and political activist ethnography. Researchers using these approaches examine social problems and move beyond interpretation by explicating how these problems are organized and what social and ruling relations coordinate them. Political activist ethnography emerges from, and extends, institutional ethnography by producing knowledge explicitly for activism and social movement organizing ends. The assemblage of vocabulary and ideas in this word list is new, and builds on existing methodological resources. This glossary offers an extensive, analytic, and challenging inventory of language that brings together terms from these ethnographic approaches with shared ancestry. This compilation is designed to serve as an accessible “one-stop-shop” resource for persons using or contemplating using institutional and political activist ethnography in their research and/or activist projects.

  16. A multicenter nationwide reference intervals study for common biochemical analytes in Turkey using Abbott analyzers.

    Science.gov (United States)

    Ozarda, Yesim; Ichihara, Kiyoshi; Aslan, Diler; Aybek, Hulya; Ari, Zeki; Taneli, Fatma; Coker, Canan; Akan, Pinar; Sisman, Ali Riza; Bahceci, Onur; Sezgin, Nurzen; Demir, Meltem; Yucel, Gultekin; Akbas, Halide; Ozdem, Sebahat; Polat, Gurbuz; Erbagci, Ayse Binnur; Orkmez, Mustafa; Mete, Nuriye; Evliyaoglu, Osman; Kiyici, Aysel; Vatansev, Husamettin; Ozturk, Bahadir; Yucel, Dogan; Kayaalp, Damla; Dogan, Kubra; Pinar, Asli; Gurbilek, Mehmet; Cetinkaya, Cigdem Damla; Akin, Okhan; Serdar, Muhittin; Kurt, Ismail; Erdinc, Selda; Kadicesme, Ozgur; Ilhan, Necip; Atali, Dilek Sadak; Bakan, Ebubekir; Polat, Harun; Noyan, Tevfik; Can, Murat; Bedir, Abdulkerim; Okuyucu, Ali; Deger, Orhan; Agac, Suret; Ademoglu, Evin; Kaya, Ayşem; Nogay, Turkan; Eren, Nezaket; Dirican, Melahat; Tuncer, GulOzlem; Aykus, Mehmet; Gunes, Yeliz; Ozmen, Sevda Unalli; Kawano, Reo; Tezcan, Sehavet; Demirpence, Ozlem; Degirmen, Elif

    2014-12-01

    A nationwide multicenter study was organized to establish reference intervals (RIs) in the Turkish population for 25 commonly tested biochemical analytes and to explore sources of variation in reference values, including regionality. Blood samples were collected nationwide in 28 laboratories from the seven regions (≥400 samples/region, 3066 in all). The sera were collectively analyzed in Uludag University in Bursa using Abbott reagents and analyzer. Reference materials were used for standardization of test results. After secondary exclusion using the latent abnormal values exclusion method, RIs were derived by a parametric method employing the modified Box-Cox formula and compared with the RIs by the non-parametric method. Three-level nested ANOVA was used to evaluate variations among sexes, ages and regions. Associations between test results and age, body mass index (BMI) and region were determined by multiple regression analysis (MRA). By ANOVA, differences of reference values among seven regions were significant in none of the 25 analytes. Significant sex-related and age-related differences were observed for 10 and seven analytes, respectively. MRA revealed BMI-related changes in results for uric acid, glucose, triglycerides, high-density lipoprotein (HDL)-cholesterol, alanine aminotransferase, and γ-glutamyltransferase. Their RIs were thus derived by applying stricter criteria excluding individuals with BMI >28 kg/m2. Ranges of RIs by non-parametric method were wider than those by parametric method especially for those analytes affected by BMI. With the lack of regional differences and the well-standardized status of test results, the RIs derived from this nationwide study can be used for the entire Turkish population.
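    The parametric derivation described above (a Box-Cox transformation followed by a central 95% interval taken in the transformed space) can be sketched as follows. This is a minimal stdlib-only illustration, not the study's actual pipeline: the function names, the grid search for the Box-Cox parameter, and the synthetic data are assumptions, and the latent abnormal values exclusion step is omitted.

    ```python
    import math
    import random
    import statistics

    def boxcox(x, lam):
        """Box-Cox transform of a positive value x for parameter lam."""
        return math.log(x) if lam == 0 else (x ** lam - 1.0) / lam

    def boxcox_inverse(y, lam):
        return math.exp(y) if lam == 0 else (lam * y + 1.0) ** (1.0 / lam)

    def boxcox_loglik(data, lam):
        """Profile log-likelihood of the Box-Cox parameter (normality assumed)."""
        n = len(data)
        t = [boxcox(x, lam) for x in data]
        var = statistics.pvariance(t)
        return -0.5 * n * math.log(var) + (lam - 1.0) * sum(math.log(x) for x in data)

    def parametric_reference_interval(data, z=1.96):
        # Grid-search the Box-Cox parameter, then take mean +/- z*SD in
        # transformed space and back-transform the limits.
        lams = [i / 10.0 for i in range(-20, 21)]
        lam = max(lams, key=lambda L: boxcox_loglik(data, L))
        t = [boxcox(x, lam) for x in data]
        m, s = statistics.mean(t), statistics.stdev(t)
        return boxcox_inverse(m - z * s, lam), boxcox_inverse(m + z * s, lam)

    random.seed(1)
    # Skewed synthetic "analyte" values (triglyceride-like, mg/dL scale).
    sample = [math.exp(random.gauss(4.5, 0.35)) for _ in range(500)]
    lo, hi = parametric_reference_interval(sample)
    print(round(lo, 1), round(hi, 1))
    ```

    For a skewed sample like this, the fitted Box-Cox parameter lands near zero (a log transform), and the back-transformed limits bracket roughly the central 95% of the data.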

  17. AI based HealthCare Platform for Real Time, Predictive and Prescriptive Analytics using Reactive Programming

    Science.gov (United States)

    Kaur, Jagreet; Singh Mann, Kulwinder, Dr.

    2018-01-01

    AI in healthcare needs to bring real, actionable, individualized insights in real time to patients and doctors to support treatment decisions. We need a patient-centred platform for integrating EHR data, patient data, prescriptions, monitoring, and clinical research data. This paper proposes a generic architecture for an AI-based healthcare analytics platform built on open-source technologies: Apache Beam, Apache Flink, Apache Spark, Apache NiFi, Kafka, Tachyon, GlusterFS, and the NoSQL stores Elasticsearch and Cassandra. This paper shows the importance of applying AI-based predictive and prescriptive analytics techniques in the health sector. The system will be able to extract useful knowledge that helps in decision making and medical monitoring in real time through intelligent process analysis and big data processing.

  18. Solution of the isotopic depletion equation using decomposition method and analytical solution

    Energy Technology Data Exchange (ETDEWEB)

    Prata, Fabiano S.; Silva, Fernando C.; Martinez, Aquilino S., E-mail: fprata@con.ufrj.br, E-mail: fernando@con.ufrj.br, E-mail: aquilino@lmp.ufrj.br [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), RJ (Brazil). Programa de Engenharia Nuclear

    2011-07-01

    In this paper an analytical calculation of the isotopic depletion equations is proposed, featuring a chain of major isotopes found in a typical PWR reactor. Part of this chain allows feedback reactions of (n,2n) type. The method is based on decoupling the equations describing feedback from the rest of the chain by using the decomposition method, with analytical solutions for the other isotopes present in the chain. The method was implemented in a PWR reactor simulation code, that makes use of the nodal expansion method (NEM) to solve the neutron diffusion equation, describing the spatial distribution of neutron flux inside the reactor core. Because isotopic depletion calculation module is the most computationally intensive process within simulation systems of nuclear reactor core, it is justified to look for a method that is both efficient and fast, with the objective of evaluating a larger number of core configurations in a short amount of time. (author)
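    The analytical treatment of a depletion chain can be illustrated with the classical Bateman solution for a two-member chain; the paper's actual method (decomposition of the (n,2n) feedback terms) is more involved. The removal constants and initial density below are arbitrary illustrative values.

    ```python
    import math

    def bateman_two(n1_0, lam1, lam2, t):
        """Analytical solution of a two-member depletion chain
        dN1/dt = -lam1*N1 ;  dN2/dt = lam1*N1 - lam2*N2,  N2(0) = 0."""
        n1 = n1_0 * math.exp(-lam1 * t)
        n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
        return n1, n2

    # Effective removal constants (decay + absorption), illustrative units.
    lam1, lam2 = 1e-4, 5e-4
    n1, n2 = bateman_two(1.0e20, lam1, lam2, t=3600.0)
    print(n1, n2)
    ```

    Because the solution is closed-form, it is evaluated at any burnup step in constant time, which is the motivation the abstract gives for preferring analytical over purely numerical depletion.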

  19. Analytical quality assurance in laboratories using tracers for biological and environmental studies

    International Nuclear Information System (INIS)

    Melaj, Mariana; Martin, Olga; Lopez, Silvia; Rojas de Tramontini, Susana

    1999-01-01

    This work describes the way we are organizing a quality assurance system for analytical measurements of the 14N/15N ratio in biological and soil material. The 14N/15N ratio is measured with an optical emission spectrometer (NOI6PC), which distinguishes the differences in wavelength of the electromagnetic radiation emitted by N-28, N-29 and N-30. The major problem is 'cross contamination' of samples with different enrichments. The elements considered necessary for satisfactory analytical results are: 1) a proper working area; 2) homogeneous samples that represent the whole sampled system; 3) the use of reference materials (a known reference sample is added to each digestion); 4) adequate equipment operation; 5) standard operating procedures; 6) control charts and laboratory and equipment books (every operation using the equipment is registered in a book); 7) training of the operators. (author)
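    For context, measurements on N2 yield three molecular species (masses 28, 29 and 30), and the 15N atom fraction follows from their intensities by a standard relation. This is a generic textbook sketch, not necessarily the NOI6PC instrument's own data-reduction procedure.

    ```python
    def atom_percent_15n(i28, i29, i30):
        """Atom% 15N from intensities of the N2 species 14N14N (mass 28),
        14N15N (29) and 15N15N (30): each molecule carries two N atoms,
        of which the mass-29 species has one 15N and mass-30 has two."""
        total_atoms = 2.0 * (i28 + i29 + i30)
        return 100.0 * (i29 + 2.0 * i30) / total_atoms

    # With random isotopic pairing at atom fraction p, the peak pattern
    # is binomial: I28 : I29 : I30 = (1-p)^2 : 2p(1-p) : p^2.
    p = 0.05  # 5 atom% 15N tracer enrichment (illustrative)
    i28, i29, i30 = (1 - p) ** 2, 2 * p * (1 - p), p ** 2
    print(atom_percent_15n(i28, i29, i30))  # → 5.0 (within float rounding)
    ```

    Cross contamination shows up directly in this arithmetic: carry-over of a highly enriched sample inflates I29 and I30 and biases the computed atom% upward.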

  20. Analytical SN solutions in heterogeneous slabs using symbolic algebra computer programs

    International Nuclear Information System (INIS)

    Warsa, J.S.

    2002-01-01

    A modern symbolic algebra computer program, MAPLE, is used to compute solutions to the well-known analytical discrete ordinates, or SN, solutions in one-dimensional slab geometry. Symbolic algebra programs compute the solutions with arbitrary precision and are free of spatial discretization error, so they can be used to investigate new discretizations for one-dimensional slab-geometry SN methods. Pointwise scalar flux solutions are computed for several sample calculations of interest. Sample MAPLE command scripts are provided to illustrate how easily the theory can be translated into a working solution and serve as a complete tool capable of computing analytical SN solutions for mono-energetic, one-dimensional transport problems

  1. Analytical method used for intermediate products in continuous distillation of furfural

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Z.L.; Jia, M.; Wang, L.J.; Deng, Y.X.

    1981-01-01

    During distillation of furfural, analysis of the main components in the crude furfural condensate and intermediate products is very important. Since furfural and methylfurfural are homologous, and both furfural and acetone contain a carbonyl group, the components in the sample must be separated before analysis. An improved analytical method has been studied whose accuracy and precision meet the requirements of industrial standards. The analytical procedure is as follows: the furfural content is determined gravimetrically with barbituric acid; the methanol content is determined by the dichromate method after precipitating furfural and acetone and distilling the liquid for analysis; and the methylfurfural content is determined by the bromide-bromate method, which is suitable only for samples with a higher methylfurfural content. For samples with low content, gas-liquid chromatography can be used. 7 references.

  2. Analytical implications of using practice theory in workplace information literacy research

    DEFF Research Database (Denmark)

    Moring, Camilla Elisabeth; Lloyd, Annemaree

    2013-01-01

    Introduction: This paper considers practice theory and the analytical implications of using this theoretical approach in information literacy research. More precisely, the aim of the paper is to discuss the translation of practice theoretical assumptions into strategies that frame the analytical focus and interest when researching workplace information literacy. Two practice theoretical perspectives are selected, one by Theodore Schatzki and one by Etienne Wenger, and their general commonalities and differences are analysed and discussed. Analysis: The two practice theories and their main ideas of what constitutes practices, how practices frame social life and the central concepts used to explain this, are presented. Then the application of the theories within workplace information literacy research is briefly explored. Results and Conclusion: The two theoretical perspectives share some...

  3. Fluxball magnetic field analysis using a hybrid analytical/FEM/BEM with equivalent currents

    International Nuclear Information System (INIS)

    Fernandes, João F.P.; Camilo, Fernando M.; Machado, V. Maló

    2016-01-01

    In this paper, a fluxball electric machine is analyzed concerning the magnetic flux, force and torque. A novel method is proposed based on a special hybrid FEM/BEM (Finite Element Method/Boundary Element Method) with equivalent currents, using an analytical treatment for the source field determination. The method can be applied to evaluate the magnetic field in axisymmetric problems in the presence of several magnetic materials. Results obtained with a commercial Finite Element Analysis tool are also presented to validate the proposed method. - Highlights: • The Fluxball machine magnetic field is analyzed by a new FEM/BEM/Analytical method. • The method is adequate for axisymmetric non homogeneous magnetic field problems. • The source magnetic field is evaluated considering a non-magnetic equivalent problem. • Material magnetization vectors are accounted by using equivalent currents. • A strong reduction of the finite element domain is achieved.

  4. Solution of the isotopic depletion equation using decomposition method and analytical solution

    International Nuclear Information System (INIS)

    Prata, Fabiano S.; Silva, Fernando C.; Martinez, Aquilino S.

    2011-01-01

    In this paper an analytical calculation of the isotopic depletion equations is proposed, featuring a chain of major isotopes found in a typical PWR reactor. Part of this chain allows feedback reactions of (n,2n) type. The method is based on decoupling the equations describing feedback from the rest of the chain by using the decomposition method, with analytical solutions for the other isotopes present in the chain. The method was implemented in a PWR reactor simulation code, that makes use of the nodal expansion method (NEM) to solve the neutron diffusion equation, describing the spatial distribution of neutron flux inside the reactor core. Because isotopic depletion calculation module is the most computationally intensive process within simulation systems of nuclear reactor core, it is justified to look for a method that is both efficient and fast, with the objective of evaluating a larger number of core configurations in a short amount of time. (author)

  5. Using Google Tag Manager and Google Analytics to track DSpace metadata fields as custom dimensions

    Directory of Open Access Journals (Sweden)

    Suzanna Conrad

    2015-01-01

    Full Text Available DSpace can be problematic for those interested in tracking download and pageview statistics granularly. Some libraries have implemented code to track events on websites and some have experimented with using Google Tag Manager to automate event tagging in DSpace. While these approaches make it possible to track download statistics, granular details such as authors, content types, titles, advisors, and other fields for which metadata exist are generally not tracked in DSpace or Google Analytics without coding. Moreover, it can be time consuming to track and assess pageview data and relate that data back to particular metadata fields. This article will detail the learning process of incorporating custom dimensions for tracking these detailed fields including trial and error attempts to use the data import function manually in Google Analytics, to automate the data import using Google APIs, and finally to automate the collection of dimension data in Google Tag Manager by mimicking SEO practices for capturing meta tags. This specific case study refers to using Google Tag Manager and Google Analytics with DSpace; however, this method may also be applied to other types of websites or systems.
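    The "mimicking SEO practices" approach, reading the citation_* meta tags that DSpace item pages expose, can be illustrated with a small stdlib parser. The page snippet and field values below are hypothetical; in production this scraping runs in the browser via a Google Tag Manager variable rather than in Python.

    ```python
    from html.parser import HTMLParser

    class CitationMetaParser(HTMLParser):
        """Collect <meta name="citation_*" content="..."> tags: the same
        page-level fields a Google Tag Manager variable could scrape and
        forward to Google Analytics as custom dimensions."""
        def __init__(self):
            super().__init__()
            self.fields = {}

        def handle_starttag(self, tag, attrs):
            if tag != "meta":
                return
            a = dict(attrs)
            name = a.get("name", "")
            if name.startswith("citation_"):
                self.fields.setdefault(name, []).append(a.get("content", ""))

    # Minimal stand-in for a DSpace item page head (illustrative values).
    page = """
    <html><head>
    <meta name="citation_title" content="Sample Thesis Title">
    <meta name="citation_author" content="Doe, Jane">
    <meta name="citation_author" content="Roe, Richard">
    <meta name="citation_date" content="2015">
    </head><body></body></html>
    """
    parser = CitationMetaParser()
    parser.feed(page)
    print(parser.fields["citation_title"][0])  # → Sample Thesis Title
    ```

    Collecting the dimensions at page-load time this way avoids the manual or API-driven data imports the article describes trying first.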

  6. Description and principles of use of an automatic control device usable, in particular, in analytical chemistry

    International Nuclear Information System (INIS)

    Rigaudiere, Roger; Jeanmaire, Lucien

    1969-01-01

    This note describes an automatic control device for programming about 20 different functions, chronologically and over a given time. Any voltage can be chosen at the output to perform the different functions. Three examples of use, taken from analytical chemistry, are given to illustrate the possibilities offered by this device, but its domain of use is much more universal and independent of the type of functions [fr

  7. Analytical calculation of the average scattering cross sections using fourier series

    International Nuclear Information System (INIS)

    Palma, Daniel A.P.; Goncalves, Alessandro C.; Martinez, Aquilino S.; Silva, Fernando C. da

    2009-01-01

    The precise determination of the Doppler broadening functions is very important in different applications of reactors physics, mainly in the processing of nuclear data. Analytical approximations are obtained in this paper for average scattering cross section using expansions in Fourier series, generating an approximation that is simple and precise. The results have shown to be satisfactory from the point-of-view of accuracy and do not depend on the type of resonance considered. (author)
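    As a generic illustration of the technique (not the authors' actual expansion of the Doppler-broadened scattering cross section), the sketch below fits a truncated Fourier cosine series to an even, resonance-like test shape and checks the approximation error. The Lorentzian stand-in, interval and grid sizes are all assumptions.

    ```python
    import math

    def lorentzian(x, gamma=1.0):
        # Even, resonance-like test shape (stand-in for a broadened cross section).
        return gamma ** 2 / (x ** 2 + gamma ** 2)

    def cosine_series_coeffs(f, L, n_terms, n_grid=2000):
        """a_k for f(x) ~ a_0/2 + sum_k a_k cos(k*pi*x/L) on [-L, L],
        using trapezoid quadrature and the evenness of f."""
        h = L / n_grid
        coeffs = []
        for k in range(n_terms):
            s = 0.5 * (f(0.0) + f(L) * math.cos(k * math.pi))
            s += sum(f(i * h) * math.cos(k * math.pi * i * h / L)
                     for i in range(1, n_grid))
            coeffs.append(2.0 * s * h / L)
        return coeffs

    def series_eval(coeffs, x, L):
        return coeffs[0] / 2.0 + sum(a * math.cos(k * math.pi * x / L)
                                     for k, a in enumerate(coeffs[1:], start=1))

    L = 10.0
    coeffs = cosine_series_coeffs(lorentzian, L, n_terms=40)
    err = max(abs(series_eval(coeffs, x / 10.0, L) - lorentzian(x / 10.0))
              for x in range(-80, 81))
    print(err)
    ```

    The appeal of the approach mirrors the abstract's point: once the coefficients are tabulated, evaluating the series is cheap and the accuracy does not depend on the particular resonance shape.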

  8. A New Class of Analytic Functions Defined by Using Salagean Operator

    Directory of Open Access Journals (Sweden)

    R. M. El-Ashwah

    2013-01-01

    Full Text Available We derive some results for a new class of analytic functions defined by using Salagean operator. We give some properties of functions in this class and obtain numerous sharp results including for example, coefficient estimates, distortion theorem, radii of star-likeness, convexity, close-to-convexity, extreme points, integral means inequalities, and partial sums of functions belonging to this class. Finally, we give an application involving certain fractional calculus operators that are also considered.

  9. Analytical calculation of the average scattering cross sections using fourier series

    Energy Technology Data Exchange (ETDEWEB)

    Palma, Daniel A.P. [Instituto Federal do Rio de Janeiro, Nilopolis, RJ (Brazil)], e-mail: dpalmaster@gmail.com; Goncalves, Alessandro C.; Martinez, Aquilino S.; Silva, Fernando C. da [Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear], e-mail: asilva@con.ufrj.br, e-mail: agoncalves@con.ufrj.br, e-mail: aquilino@lmp.ufrj.br, e-mail: fernando@con.ufrj.br

    2009-07-01

    The precise determination of the Doppler broadening functions is very important in different applications of reactors physics, mainly in the processing of nuclear data. Analytical approximations are obtained in this paper for average scattering cross section using expansions in Fourier series, generating an approximation that is simple and precise. The results have shown to be satisfactory from the point-of-view of accuracy and do not depend on the type of resonance considered. (author)

  10. Analytical Method Used to Calculate Pile Foundations with the Widening Up on a Horizontal Static Impact

    Science.gov (United States)

    Kupchikova, N. V.; Kurbatskiy, E. N.

    2017-11-01

    This paper presents a methodology for analytical solutions for pile foundations with surface broadening and inclined side faces in the ground mass, based on the properties of the Fourier transform of finite functions. A comparative analysis of the calculation results using the suggested method is described for prismatic piles, prismatic piles with surface broadening, and piles with precast wedges on the surface.

  11. Groundwater Seepage Estimation into Amirkabir Tunnel Using Analytical Methods and DEM and SGR Method

    OpenAIRE

    Hadi Farhadian; Homayoon Katibeh

    2015-01-01

    In this paper, groundwater seepage into the Amirkabir tunnel has been estimated using analytical and numerical methods for 14 different sections of the tunnel. The Site Groundwater Rating (SGR) method was also applied for qualitative and quantitative classification of the tunnel sections. The results of these methods were compared and show reasonable accordance, except for two sections of the tunnel. In these t...
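    Analytical seepage estimates of this kind are commonly based on Goodman-type closed-form expressions; a minimal sketch under the usual assumptions (homogeneous aquifer, steady state, water table depth much larger than the tunnel radius) is shown below. The parameter values are illustrative, not data from the Amirkabir tunnel.

    ```python
    import math

    def tunnel_inflow_per_meter(k, h, r):
        """Goodman-type estimate of steady groundwater inflow per unit
        tunnel length (m^3/s per m) for a circular tunnel of radius r (m)
        at depth h (m) below the water table, in a homogeneous aquifer of
        hydraulic conductivity k (m/s); valid for h >> r."""
        return 2.0 * math.pi * k * h / math.log(2.0 * h / r)

    q = tunnel_inflow_per_meter(k=1e-7, h=100.0, r=3.0)
    print(q)
    ```

    Section-by-section estimates like this are what an SGR-style rating would then bin into qualitative seepage classes.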

  12. Renewable energy integration in smart grids-multicriteria assessment using the fuzzy analytical hierarchy process

    OpenAIRE

    JANJIC, ALEKSANDAR; SAVIC, SUZANA; VELIMIROVIC, LAZAR; NIKOLIC, VESNA

    2015-01-01

    Unlike the traditional way of assessing the efficiency of renewable energy source integration, the smart grid concept introduces new goals and objectives regarding increased use of renewable electricity sources, grid security, energy conservation, energy efficiency, and a deregulated energy market. Possible benefits of renewable source integration are evaluated by the degree to which the grid approaches the ideal smart grid. In this paper, fuzzy analytical hierarchy process methodology for the...

  13. The use of nuclear analytical methods in the investigation of objects of art and historical monuments

    International Nuclear Information System (INIS)

    Janovsky, I.

    2006-01-01

    Special nuclear analytical methods contribute significantly to the identification of the origin, manufacturing technology and/or authenticity of objects of art and historical monuments. Such methods primarily include variants of X-ray fluorescence analysis and activation analysis. The former enables non-destructive testing of materials; the latter features a high sensitivity. The article presents numerous examples of the use of such methods, especially in the Czech Republic (or former Czechoslovakia). (author)

  14. Let's Talk... Analytics

    Science.gov (United States)

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  15. Analytics for Education

    Science.gov (United States)

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  16. Decision support for selecting exportable nuclear technology using the analytic hierarchy process: A Korean case

    International Nuclear Information System (INIS)

    Lee, Deok Joo; Hwang, Jooho

    2010-01-01

    The Korean government plans to increase strategically focused R and D investment in some promising nuclear technology areas to create export opportunities of technology in a global nuclear market. The purpose of this paper is to present a decision support process for selecting promising nuclear technology with the perspective of exportability by using the AHP based on extensive data gathered from nuclear experts in Korea. In this study, the decision criteria for evaluating the export competitiveness of nuclear technologies were determined, and a hierarchical structure for the decision-making process was systematically developed. Subsequently relative weights of decision criteria were derived using AHP methodology and the export competitiveness of nuclear technology alternatives was quantified to prioritize them. We discuss the implications of our results with a viewpoint toward national nuclear technology policy.
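    The core AHP computation, deriving priority weights from a pairwise comparison matrix and checking judgment consistency, can be sketched as follows. The comparison matrix and the criterion names in the comment are hypothetical, not the study's actual criteria or expert judgments.

    ```python
    def ahp_weights(M, iters=100):
        """Principal-eigenvector priorities of a reciprocal pairwise
        comparison matrix via power iteration, plus Saaty's consistency
        ratio (random indices tabulated here only up to n = 5)."""
        n = len(M)
        w = [1.0 / n] * n
        for _ in range(iters):
            v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
            s = sum(v)
            w = [x / s for x in v]
        # lambda_max from M w = lambda_max w, averaged componentwise.
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        lam_max = sum(v[i] / w[i] for i in range(n)) / n
        ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}
        ci = (lam_max - n) / (n - 1)
        cr = ci / ri[n] if ri[n] else 0.0
        return w, cr

    # Illustrative 3-criterion comparison (e.g. market size vs. technical
    # maturity vs. regulatory readiness -- hypothetical criteria).
    M = [[1.0,     3.0,     5.0],
         [1 / 3.0, 1.0,     2.0],
         [1 / 5.0, 1 / 2.0, 1.0]]
    w, cr = ahp_weights(M)
    print([round(x, 3) for x in w], round(cr, 3))
    ```

    A consistency ratio below 0.1 is the conventional threshold for accepting the expert judgments; above it, the pairwise comparisons would be revisited.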

  17. The legal and ethical concerns that arise from using complex predictive analytics in health care.

    Science.gov (United States)

    Cohen, I Glenn; Amarasingham, Ruben; Shah, Anand; Xie, Bin; Lo, Bernard

    2014-07-01

    Predictive analytics, or the use of electronic algorithms to forecast future events in real time, makes it possible to harness the power of big data to improve the health of patients and lower the cost of health care. However, this opportunity raises policy, ethical, and legal challenges. In this article we analyze the major challenges to implementing predictive analytics in health care settings and make broad recommendations for overcoming challenges raised in the four phases of the life cycle of a predictive analytics model: acquiring data to build the model, building and validating it, testing it in real-world settings, and disseminating and using it more broadly. For instance, we recommend that model developers implement governance structures that include patients and other stakeholders starting in the earliest phases of development. In addition, developers should be allowed to use already collected patient data without explicit consent, provided that they comply with federal regulations regarding research on human subjects and the privacy of health information. Project HOPE—The People-to-People Health Foundation, Inc.

  18. Just-in-time Time Data Analytics and Visualization of Climate Simulations using the Bellerophon Framework

    Science.gov (United States)

    Anantharaj, V. G.; Venzke, J.; Lingerfelt, E.; Messer, B.

    2015-12-01

    Climate model simulations are used to understand the evolution and variability of earth's climate. Unfortunately, high-resolution multi-decadal climate simulations can take days to weeks to complete. Typically, the simulation results are not analyzed until the model runs have ended. During the course of the simulation, the output may be processed periodically to ensure that the model is performing as expected. However, most of the data analytics and visualization are not performed until the simulation is finished. The lengthy time period needed for the completion of the simulation constrains the productivity of climate scientists. Our implementation of near real-time data visualization analytics capabilities allows scientists to monitor the progress of their simulations while the model is running. Our analytics software executes concurrently in a co-scheduling mode, monitoring data production. When new data are generated by the simulation, a co-scheduled data analytics job is submitted to render visualization artifacts of the latest results. These visualization outputs are automatically transferred to Bellerophon's data server located at ORNL's Compute and Data Environment for Science (CADES), where they are processed and archived into Bellerophon's database. During the course of the experiment, climate scientists can then use Bellerophon's graphical user interface to view animated plots and their associated metadata. The quick turnaround from the start of the simulation until the data are analyzed permits research decisions and projections to be made days or sometimes even weeks sooner than otherwise possible! The supercomputer resources used to run the simulation are unaffected by co-scheduling the data visualization jobs, so the model runs continuously while the data are visualized. Our just-in-time data visualization software aims to increase climate scientists' productivity as climate modeling moves into the exascale era of computing.

  19. Promise Zones for Applicants

    Data.gov (United States)

    Department of Housing and Urban Development — This tool assists applicants to HUD's Promise Zone initiative in preparing data to submit with their application by allowing applicants to draw the exact location of the...

  20. Analytical Prediction of the Spin Stabilized Satellite's Attitude Using The Solar Radiation Torque

    International Nuclear Information System (INIS)

    Motta, G B; Carvalho, M V; Zanardi, M C

    2013-01-01

    The aim of this paper is to present an analytical solution for the spin motion equations of a spin-stabilized satellite considering only the influence of the solar radiation torque. The theory uses a cylindrical satellite on a circular orbit and considers that the satellite is always illuminated. The average components of this torque were determined over an orbital period. These components are substituted in the spin motion equations in order to get an analytical solution for the right ascension and declination of the satellite spin axis. The time evolution of the pointing deviation of the spin axis was also analyzed. These solutions were numerically implemented and compared with real data of the Brazilian Satellites of Data Collection, SCD1 and SCD2. The results show that the theory is consistent and can be applied to predict the spin motion of spin-stabilized artificial satellites

  1. Text Stream Trend Analysis using Multiscale Visual Analytics with Applications to Social Media Systems

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; Beaver, Justin M [ORNL; BogenII, Paul L. [Google Inc.; Drouhard, Margaret MEG G [ORNL; Pyle, Joshua M [ORNL

    2015-01-01

    In this paper, we introduce a new visual analytics system, called Matisse, that allows exploration of global trends in textual information streams with specific application to social media platforms. Despite the potential for real-time situational awareness using these services, interactive analysis of such semi-structured textual information is a challenge due to the high-throughput and high-velocity properties. Matisse addresses these challenges through the following contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) inferential temporal, geospatial, and term-frequency visualizations, and (4) a flexible drill-down interaction scheme that progresses from macroscale to microscale views. In addition to describing these contributions, our work-in-progress paper concludes with a practical case study focused on the analysis of Twitter 1% sample stream information captured during the week of the Boston Marathon bombings.
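    A minimal sketch of the kind of lexicon-based sentiment scoring such a pipeline might apply to each incoming message. Matisse's actual sentiment/emotion analytics are not described in this abstract, so the lexicon, its weights, and the sample messages below are invented for illustration.

    ```python
    # Hypothetical mini-lexicon; real systems use large scored wordlists
    # with thousands of valence-weighted terms.
    LEXICON = {"great": 1.0, "good": 0.5, "calm": 0.25,
               "bad": -0.5, "awful": -1.0, "panic": -0.75}

    def sentiment(text):
        """Average valence of lexicon words in a message; 0.0 if none hit."""
        words = [w.strip(".,!?").lower() for w in text.split()]
        hits = [LEXICON[w] for w in words if w in LEXICON]
        return sum(hits) / len(hits) if hits else 0.0

    stream = [
        "Great response by the first responders!",
        "Awful news, everyone is in a panic.",
        "Road closures downtown this afternoon.",
    ]
    scores = [sentiment(msg) for msg in stream]
    print(scores)
    ```

    In a streaming deployment, per-message scores like these would be aggregated into the temporal and geospatial views the paper describes.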

  2. Study on possibility of plasma current profile determination using an analytical model of tokamak equilibrium

    International Nuclear Information System (INIS)

    Moriyama, Shin-ichi; Hiraki, Naoji

    1996-01-01

    The possibility of determining the current profile of a tokamak plasma from the external magnetic measurements alone is investigated using an analytical model of tokamak equilibrium. The model, which is based on an approximate solution of the Grad-Shafranov equation, can set a plasma current profile expressed with four free parameters: the total plasma current, the poloidal beta, the plasma internal inductance and the axial safety factor. The analysis done with this model indicates that, for a D-shaped plasma, the boundary poloidal magnetic field prescribing the external magnetic field distribution depends on the axial safety factor even when the boundary safety factor and the plasma internal inductance are kept constant. This suggests that the plasma current profile can be inversely determined from the external magnetic analysis. The possibility and the limitations of current profile determination are discussed on the basis of this analytical result. (author)

  3. Simulation of reactive geochemical transport in groundwater using a semi-analytical screening model

    Science.gov (United States)

    McNab, Walt W.

    1997-10-01

    A reactive geochemical transport model, based on a semi-analytical solution to the advective-dispersive transport equation in two dimensions, is developed as a screening tool for evaluating the impact of reactive contaminants on aquifer hydrogeochemistry. Because the model utilizes an analytical solution to the transport equation, it is less computationally intensive than models based on numerical transport schemes, is faster, and it is not subject to numerical dispersion effects. Although the assumptions used to construct the model preclude consideration of reactions between the aqueous and solid phases, thermodynamic mineral saturation indices are calculated to provide qualitative insight into such reactions. Test problems involving acid mine drainage and hydrocarbon biodegradation signatures illustrate the utility of the model in simulating essential hydrogeochemical phenomena.
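
    The paper's two-dimensional semi-analytical solution is not reproduced here; as an illustration of the kind of closed-form building block such screening models rest on, the classic one-dimensional Ogata-Banks solution of the advection-dispersion equation for a continuous source can be sketched as follows (all parameter values are hypothetical):

```python
import numpy as np
from scipy.special import erfc

def ogata_banks(x, t, v, D, c0):
    """1D advection-dispersion with a continuous source c0 at x = 0
    (Ogata-Banks solution); v = seepage velocity, D = dispersion coeff."""
    sq = 2.0 * np.sqrt(D * t)
    return 0.5 * c0 * (erfc((x - v * t) / sq)
                       + np.exp(v * x / D) * erfc((x + v * t) / sq))

x = np.linspace(0.0, 50.0, 101)                    # distance downgradient, m
c = ogata_banks(x, t=100.0, v=0.2, D=1.0, c0=1.0)  # snapshot at t = 100 d
```

    Because the solution is evaluated directly at each (x, t), there is no grid and hence no numerical dispersion, which is the screening advantage the abstract describes.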

  4. Improving Wind Turbine Drivetrain Reliability Using a Combined Experimental, Computational, and Analytical Approach

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; van Dam, J.; Bergua, R.; Jove, J.; Campbell, J.

    2015-03-01

    Nontorque loads induced by the wind turbine rotor overhang weight and aerodynamic forces can greatly affect drivetrain loads and responses. If not addressed properly, these loads can result in a decrease in gearbox component life. This work uses analytical modeling, computational modeling, and experimental data to evaluate a unique drivetrain design that minimizes the effects of nontorque loads on gearbox reliability: the Pure Torque® drivetrain developed by Alstom. The drivetrain has a hub-support configuration that transmits nontorque loads directly into the tower rather than through the gearbox as in other design approaches. An analytical model of Alstom's Pure Torque® drivetrain provides insight into the relationships among turbine component weights, aerodynamic forces, and the resulting drivetrain loads. Main shaft bending loads are orders of magnitude lower than the rated torque and are hardly affected by wind conditions and turbine operations.

  5. The Analytical Evaluation Of Three-Center Magnetic Multipole Moment Integrals By Using Slater Type Orbitals

    International Nuclear Information System (INIS)

    Oztekin, E.

    2010-01-01

    In this study, magnetic multipole moment integrals are calculated by using Slater type orbitals (STOs), Fourier transforms and translation formulas. First, the multipole moment operators appearing in the three-center magnetic multipole moment integrals are translated from the 0-center to the b-center, so that the three-center magnetic multipole moment integrals are reduced to two-center integrals. The resulting analytical expressions are then written in terms of overlap integrals. Matrix representations for the x-, y- and z-components of the multipole moments were composed, and each component was evaluated analytically. Finally, the magnetic multipole moment integrals are also given in terms of the same and different screening parameters.

  6. Designing for Student-Facing Learning Analytics

    Science.gov (United States)

    Kitto, Kirsty; Lupton, Mandy; Davis, Kate; Waters, Zak

    2017-01-01

    Despite a narrative that sees learning analytics (LA) as a field that aims to enhance student learning, few student-facing solutions have emerged. This can make it difficult for educators to imagine how data can be used in the classroom, and in turn diminishes the promise of LA as an enabler for encouraging important skills such as sense-making,…

  7. The Usefulness of Analytical Procedures - An Empirical Approach in the Auditing Sector in Portugal

    Directory of Open Access Journals (Sweden)

    Carlos Pinho

    2014-08-01

    The conceptual conflict between efficiency and efficacy in financial auditing arises from the fact that resources are scarce, both in terms of the time available to carry out the audit and the quality and timeliness of the information available to the external auditor. Audits tend to be more efficient the lower the combination of inherent risk and control risk is assessed to be, allowing the auditor to carry out less extensive and less timely auditing tests; in some cases, therefore, analytical audit procedures are a good tool to support the opinions formed by the auditor. This research, by means of an empirical study of financial auditing in Portugal, aims to evaluate the extent to which analytical procedures are used during a financial audit engagement in Portugal, throughout the different phases involved in auditing. The conclusions point to the fact that, in general terms and regardless of the size of the audit company and the way in which professionals work, Portuguese auditors use analytical procedures more frequently during the planning phase than during the phases of evidence gathering and opinion formation.

  8. Evaluation of challenges of wood imports to Iran using Fuzzy Delphi Analytical Hierarchy Process

    Directory of Open Access Journals (Sweden)

    amin arian

    2017-08-01

    Considering the increasing consumption of wood and wood products in Iran, the limited domestic sources of wood, and the shortage of wood raw material, imports are a solution for the wood procurement of Iran's developing wood industries. However, wood imports to Iran have always faced many challenges. The aim of this research is to determine and evaluate the challenges in the way of wood imports to Iran. The research method used in this study is descriptive-analytic. The analytic method used to evaluate the challenges is the Fuzzy Delphi Analytical Hierarchy Process (FDAHP). First, the findings of previous research in the field and the literature were studied and, through interviews with industry experts, the challenges in the way of wood imports to Iran were extracted, classified into 5 groups and 35 factors, and evaluated. The results show that at the first level (groups), the regulation, economic, politic, infrastructure and management groups are the most important, in that order. At the second level (challenges), plant protection regulations are the most important; after that, exchange rate tolerance, oil income, banking support and GDP are the most important, respectively.

  9. The BTWorld use case for big data analytics : Description, MapReduce logical workflow, and empirical evaluation

    NARCIS (Netherlands)

    Hegeman, T.; Ghit, B.; Capota, M.; Hidders, A.J.H.; Epema, D.H.J.; Iosup, A.

    2013-01-01

    The commoditization of big data analytics, that is, the deployment, tuning, and future development of big data processing platforms such as MapReduce, relies on a thorough understanding of relevant use cases and workloads. In this work we propose BTWorld, a use case for time-based big data analytics

  10. Seamless Digital Environment – Plan for Data Analytics Use Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna Helene [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bly, Aaron Douglas [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    The U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program initiated research into what is needed to provide a roadmap or model for Nuclear Power Plants to reference when building an architecture that can support the growing data supply and demand flowing through their networks. The Digital Architecture project published the report Digital Architecture Planning Model (Oxstrand et al., 2016), which discusses what to consider when building an architecture to support the increasing needs and demands for data throughout the plant. Once the plant is able to support the data demands, it still needs to be able to provide the data in an easy, quick, and reliable way. A common approach is to create a “one stop shop” application that a user can go to for all the data they need. This in turn requires a Seamless Digital Environment (SDE) to integrate all the “siloed” data. An SDE presents data to users by gathering it from any data source (e.g., legacy applications and work management systems) without effort by the user. The goal for FY16 was to complete a feasibility study of data mining and analytics for employing information from computer-based procedures enabled technologies for use in developing improved business analytics. The research team collaborated with multiple organizations to identify use cases or scenarios that could be beneficial to investigate in a feasibility study. Many interesting potential use cases were identified throughout the FY16 activity. Unfortunately, due to factors outside the research team's control, none of the studies were initiated this year. However, the insights gained and the relationships built with both PVNGS and NextAxiom will be valuable when moving forward with future research. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting it was identified that it would be very beneficial to the industry to

  11. Simulation of an Electromagnetic Acoustic Transducer Array by Using Analytical Method and FDTD

    Directory of Open Access Journals (Sweden)

    Yuedong Xie

    2016-01-01

    Previously, we developed a method based on FEM and FDTD for the study of an Electromagnetic Acoustic Transducer Array (EMAT). This paper presents a new analytical solution to the eddy current problem for the meander coil used in an EMAT, adapted from the classic Deeds and Dodd solution originally intended for circular coils. The analytical solution resulting from this novel adaptation exploits the large-radius extrapolation and shows several advantages over the finite element method (FEM), especially in the higher frequency regime. The calculated Lorentz force density from the analytical EM solver is then coupled to the ultrasonic simulations, which exploit the finite-difference time-domain (FDTD) method to describe the propagation of ultrasound waves, in particular Rayleigh waves. A radiation pattern obtained by applying the Hilbert transform to time-domain waveforms is proposed to characterise the sensor in terms of its beam directivity and field distribution along the steering angle, which can produce performance parameters for an EMAT array, facilitating the optimum design of such sensors.
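
    The paper couples its analytical EM solver to a 2D FDTD ultrasound model, which is not reproduced here. Purely as a minimal sketch of the FDTD time-stepping idea, a 1D scalar wave equation advanced with a leapfrog update might look like this (grid size, pulse shape, and CFL number are arbitrary assumptions, not the paper's setup):

```python
import numpy as np

# Minimal 1D FDTD (leapfrog) for the scalar wave equation u_tt = c^2 u_xx,
# with fixed (Dirichlet) ends. Illustrative only: the paper's solver is 2D
# and models Rayleigh-wave propagation.
nx, nt = 200, 300
c, dx = 1.0, 1.0
dt = 0.5 * dx / c                  # Courant number 0.5 < 1: stable
r2 = (c * dt / dx) ** 2

u = np.exp(-0.1 * (np.arange(nx) - 50.0) ** 2)   # initial Gaussian pulse
u_prev = u.copy()                                # zero initial velocity

for _ in range(nt):
    u_next = np.zeros(nx)
    u_next[1:-1] = (2.0 * u[1:-1] - u_prev[1:-1]
                    + r2 * (u[2:] - 2.0 * u[1:-1] + u[:-2]))
    u_prev, u = u, u_next          # boundary values stay at zero
```

    With zero initial velocity the pulse splits into two half-amplitude waves travelling in opposite directions, the textbook behaviour the scheme should reproduce.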

  12. Analysis and synthesis of bianisotropic metasurfaces by using analytical approach based on equivalent parameters

    Science.gov (United States)

    Danaeifar, Mohammad; Granpayeh, Nosrat

    2018-03-01

    An analytical method is presented to analyze and synthesize bianisotropic metasurfaces. The equivalent parameters of metasurfaces in terms of meta-atom properties and other specifications of metasurfaces are derived. These parameters are related to electric, magnetic, and electromagnetic/magnetoelectric dipole moments of the bianisotropic media, and they can simplify the analysis of complicated and multilayer structures. A metasurface of split ring resonators is studied as an example demonstrating the proposed method. The optical properties of the meta-atom are explored, and the calculated polarizabilities are applied to find the reflection coefficient and the equivalent parameters of the metasurface. Finally, a structure consisting of two metasurfaces of the split ring resonators is provided, and the proposed analytical method is applied to derive the reflection coefficient. The validity of this analytical approach is verified by full-wave simulations which demonstrate good accuracy of the equivalent parameter method. This method can be used in the analysis and synthesis of bianisotropic metasurfaces with different materials and in different frequency ranges by considering electric, magnetic, and electromagnetic/magnetoelectric dipole moments.

  13. Prioritizing the countries for BOT nuclear power project using Analytic Hierarchy Process

    International Nuclear Information System (INIS)

    Choi, Sun Woo; Roh, Myung Sub

    2013-01-01

    This paper proposes factors influencing the success of BOT nuclear power projects and a method of weighting them using the Analytic Hierarchy Process (AHP) to find the optimal country in which a developer intends to develop. In summary, this analytic method enables the developer to select and focus on the country with the most favorable circumstances, enhancing the efficiency of project promotion by minimizing opportunity cost. It also enables the developer to quantify qualitative factors, diversifying the project success strategy and policy for the targeted country. Although the performance of this study is limited by time constraints, small sampling and the confidentiality of materials, the analytic model can still be improved more systematically through further study with more data. Developing a Build-Own(or Operate)-Transfer (BOT) nuclear power project, which carries large capital over the long term, initially requires well-made multi-criteria decisions that guard against risks arising from unexpected situations in the targeted countries. Moreover, a nuclear power project is in most cases practically implemented through government-to-government cooperation, so the key concern for such a project naturally focuses on the country situation rather than project viability at the planning stage. In this regard, it requires evaluation of targeted countries before becoming involved in the project, comprehensive and proper decision making across complex judgment factors, and efficient integration of experts' opinions. Therefore, prioritizing countries and evaluating their feasibility to identify the optimal project region is a very meaningful study.
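
    The core AHP computation the paper relies on, deriving criteria weights from a pairwise comparison matrix via the principal eigenvector and checking judgment consistency, can be sketched as follows. The three criteria and the judgment values are hypothetical, not taken from the study:

```python
import numpy as np

# AHP sketch: priority weights from a pairwise comparison matrix
# (Saaty's eigenvector method) plus the consistency ratio.
# Criteria (hypothetical): country risk, grid readiness, financing terms.
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 3.0],
              [1/5.0, 1/3.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)          # principal eigenvalue lambda_max
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                         # normalized priority weights

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)  # consistency index
cr = ci / 0.58                        # random index RI = 0.58 for n = 3
```

    A consistency ratio below 0.1 is the conventional threshold for accepting the expert judgments; country scores on each criterion would then be aggregated with these weights.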

  14. Use of analytical electron microscopy and auger electron spectroscopy for evaluating materials

    International Nuclear Information System (INIS)

    Jones, R.H.; Bruemmer, S.M.; Thomas, M.T.; Baer, D.R.

    1982-11-01

    Analytical electron microscopy (AEM) can be used to characterize the microstructure and microchemistry of materials over dimensions of less than 10 nm, while Auger electron spectroscopy (AES) can be used to characterize the chemical composition of surfaces and interfaces to a depth of less than 1 nm. Frequently, the information gained from both instruments can be coupled to give new insight into the behavior of materials. Examples of the use of AEM and AES to characterize segregation, sensitization and radiation damage are presented. A short description of the AEM and AES techniques is given.

  15. Development of collaborative-creative learning model using virtual laboratory media for instrumental analytical chemistry lectures

    Science.gov (United States)

    Zurweni, Wibawa, Basuki; Erwin, Tuti Nurian

    2017-08-01

    The framework for teaching and learning in the 21st century was prepared with the 4Cs criteria. Learning that provides opportunities for the development of students' optimal creative skills is achieved by implementing collaborative learning, in which learners are challenged to compete, to work independently to bring either individual or group excellence, and to master the learning material. A virtual laboratory is used as the medium for Instrumental Analytical Chemistry (Vis, UV-Vis, AAS, etc.) lectures through simulated computer applications and as a substitute for the laboratory if the equipment and instruments are not available. This research aims to design and develop a collaborative-creative learning model using virtual laboratory media for Instrumental Analytical Chemistry lectures and to assess the effectiveness of this model, adapting the Dick & Carey and Hannafin & Peck models. The development steps of this model are: needs analysis, design of collaborative-creative learning, virtual laboratory media using Macromedia Flash, formative evaluation, and testing of the learning model's effectiveness. The stages of the collaborative-creative learning model are: apperception, exploration, collaboration, creation, evaluation, feedback. The collaborative-creative learning model using virtual laboratory media can be used to improve the quality of learning in the classroom and to overcome the limited availability of lab instruments for real instrumental analysis. Formative test results show that the developed collaborative-creative learning model meets the requirements. The effectiveness test on students' pretest and posttest scores is significant at the 95% confidence level (the computed t-value exceeds the critical t-value). It can be concluded that this learning model is effective for Instrumental Analytical Chemistry lectures.

  16. Analytical and policy issues in energy economics: Uses of the FRS data base

    Science.gov (United States)

    1981-12-01

    The relevant literature concerning several major analytical and policy issues in energy economics is reviewed and criticized. The possible uses of the Financial Reporting System (FRS) data base for the analysis of energy policy issues are investigated. Certain features of FRS data suggest several ways in which the data base can be used by policy makers. FRS data are collected on the firm level, and different segments of the same firm operating in different markets can be separately identified. The methods of collection as well as FRS's elaborate data verification process guarantee a high degree of accuracy and consistency among firms.

  17. Analytical review of modern herbal medicines used in musculoskeletal system diseases

    Directory of Open Access Journals (Sweden)

    Анна Ігорівна Крюкова

    2015-10-01

    Effective and safe treatment of musculoskeletal system diseases is one of the main branches of medicine in general and rheumatology in particular. The relevance of this problem is caused mainly by its high incidence in the population and by the development of temporary and permanent work disability in patients. The duration of rheumatologic diseases necessitates selecting an optimal regimen that provides effective treatment and helps to prevent potential side effects associated with long-term use of remedies. Aim of research. The aim of our research was to perform an analytical review of modern herbal products registered in Ukraine and used for musculoskeletal system treatment. The drugs were analyzed according to the following parameters: producing country, manufacturer, dosage form, and origin of remedies (natural or synthetic). Methods. Conventional analytical studies of electronic and paper sources were used to address the given problem. Results. As a result of the analytical review of modern herbal remedies registered in Ukraine and used for musculoskeletal system treatment, it was found that 20 trade names of drugs, more than 90% of which are homeopathic, are present on the pharmaceutical market. Concerning dosage forms, pills (38.5%), injection solutions (23.1%) and oral drops (11.5%) gain the biggest market share. Conclusion. It was found that imported drugs are widely available (80%) in the analyzed market segment, while local remedies gain a rather minor market share (about 20%). Among the medicines of this group presented on the Ukrainian market, imported homeopathic remedies gain the biggest share. Phytotherapeutic drugs gain a minor market share and have a limited composition of natural active ingredients, represented by extracts of Harpagophytum procumbens, Apium graveolens, Salix alba, and Zingiber officinale.

  18. Consistent constitutive modeling of metallic target penetration using empirical, analytical, and numerical penetration models

    Directory of Open Access Journals (Sweden)

    John (Jack) P. Riegel III

    2016-04-01

    Historically, there has been little correlation between the material properties used in (1) empirical formulae, (2) analytical formulations, and (3) numerical models. The various regressions and models may each provide excellent agreement for the depth of penetration into semi-infinite targets, but the input parameters for the empirically based procedures may have little in common with either the analytical model or the numerical model. This paper builds on previous work by Riegel and Anderson (2014) to show how the Effective Flow Stress (EFS) strength model, based on empirical data, can be used as the average flow stress in the analytical Walker–Anderson Penetration model (WAPEN) (Anderson and Walker, 1991) and how the same value may be utilized as an effective von Mises yield strength in numerical hydrocode simulations to predict the depth of penetration for eroding projectiles at impact velocities in the mechanical response regime of the materials. The method has the benefit of allowing the three techniques (empirical, analytical, and numerical) to work in tandem. The empirical method can be used for many shot line calculations, while more advanced analytical or numerical models can be employed when necessary to address specific geometries, such as edge effects or layering, that are not treated by the simpler methods. Developing complete constitutive relationships for a material can be costly; if the only concern is depth of penetration, such a level of detail may not be required. The effective flow stress can be determined from a small set of depth-of-penetration experiments in many cases, especially for long penetrators such as the L/D = 10 ones considered here, making it a very practical approach. In the process of performing this effort, the authors considered numerical simulations by other researchers based on the same set of experimental data that the authors used for their empirical and analytical assessment. The goals were to establish a

  19. Analytical procedure in aseismic design of eccentric structure using response spectrum

    International Nuclear Information System (INIS)

    Takemori, T.; Kuwabara, Y.; Suwabe, A.; Mitsunobu, S.

    1977-01-01

    In this paper, the responses are evaluated by the following two methods, using typical torsional analytical models in which the masses, rigidities, eccentricities between their centers, and several actual earthquake waves are taken as parameters: (1) the root mean square of responses using the response spectra derived from the earthquake waves, and (2) time history analysis using the earthquake waves. The earthquake waves are chosen to present different frequency contents and magnitudes of the response spectra. The typical results derived from the study are as follows: (a) the response accelerations of the mass center in the input earthquake direction by method (1) agree comparatively well with those by method (2); (b) the response accelerations perpendicular to the input earthquake direction by method (1) are 2 to 3 times those by method (2); (c) the amplification of the response accelerations at arbitrary points distributed over the spread mass, relative to those at the center of the lumped mass, is remarkably large by method (1) compared with method (2) in both directions. These problems with response spectrum analysis for the above-mentioned eccentric structure are discussed, and an improved analytical method, applying the amplification coefficients of responses derived from this parametric time history analysis, is proposed for actual seismic design using the given design ground response spectrum with the root mean square technique.
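
    Method (1) above, the "root mean square of responses", combines the peak response of each mode read from a response spectrum; in seismic practice this is commonly the square-root-of-sum-of-squares (SRSS) rule. A minimal sketch with hypothetical modal peaks:

```python
import numpy as np

# SRSS combination of peak modal responses taken from a response spectrum,
# i.e., method (1) in the abstract. The modal peak values are hypothetical.
peak_modal_acc = np.array([3.0, 4.0, 1.0])   # peak acceleration per mode, m/s^2
srss = np.sqrt(np.sum(peak_modal_acc ** 2))  # combined design response
```

    Because the combination discards the relative phase of the modes, which a time history analysis (method 2) retains, discrepancies like findings (b) and (c) for eccentric structures can arise.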

  20. Enantioselective Analytical- and Preparative-Scale Separation of Hexabromocyclododecane Stereoisomers Using Packed Column Supercritical Fluid Chromatography

    Directory of Open Access Journals (Sweden)

    Nicole Riddell

    2016-11-01

    Hexabromocyclododecane (HBCDD) is an additive brominated flame retardant which has been listed in Annex A of the Stockholm Convention for elimination of production and use. It has been reported to persist in the environment and has the potential for enantiomer-specific degradation, accumulation, or both, making enantioselective analyses increasingly important. The six main stereoisomers of technical HBCDD (i.e., the (+) and (−) enantiomers of α-, β-, and γ-HBCDD) were separated and isolated for the first time using enantioselective packed column supercritical fluid chromatography (pSFC) separation methods on a preparative scale. Characterization was completed using published chiral liquid chromatography (LC) methods and elution profiles, as well as X-ray crystallography, and the isolated fractions were definitively identified. Additionally, the resolution of the enantiomers, along with two minor components of the technical product (δ- and ε-HBCDD), was investigated on an analytical scale using both LC and pSFC separation techniques, and changes in elution order were highlighted. Baseline separation of all HBCDD enantiomers was achieved by pSFC on an analytical scale using a cellulose-based column. The described method emphasizes the potential associated with pSFC as a green method of isolating and analyzing environmental contaminants of concern.

  1. Enantioselective Analytical- and Preparative-Scale Separation of Hexabromocyclododecane Stereoisomers Using Packed Column Supercritical Fluid Chromatography.

    Science.gov (United States)

    Riddell, Nicole; Mullin, Lauren Gayle; van Bavel, Bert; Ericson Jogsten, Ingrid; McAlees, Alan; Brazeau, Allison; Synnott, Scott; Lough, Alan; McCrindle, Robert; Chittim, Brock

    2016-11-10

    Hexabromocyclododecane (HBCDD) is an additive brominated flame retardant which has been listed in Annex A of the Stockholm Convention for elimination of production and use. It has been reported to persist in the environment and has the potential for enantiomer-specific degradation, accumulation, or both, making enantioselective analyses increasingly important. The six main stereoisomers of technical HBCDD (i.e., the (+) and (-) enantiomers of α-, β-, and γ-HBCDD) were separated and isolated for the first time using enantioselective packed column supercritical fluid chromatography (pSFC) separation methods on a preparative scale. Characterization was completed using published chiral liquid chromatography (LC) methods and elution profiles, as well as X-ray crystallography, and the isolated fractions were definitively identified. Additionally, the resolution of the enantiomers, along with two minor components of the technical product (δ- and ε-HBCDD), was investigated on an analytical scale using both LC and pSFC separation techniques, and changes in elution order were highlighted. Baseline separation of all HBCDD enantiomers was achieved by pSFC on an analytical scale using a cellulose-based column. The described method emphasizes the potential associated with pSFC as a green method of isolating and analyzing environmental contaminants of concern.

  2. Analytical method for the identification and assay of 12 phthalates in cosmetic products: application of the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques".

    Science.gov (United States)

    Gimeno, Pascal; Maggio, Annie-Françoise; Bousquet, Claudine; Quoirez, Audrey; Civade, Corinne; Bonnet, Pierre-Antoine

    2012-08-31

    Esters of phthalic acid, more commonly named phthalates, may be present in cosmetic products as ingredients or contaminants. Their presence as contaminants can be due to the manufacturing process, to the raw materials used, or to the migration of phthalates from packaging when plastic (polyvinyl chloride, PVC) is used. Eight phthalates (DBP, DEHP, BBP, DMEP, DnPP, DiPP, DPP, and DiBP), classified H360 or H361, are forbidden in cosmetics according to the European regulation on cosmetics 1223/2009. A GC/MS method was developed for the assay of 12 phthalates in cosmetics, including the 8 regulated phthalates. Analyses are carried out on a GC/MS system in electron impact ionization mode (EI). The separation of phthalates is obtained on a cross-linked 5%-phenyl/95%-dimethylpolysiloxane capillary column, 30 m × 0.25 mm (i.d.) × 0.25 μm film thickness, using a temperature gradient. Phthalate quantification is performed by external calibration using an internal standard. Validation elements obtained on standard solutions highlight a satisfactory system conformity (resolution > 1.5), a common quantification limit of 0.25 ng injected, an acceptable linearity between 0.5 μg mL⁻¹ and 5.0 μg mL⁻¹, as well as a precision and an accuracy in agreement with in-house specifications. Cosmetic samples ready for analytical injection are analyzed after dilution in ethanol, whereas more complex cosmetic matrices, like milks and creams, are assayed after a liquid/liquid extraction using tert-butyl methyl ether (TBME). Depending on the type of cosmetic analyzed, the common limits of quantification for the 12 phthalates were set at 0.5 or 2.5 μg g⁻¹. All samples were assayed using the analytical approach described in the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques". This analytical protocol is particularly adapted when it is not possible to make reconstituted sample matrices. Copyright © 2012
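
    External calibration with an internal standard, the quantification scheme named in the abstract, fits the analyte/IS area ratio against standard concentrations and inverts the fitted line for unknowns. A sketch with illustrative numbers (not the paper's data):

```python
import numpy as np

# External calibration with an internal standard (IS): the calibration curve
# relates the area ratio A_analyte / A_IS to analyte concentration.
# All areas and concentrations below are illustrative.
conc = np.array([0.5, 1.0, 2.0, 5.0])     # standard concentrations, ug/mL
ratio = np.array([0.24, 0.51, 0.99, 2.52])  # measured A_analyte / A_IS

slope, intercept = np.polyfit(conc, ratio, 1)  # linear calibration fit

# Quantify an unknown sample from its measured area ratio
unknown_ratio = 1.40
unknown_conc = (unknown_ratio - intercept) / slope
```

    Ratioing to the internal standard compensates for injection-volume and extraction-recovery variability, which matters for the milk and cream matrices assayed after liquid/liquid extraction.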

  3. Application and analytical verification of peracetic acid use in different types of freshwater aquaculture systems

    DEFF Research Database (Denmark)

    Pedersen, Lars-Flemming

    2011-01-01

    of water sanitation with PAA application were used to analytically verify actual PAA concentrations under real conditions in different kinds of aquaculture systems. A characteristic instant disinfection demand was found to be significantly positively related to water COD content, and PAA half-lives were found to be on the order of a few minutes. The study revealed that PAA degrades so rapidly that insufficient disinfection is a likely outcome. The observations have applications for optimizing water treatment strategies with PAA. The investigations also indicated that the rapid degradation and hence
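
    With half-lives of a few minutes, first-order decay implies that very little PAA remains after a typical contact time, consistent with the insufficient-disinfection concern above. A sketch assuming first-order kinetics and hypothetical dose and half-life values:

```python
import numpy as np

# First-order PAA decay sketch. The dose and half-life are illustrative
# assumptions, chosen to match the "few minutes" half-life scale reported.
c0 = 1.0                  # initial PAA dose, mg/L
t_half = 3.0              # assumed half-life, minutes
k = np.log(2.0) / t_half  # first-order rate constant, 1/min

t = 15.0                  # contact time, minutes (five half-lives)
c = c0 * np.exp(-k * t)   # residual PAA concentration
```

    After five half-lives only about 3% of the dose remains, which illustrates why dosing strategy and contact-time placement matter in recirculating systems.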

  4. Approximate analytical solution of diffusion equation with fractional time derivative using optimal homotopy analysis method

    Directory of Open Access Journals (Sweden)

    S. Das

    2013-12-01

    In this article, the optimal homotopy analysis method is used to obtain an approximate analytic solution of the time-fractional diffusion equation with a given initial condition. The fractional derivatives are considered in the Caputo sense. Unlike the usual homotopy analysis method, this method contains at most three convergence-control parameters, which provide faster convergence of the solution. The effects of the parameters on the convergence of the approximate series solution, obtained by minimizing the averaged residual error with proper choices of parameters, are calculated numerically and presented through graphs and tables for different particular cases.

  5. Sensitivity analysis of technological, economic and sustainability evaluation of power plants using the analytic hierarchy process

    International Nuclear Information System (INIS)

    Chatzimouratidis, Athanasios I.; Pilavachi, Petros A.

    2009-01-01

    Technological, economic and sustainability evaluation of power plants by use of the analytic hierarchy process and nine end node criteria, for a reference scenario based on subjective criteria weighting, was presented in a previous paper by the authors. However, variations in criteria weights may substantially modify the overall evaluations and rankings of power plants. The current paper presents a sensitivity analysis with four alternative scenarios (sets of criteria weights) compared with the reference scenario. The results show that priority to 'technology and sustainability' favors renewable energy power plants, while priority to 'economic' criteria favors mainly nuclear power plants and, to a lesser extent, the four types of fossil fuel power plants.

  6. Design of laser-generated shockwave experiments. An approach using analytic models

    International Nuclear Information System (INIS)

    Lee, Y.T.; Trainor, R.J.

    1980-01-01

    Two of the target-physics phenomena which must be understood before a clean experiment can be confidently performed are preheating due to suprathermal electrons and shock decay due to a shock-rarefaction interaction. Simple analytic models are described for these two processes, and the predictions of these models are compared with those of the LASNEX fluid physics code. We have approached this work not with the view of surpassing or even approaching the reliability of the code calculations, but rather with the aim of providing simple models which may be used for quick parameter-sensitivity evaluations, while providing physical insight into the problems.

  7. Chinese Culture, Homosexuality Stigma, Social Support and Condom Use: A Path Analytic Model.

    Science.gov (United States)

    Liu, Hongjie; Feng, Tiejian; Ha, Toan; Liu, Hui; Cai, Yumao; Liu, Xiaoli; Li, Jian

    2011-01-01

    PURPOSE: The objective of this study was to examine the interrelationships among individualism, collectivism, homosexuality-related stigma, social support, and condom use among Chinese homosexual men. METHODS: A cross-sectional study using the respondent-driven sampling approach was conducted among 351 participants in Shenzhen, China. Path analytic modeling was used to analyze the interrelationships. RESULTS: The results of path analytic modeling document the following statistically significant associations with regard to homosexuality: (1) higher levels of vertical collectivism were associated with higher levels of public stigma [β (standardized coefficient) = 0.12] and self stigma (β = 0.12); (2) higher levels of vertical individualism were associated with higher levels of self stigma (β = 0.18); (3) higher levels of horizontal individualism were associated with higher levels of public stigma (β = 0.12); (4) higher levels of self stigma were associated with higher levels of social support from sexual partners (β = 0.12); and (5) lower levels of public stigma were associated with consistent condom use (β = -0.19). CONCLUSIONS: The findings enhance our understanding of how individualist and collectivist cultures influence the development of homosexuality-related stigma, which in turn may affect individuals' decisions to engage in HIV-protective practices and seek social support. Accordingly, the development of HIV interventions for homosexual men in China should take the characteristics of Chinese culture into consideration.

  8. Human eye analytical and mesh-geometry models for ophthalmic dosimetry using MCNP6

    International Nuclear Information System (INIS)

    Angelocci, Lucas V.; Fonseca, Gabriel P.; Yoriyaz, Helio

    2015-01-01

    Eye tumors can be treated with brachytherapy using Co-60 plaques, I-125 seeds, among other materials. The human eye has regions particularly vulnerable to ionizing radiation (e.g. the crystalline lens), and dosimetry for these regions must be performed carefully. A mathematical model of the eye anatomy was proposed in the past [1] for use in Monte Carlo simulations to account for dose distribution in ophthalmic brachytherapy. The model includes descriptions of internal structures of the eye that were not treated in previous works. The aim of the present work was to develop a new eye model based on the mesh geometries of the MCNP6 code. The methodology utilized the ABAQUS/CAE (Simulia 3DS) software to build the mesh geometry. For this work, an ophthalmic applicator containing up to 24 model Amersham 6711 I-125 seeds (Oncoseed) was used, positioned in contact with a generic tumor defined analytically inside the eye. The absorbed dose in eye structures such as the cornea, sclera, choroid, retina, vitreous body, lens, optic nerve and optic nerve wall was calculated using both models: analytical and mesh. (author)

  9. Human eye analytical and mesh-geometry models for ophthalmic dosimetry using MCNP6

    Energy Technology Data Exchange (ETDEWEB)

    Angelocci, Lucas V.; Fonseca, Gabriel P.; Yoriyaz, Helio, E-mail: hyoriyaz@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    Eye tumors can be treated with brachytherapy using Co-60 plaques, I-125 seeds, among other materials. The human eye has regions particularly vulnerable to ionizing radiation (e.g. the crystalline lens), and dosimetry for these regions must be performed carefully. A mathematical model of the eye anatomy was proposed in the past [1] for use in Monte Carlo simulations to account for dose distribution in ophthalmic brachytherapy. The model includes descriptions of internal structures of the eye that were not treated in previous works. The aim of the present work was to develop a new eye model based on the mesh geometries of the MCNP6 code. The methodology utilized the ABAQUS/CAE (Simulia 3DS) software to build the mesh geometry. For this work, an ophthalmic applicator containing up to 24 model Amersham 6711 I-125 seeds (Oncoseed) was used, positioned in contact with a generic tumor defined analytically inside the eye. The absorbed dose in eye structures such as the cornea, sclera, choroid, retina, vitreous body, lens, optic nerve and optic nerve wall was calculated using both models: analytical and mesh. (author)

  10. Chinese Culture, Homosexuality Stigma, Social Support and Condom Use: A Path Analytic Model

    Science.gov (United States)

    Liu, Hongjie; Feng, Tiejian; Ha, Toan; Liu, Hui; Cai, Yumao; Liu, Xiaoli; Li, Jian

    2011-01-01

    Purpose The objective of this study was to examine the interrelationships among individualism, collectivism, homosexuality-related stigma, social support, and condom use among Chinese homosexual men. Methods A cross-sectional study using the respondent-driven sampling approach was conducted among 351 participants in Shenzhen, China. Path analytic modeling was used to analyze the interrelationships. Results The results of path analytic modeling document the following statistically significant associations with regard to homosexuality: (1) higher levels of vertical collectivism were associated with higher levels of public stigma [β (standardized coefficient) = 0.12] and self stigma (β = 0.12); (2) higher levels of vertical individualism were associated with higher levels of self stigma (β = 0.18); (3) higher levels of horizontal individualism were associated with higher levels of public stigma (β = 0.12); (4) higher levels of self stigma were associated with higher levels of social support from sexual partners (β = 0.12); and (5) lower levels of public stigma were associated with consistent condom use (β = −0.19). Conclusions The findings enhance our understanding of how individualist and collectivist cultures influence the development of homosexuality-related stigma, which in turn may affect individuals’ decisions to engage in HIV-protective practices and seek social support. Accordingly, the development of HIV interventions for homosexual men in China should take the characteristics of Chinese culture into consideration. PMID:21731850

  11. Analytical solutions to trade-offs between size of protected areas and land-use intensity.

    Science.gov (United States)

    Butsic, Van; Radeloff, Volker C; Kuemmerle, Tobias; Pidgeon, Anna M

    2012-10-01

    Land-use change is affecting Earth's capacity to support both wild species and a growing human population. The question is how best to manage landscapes for both species conservation and economic output. If large areas are protected to conserve species richness, then the unprotected areas must be used more intensively. Likewise, low-intensity use leaves less area protected but may allow wild species to persist in areas that are used for market purposes. This dilemma is present in policy debates on agriculture, housing, and forestry. Our goal was to develop a theoretical model to evaluate which land-use strategy maximizes economic output while maintaining species richness. Our theoretical model extends previous analytical models by allowing land-use intensity on unprotected land to influence species richness in protected areas. We devised general models in which species richness (with modified species-area curves) and economic output (a Cobb-Douglas production function) are a function of land-use intensity and the proportion of land protected. Economic output increased as land-use intensity and extent increased, and species richness responded to increased intensity either negatively or following the intermediate disturbance hypothesis. We solved the model analytically to identify the combination of land-use intensity and protected area that provided the maximum amount of economic output, given a target level of species richness. The land-use strategy that maximized economic output while maintaining species richness depended jointly on the response of species richness to land-use intensity and protection and the effect of land use outside protected areas on species richness within protected areas. Regardless of the land-use strategy, species richness tended to respond to changing land-use intensity and extent in a highly nonlinear fashion. ©2012 Society for Conservation Biology.
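As a sketch of the kind of model described, the snippet below pairs an illustrative Cobb-Douglas output with a modified species-area curve and grid-searches for the output-maximizing combination of intensity and protected share subject to a richness target. All functional forms and constants are invented stand-ins, not the paper's calibration.

```python
import numpy as np

# Minimal sketch of the protected-area / land-use-intensity trade-off.
def economic_output(intensity, protected, alpha=0.6, beta=0.4):
    """Cobb-Douglas output from intensity and the unprotected land share."""
    return (intensity ** alpha) * ((1.0 - protected) ** beta)

def species_richness(intensity, protected, z=0.25):
    """Species-area curve on protected land, degraded by off-reserve intensity."""
    return (protected ** z) * (1.0 - 0.5 * intensity)

target = 0.6  # required species-richness level (illustrative)
best = None
for intensity in np.linspace(0.01, 1.0, 100):
    for protected in np.linspace(0.01, 0.99, 99):
        if species_richness(intensity, protected) >= target:
            y = economic_output(intensity, protected)
            if best is None or y > best[0]:
                best = (y, intensity, protected)

y_opt, i_opt, p_opt = best  # output-maximizing strategy meeting the target
```

The paper solves this class of problem analytically; the brute-force grid search here is only meant to make the constrained-optimization structure visible.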

  12. Competing on analytics.

    Science.gov (United States)

    Davenport, Thomas H

    2006-01-01

    We all know the power of the killer app. It's not just a support tool; it's a strategic weapon. Companies questing for killer apps generally focus all their firepower on the one area that promises to create the greatest competitive advantage. But a new breed of organization has upped the stakes: Amazon, Harrah's, Capital One, and the Boston Red Sox have all dominated their fields by deploying industrial-strength analytics across a wide variety of activities. At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the few remaining points of differentiation--and analytics competitors wring every last drop of value from those processes. Employees hired for their expertise with numbers or trained to recognize their importance are armed with the best evidence and the best quantitative tools. As a result, they make the best decisions. In companies that compete on analytics, senior executives make it clear--from the top down--that analytics is central to strategy. Such organizations launch multiple initiatives involving complex data and statistical analysis, and quantitative activity is managed at the enterprise (not departmental) level. In this article, professor Thomas H. Davenport lays out the characteristics and practices of these statistical masters and describes some of the very substantial changes other companies must undergo in order to compete on quantitative turf. As one would expect, the transformation requires a significant investment in technology, the accumulation of massive stores of data, and the formulation of company-wide strategies for managing the data. But, at least as important, it also requires executives' vocal, unswerving commitment and willingness to change the way employees think, work, and are treated.

  13. Comparison of Left Ventricular Hypertrophy by Electrocardiography and Echocardiography in Children Using Analytics Tool.

    Science.gov (United States)

    Tague, Lauren; Wiggs, Justin; Li, Qianxi; McCarter, Robert; Sherwin, Elizabeth; Weinberg, Jacqueline; Sable, Craig

    2018-05-17

    Left ventricular hypertrophy (LVH) is a common finding on pediatric electrocardiography (ECG), leading to many referrals for echocardiography (echo). This study utilizes a novel analytics tool that combines ECG and echo databases to evaluate ECG as a screening tool for LVH. A SQL Server 2012 data warehouse incorporated the ECG and echo databases for all patients from a single institution from 2006 to 2016. Customized queries identified patients 0-18 years old with LVH on ECG and an echo performed within 24 h. Using data visualization (Tableau) and analytic (Stata 14) software, ECG and echo findings were compared. Of 437,699 encounters, 4637 met the inclusion criteria. ECG had high sensitivity (≥ 90%) but poor specificity (43%) and low positive predictive value (< 20%) for echo abnormalities. ECG performed only 11-22% better than chance (AROC = 0.50). Overall, 83% of subjects with LVH on ECG had normal left ventricle (LV) structure and size on echo. African-Americans with LVH were least likely to have an abnormal echo. There was a low correlation between V6R on ECG and the echo-derived Z score of LV diastolic diameter (r = 0.14) and LV mass index (r = 0.24). The data analytics client was able to mine a database of ECG and echo reports, comparing LVH by ECG with LV measurements and qualitative findings by echo, and identifying an abnormal LV by echo in only 17% of cases with LVH on ECG. This novel tool is useful for rapid data mining for both clinical and research endeavors.
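The screening statistics reported here follow from a standard confusion-matrix calculation. The sketch below shows that arithmetic on a few synthetic ECG/echo pairs; the records are invented, not the study's data.

```python
# Synthetic (lvh_on_ecg, abnormal_echo) pairs - illustrative only.
records = [
    (True, True), (True, True), (True, False), (True, False),
    (True, False), (False, False), (False, False),
]

tp = sum(1 for ecg, echo in records if ecg and echo)          # true positives
fp = sum(1 for ecg, echo in records if ecg and not echo)      # false positives
fn = sum(1 for ecg, echo in records if not ecg and echo)      # false negatives
tn = sum(1 for ecg, echo in records if not ecg and not echo)  # true negatives

sensitivity = tp / (tp + fn)  # share of echo abnormalities flagged by ECG
specificity = tn / (tn + fp)  # share of normal echos correctly not flagged
ppv = tp / (tp + fp)          # chance an ECG flag reflects a true abnormality
```

As in the study, a screen can combine high sensitivity with poor specificity and low positive predictive value when most flagged cases turn out normal.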

  14. Edge detection of magnetic anomalies using analytic signal of tilt angle (ASTA)

    Science.gov (United States)

    Alamdar, K.; Ansari, A. H.; Ghorbani, A.

    2009-04-01

    Magnetics is a commonly used geophysical technique to identify and image potential subsurface targets. Interpretation of magnetic anomalies is a complex process due to the superposition of multiple magnetic sources, the presence of geologic and cultural noise, and acquisition and positioning errors. Both the vertical and horizontal derivatives of potential field data are useful: the horizontal derivative enhances edges, whereas the vertical derivative narrows the width of an anomaly and so locates source bodies more accurately. The vertical and horizontal derivatives of the magnetic field can be combined into the analytic signal, which is independent of the body magnetization direction and whose maximum value lies directly over the edges of the body. The tilt angle filter is a phase-based filter, defined as the angle between the vertical derivative and the total horizontal derivative. The tilt angle ranges from +90 degrees to -90 degrees, and its zero value lies over the body edge. One disadvantage of this filter is that for deep sources the detected edge is blurred. To overcome this problem, many authors have introduced new filters, such as the total horizontal derivative of the tilt angle or the vertical derivative of the tilt angle; because these filters use high-order derivatives, their results may be too noisy. Combining the analytic signal and the tilt angle produces a new filter, termed ASTA, whose maximum value lies directly over the body edge; it delineates the body edge more easily than the tilt angle and without its complexity. In this work, the new filter has been demonstrated on magnetic data from an area in the Sar-Cheshme region of Iran. This area is located at 55 degrees longitude and 32 degrees latitude and is a copper potential region. The main formations in this area are andesite and trachyandesite. Magnetic surveying was employed to separate the boundaries of the andesite and trachyandesite from the adjacent area. In this regard, a variety of filters, such as the analytic signal, tilt angle and ASTA filters, have been applied.
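The derivative-based filters discussed above can be sketched on a synthetic grid. In this toy example the field, its depth parameter, and the finite-difference stand-in for the vertical derivative are all invented; the full ASTA filter, which also needs the vertical derivative of the tilt, is not computed here.

```python
import numpy as np

# Synthetic point-source-like anomaly on a regular grid (illustrative only).
x = np.linspace(-10.0, 10.0, 201)
y = np.linspace(-10.0, 10.0, 201)
X, Y = np.meshgrid(x, y)
Z0 = 2.0  # hypothetical source depth

def anomaly(z):
    return z / (X**2 + Y**2 + z**2) ** 1.5

field = anomaly(Z0)
h = 1e-4
dz = (anomaly(Z0 + h) - anomaly(Z0 - h)) / (2.0 * h)  # toy vertical derivative
dy_, dx_ = np.gradient(field, y, x)                    # horizontal derivatives

thdr = np.hypot(dx_, dy_)                          # total horizontal derivative
analytic_signal = np.sqrt(dx_**2 + dy_**2 + dz**2)  # magnetization-direction independent
tilt = np.arctan2(dz, thdr)                         # tilt angle, bounded in [-pi/2, +pi/2]
```

The bounded range of the tilt angle and the peak of the analytic-signal amplitude over the source are the two properties the abstract relies on.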

  15. Improvements in Off Design Aeroengine Performance Prediction Using Analytic Compressor Map Interpolation

    Science.gov (United States)

    Misté, Gianluigi Alberto; Benini, Ernesto

    2012-06-01

    Compressor map interpolation is usually performed through the introduction of auxiliary coordinates (β). In this paper, a new analytical bivariate β function definition to be used in compressor map interpolation is studied. The function has user-defined parameters that must be adjusted to properly fit a single map. The analytical nature of β allows for rapid calculation of the interpolation error estimate, which can be used as a quantitative measure of interpolation accuracy and also as a valid tool to compare traditional β function interpolation with new approaches (artificial neural networks, genetic algorithms, etc.). The quality of the method is analyzed by comparing the error output to that of a well-known state-of-the-art methodology. This comparison is carried out for two different types of compressor and, in both cases, the error output using the method presented in this paper is found to be consistently lower. Moreover, an optimization routine able to locally minimize the interpolation error by shape variation of the β function is implemented. Further optimization introducing other important criteria is discussed.
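As a simplified picture of auxiliary-coordinate interpolation, the sketch below parameterizes each speed line of a compressor map by a β coordinate and interpolates bilinearly in (speed, β). The β grid and map values are invented, and the paper's analytical β function is not reproduced.

```python
import numpy as np

speed_lines = np.array([0.8, 0.9, 1.0])    # corrected speed values (illustrative)
beta = np.linspace(0.0, 1.0, 5)            # auxiliary coordinate along each line

# Mass flow tabulated on the (speed, beta) grid - illustrative numbers.
mass_flow = np.array([[10.0, 11.0, 12.0, 13.0, 14.0],
                      [12.0, 13.5, 15.0, 16.5, 18.0],
                      [14.0, 16.0, 18.0, 20.0, 22.0]])

def interp_map(speed, b):
    """Bilinear interpolation in (speed, beta): first along each speed line,
    then across speed lines."""
    row = np.array([np.interp(b, beta, mass_flow[i])
                    for i in range(len(speed_lines))])
    return float(np.interp(speed, speed_lines, row))

wc = interp_map(0.85, 0.5)  # between the 0.8 and 0.9 speed lines at mid-beta
```

The β coordinate exists precisely because interpolating a map directly in (speed, pressure ratio) is ill-conditioned where speed lines run nearly vertical; parameterizing each line makes the lookup single-valued.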

  16. An analytical framework to assist decision makers in the use of forest ecosystem model predictions

    Science.gov (United States)

    Larocque, Guy R.; Bhatti, Jagtar S.; Ascough, J.C.; Liu, J.; Luckai, N.; Mailly, D.; Archambault, L.; Gordon, Andrew M.

    2011-01-01

    The predictions from most forest ecosystem models originate from deterministic simulations. However, few evaluation exercises for model outputs are performed by either model developers or users. This issue has important consequences for decision makers using these models to develop natural resource management policies, as they cannot evaluate the extent to which predictions stemming from the simulation of alternative management scenarios may result in significant environmental or economic differences. Various numerical methods, such as sensitivity/uncertainty analyses, or bootstrap methods, may be used to evaluate models and the errors associated with their outputs. However, the application of each of these methods carries unique challenges which decision makers do not necessarily understand; guidance is required when interpreting the output generated from each model. This paper proposes a decision flow chart in the form of an analytical framework to help decision makers apply, in an orderly fashion, different steps involved in examining the model outputs. The analytical framework is discussed with regard to the definition of problems and objectives and includes the following topics: model selection, identification of alternatives, modelling tasks and selecting alternatives for developing policy or implementing management scenarios. Its application is illustrated using an on-going exercise in developing silvicultural guidelines for a forest management enterprise in Ontario, Canada.

  17. A geographic information system for gas power plant location using analytical hierarchy process and fuzzy logic

    International Nuclear Information System (INIS)

    Alavipoor, F. S.; Karimi, S.; Balist, J.; Khakian, A. H.

    2016-01-01

    This research recommends a geographic information system-based, multi-criteria evaluation for locating a gas power plant in Natanz City in Iran. The multi-criteria decision framework offers a hierarchy model for selecting a suitable place for a gas power plant. This framework includes the analytic hierarchy process, fuzzy set theory and weighted linear combination. The analytic hierarchy process was applied to compare the importance of criteria among hierarchy elements classified by environmental group criteria. In the next step, fuzzy logic was used to regulate the criteria through various fuzzy membership functions, and fuzzy layers were formed by using fuzzy operators in the Arc-GIS environment. Subsequently, they were categorized into six classes using the reclassify function. Then, weighted linear combination was applied to combine the research layers. Finally, the two approaches were analyzed to find the most suitable place to set up a gas power plant. According to the results, the GAMMA fuzzy operator was shown to be suitable for this site selection.

  18. An active learning representative subset selection method using net analyte signal

    Science.gov (United States)

    He, Zhonghai; Ma, Zhenhe; Luan, Jingmin; Cai, Xi

    2018-05-01

    To guarantee accurate predictions, representative samples are needed when building a calibration model for spectroscopic measurements. However, in general, it is not known whether a sample is representative prior to measuring its concentration, which is both time-consuming and expensive. In this paper, a method to determine whether a sample should be selected into a calibration set is presented. The selection is based on the difference in the Euclidean norm of the net analyte signal (NAS) vector between the candidate and existing samples. First, the concentrations and spectra of a group of samples are used to compute the projection matrix, NAS vectors, and scalar values. Next, the NAS vectors of candidate samples are computed by multiplying the projection matrix with the spectra of the samples. The scalar value of the NAS is obtained by norm computation. The distance between the candidate set and the selected set is computed, and the samples with the largest distance are added to the selected set sequentially. Last, the concentration of the analyte is measured so that the sample can be used as a calibration sample. Using a validation test, it is shown that the presented method is more efficient than random selection. As a result, the amount of time and money spent on reference measurements is greatly reduced.
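A minimal sketch of the selection idea follows, under the assumption that the NAS projection annihilates known interferent spectra; the matrices here are random stand-ins, not real spectra, and the greedy farthest-distance rule is a simplified reading of the sequential step described above.

```python
import numpy as np

rng = np.random.default_rng(0)

K = rng.normal(size=(5, 20))                  # hypothetical interferent spectra (rows)
P = np.eye(20) - K.T @ np.linalg.pinv(K.T)    # projector annihilating interferent space

spectra = rng.normal(size=(30, 20))           # candidate sample spectra
nas_norm = np.linalg.norm(spectra @ P.T, axis=1)  # scalar NAS value per sample

# Greedy selection: repeatedly add the candidate whose NAS value is farthest
# from everything already selected.
selected = [int(np.argmax(nas_norm))]
for _ in range(4):
    dist = np.array([min(abs(nas_norm[i] - nas_norm[j]) for j in selected)
                     for i in range(len(nas_norm))])
    dist[selected] = -1.0                     # never re-select a chosen sample
    selected.append(int(np.argmax(dist)))
```

Only the samples in `selected` would then have their reference concentrations measured, which is where the claimed savings over random selection come from.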

  19. Hazardous Waste Landfill Siting using GIS Technique and Analytical Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Ozeair Abessi

    2010-07-01

    Full Text Available Disposal of the large amounts of hazardous waste generated in power plants has always received communities' and authorities' attention. In this paper, using a site screening method and the Analytical Hierarchy Process (AHP), a sophisticated approach for siting a hazardous waste landfill in large areas is presented. This approach demonstrates how evaluation criteria such as physical, socio-economic, technical and environmental criteria, and their regulatory sub-criteria, can be introduced into an overlay technique to screen a limited number of appropriate zones in the area. Then, to find the optimal site among the primary screened sites, a Multiple Criteria Decision Making (MCDM) method for the hierarchy computations of the process is recommended. Using the introduced method, an accurate siting procedure for environmental planning of landfills in an area is enabled. In this study, the approach was utilized for disposal of the hazardous wastes of the Shahid Rajaee thermal power plant, located in Qazvin province in the west central part of Iran. As a result, 10 suitable zones were first screened in the area; then, using the analytical hierarchy process, a site near the power plant was chosen as the optimal site for landfilling of the hazardous wastes in Qazvin province.

  20. Applied research on air pollution using nuclear-related analytical techniques

    International Nuclear Information System (INIS)

    1994-01-01

    A co-ordinated research programme (CRP) on applied research on air pollution using nuclear-related techniques is a global CRP which will run from 1992-1996, and will build upon the experience gained by the Agency from the laboratory support that it has been providing for several years to BAPMoN - the Background Air Pollution Monitoring Network programme organized under the auspices of the World Meteorological Organization. The purpose of this CRP is to promote the use of nuclear analytical techniques in air pollution studies, e.g. NAA, XRF, and PIXE for the analysis of toxic and other trace elements in suspended particulate matter (including air filter samples), rainwater and fog-water samples, and in biological indicators of air pollution (e.g. lichens and mosses). The main purposes of the core programme are i) to support the use of nuclear and nuclear-related analytical techniques for practically-oriented research and monitoring studies on air pollution, ii) to identify major sources of air pollution affecting each of the participating countries, with particular reference to toxic heavy metals, and iii) to obtain comparative data on pollution levels in areas of high pollution (e.g. a city centre or a populated area downwind of a large pollution source) and low pollution (e.g. rural areas). This document reports the discussions held during the first Research Co-ordination Meeting (RCM) for the CRP, which took place at IAEA Headquarters in Vienna. Refs, figs and tabs

  1. New Analytical Solution of the Equilibrium Ampere's Law Using the Walker's Method: a Didactic Example

    Science.gov (United States)

    Sousa, A. N. Laurindo; Ojeda-González, A.; Prestes, A.; Klausner, V.; Caritá, L. A.

    2018-02-01

    This work aims to demonstrate the analytical solution of the Grad-Shafranov (GS) equation, or generalized Ampere's law, which is important in studies of self-consistent 2.5-D solutions for current sheet structures. A detailed mathematical development is presented to obtain the generating function as shown by Walker (RSPSA 91, 410, 1915). Therefore, we study the general solution of the GS equation in terms of Walker's generating function in detail, without omitting any step. Walker's generating function g(ζ) is written in a new way as the tangent of an unspecified function K(ζ). In this trend, the general solution of the GS equation is expressed as exp(-2Ψ) = 4|K'(ζ)|²/cos²[K(ζ) - K(ζ*)]. In order to investigate whether our proposal would simplify the mathematical effort to find new generating functions, we use Harris's solution as a test; in this case K(ζ) = arctan(exp(iζ)). In summary, one of the article's purposes is to present a review of Harris's solution. In an attempt to find a simplified solution, we propose a new way to write the GS solution using g(ζ) = tan(K(ζ)). We also present a new analytical solution to the equilibrium Ampere's law using g(ζ) = cosh(bζ), which includes a generalization of the Harris model and presents isolated magnetic islands.

  2. Using Analytics to Support Petabyte-Scale Science on the NASA Earth Exchange (NEX)

    Science.gov (United States)

    Votava, P.; Michaelis, A.; Ganguly, S.; Nemani, R. R.

    2014-12-01

    and other attributes that can then be analyzed as a part of the NEX knowledge graph and used to greatly improve advanced search capabilities. Overall, we see data analytics at all levels as an important part of NEX as we are continuously seeking improvements in data management, workflow processing, use of resources, usability and science acceleration.

  3. Analytics to Better Interpret and Use Large Amounts of Heterogeneous Data

    Science.gov (United States)

    Mathews, T. J.; Baskin, W. E.; Rinsland, P. L.

    2014-12-01

    Data scientists at NASA's Atmospheric Science Data Center (ASDC) are seasoned software application developers who have worked with the creation, archival, and distribution of large datasets (multiple terabytes and larger). For ASDC data scientists to effectively implement the most efficient processes for cataloging and organizing data access applications, they must be intimately familiar with the data contained in the datasets with which they are working. Key technologies that are critical components of the ASDC data scientists' background include: large RDBMSs (relational database management systems) and NoSQL databases; web services; service-oriented architectures; structured and unstructured data access; and processing algorithms. However, as the prices of data storage and processing decrease, sources of data increase, and technologies advance - granting more people access to data at real or near-real time - data scientists are being pressured to accelerate their ability to identify and analyze vast amounts of data. With existing tools this is becoming exceedingly challenging to accomplish. For example, NASA's Earth Science Data and Information System (ESDIS) alone grew from having just over 4 PB of data in 2009 to nearly 6 PB in 2011. This amount then increased to roughly 10 PB in 2013. With data from at least ten new missions to be added to the ESDIS holdings by 2017, the current volume will continue to grow exponentially and drive the need to analyze more data even faster. Though there are many highly efficient, off-the-shelf analytics tools available, these tools mainly cater to business data, which is predominantly unstructured. There are very few known analytics tools that interface well to archived Earth science data, which is predominantly heterogeneous and structured. This presentation will identify use cases for data analytics from an Earth science perspective in order to begin to identify

  4. A Novel Analytical Model for Network-on-Chip using Semi-Markov Process

    Directory of Open Access Journals (Sweden)

    WANG, J.

    2011-02-01

    Full Text Available Network-on-Chip (NoC) communication architecture is proposed to resolve the bottleneck of multi-processor communication in a single chip. In this paper, a performance analytical model using a Semi-Markov Process (SMP) is presented to obtain the NoC performance. More precisely, given the related parameters, the SMP is used to describe the behavior of each channel, and the header flit routing time on each channel can be calculated by analyzing the SMP. Then, the average packet latency in the NoC can be calculated. The accuracy of our model is illustrated through simulation. Indeed, the experimental results show that the proposed model can be used to obtain NoC performance, and it performs better than state-of-the-art models. Therefore, our model can be used as a useful tool to guide the NoC design process.

  5. Mastering JavaScript promises

    CERN Document Server

    Hussain, Muzzamil

    2015-01-01

    This book is for all software and web engineers wanting to apply the promises paradigm to their next project and get the best outcome from it. It also acts as a reference for engineers who are already using promises in their projects and want to improve their current knowledge to reach the next level. To get the most benefit from this book, you should know basic programming concepts, be familiar with JavaScript, and have a good understanding of HTML.

  6. Identifying bioaccumulative halogenated organic compounds using a nontargeted analytical approach: seabirds as sentinels.

    Directory of Open Access Journals (Sweden)

    Christopher J Millow

    Full Text Available Persistent organic pollutants (POPs) are typically monitored via targeted mass spectrometry, which potentially identifies only a fraction of the contaminants actually present in environmental samples. With new anthropogenic compounds continuously introduced to the environment, novel and proactive approaches that provide a comprehensive alternative to targeted methods are needed in order to more completely characterize the diversity of known and unknown compounds likely to cause adverse effects. Nontargeted mass spectrometry attempts to extensively screen for compounds, providing a feasible approach for identifying contaminants that warrant future monitoring. We employed a nontargeted analytical method using comprehensive two-dimensional gas chromatography coupled to time-of-flight mass spectrometry (GC×GC/TOF-MS) to characterize halogenated organic compounds (HOCs) in California Black skimmer (Rynchops niger) eggs. Our study identified 111 HOCs; 84 of these compounds were regularly detected via targeted approaches, while 27 were classified as typically unmonitored or unknown. Typically unmonitored compounds of note in the bird eggs included tris(4-chlorophenyl)methane (TCPM), tris(4-chlorophenyl)methanol (TCPMOH), triclosan, permethrin, and heptachloro-1'-methyl-1,2'-bipyrrole (MBP), as well as four halogenated unknown compounds that could not be identified through database searching or the literature. The presence of these compounds in Black skimmer eggs suggests they are persistent, bioaccumulative, potentially biomagnifying, and maternally transferring. Our results highlight the utility and importance of employing nontargeted analytical tools to assess true contaminant burdens in organisms, as well as demonstrate the value of using environmental sentinels to proactively identify novel contaminants.

  7. Detection of mercury(II) ions using colorimetric gold nanoparticles on paper-based analytical devices.

    Science.gov (United States)

    Chen, Guan-Hua; Chen, Wei-Yu; Yen, Yu-Chun; Wang, Chia-Wei; Chang, Huan-Tsung; Chen, Chien-Fu

    2014-07-15

    An on-field colorimetric sensing strategy employing gold nanoparticles (AuNPs) and a paper-based analytical platform was investigated for mercury ion (Hg(2+)) detection at water sources. By utilizing thymine-Hg(2+)-thymine (T-Hg(2+)-T) coordination chemistry, label-free detection oligonucleotide sequences were attached to unmodified gold nanoparticles to provide rapid mercury ion sensing without complicated and time-consuming thiolated or other costly labeled probe preparation processes. Not only is this strategy's sensing mechanism specific to Hg(2+) rather than other metal ions, but the conformational change in the detection oligonucleotide sequences also introduces different degrees of AuNP aggregation, causing the color of the AuNPs to vary. To eliminate the use of sophisticated equipment and minimize the power requirement for data analysis and transmission, the color variance of multiple detection results was transferred and concentrated on cellulose-based paper analytical devices, and the data were subsequently transmitted for readout and storage using cloud computing via a smartphone. As a result, a detection limit of 50 nM for Hg(2+) spiked pond and river water could be achieved. Furthermore, multiple tests could be performed simultaneously with a 40 min turnaround time. These results suggest that the proposed platform possesses the capability for sensitive and high-throughput on-site mercury pollution monitoring in resource-constrained settings.

  8. Determination of alpha-naphthol by an oscillating chemical reaction using the analyte pulse perturbation technique

    International Nuclear Information System (INIS)

    Yang Wu; Sun Kanjun; Lv Weilian; Bo Lili; He Xiaoyan; Suo Nan; Gao Jinzhang

    2005-01-01

    An analytical method for the determination of alpha-naphthol (α-NP) is proposed, based on the sequential perturbation caused by different amounts of alpha-naphthol on the oscillating chemical system involving the Cu(II)-catalyzed oscillating reaction between hydrogen peroxide and sodium thiocyanate in an alkaline medium, with the aid of a continuous-flow stirred tank reactor (CSTR). The method relies on the linear relationship between the changes in the oscillation amplitude of the chemical system and the concentration of alpha-naphthol. The use of the analyte pulse perturbation technique permits sequential determinations in the same oscillating system owing to the expeditiousness with which the steady state is regained after each perturbation. The calibration curve obeys a linear equation very well when the concentration of alpha-naphthol is over the range 0.034-530 μmol/L (r = 0.9991). Influences of temperature, injection points, flow rate and reaction variables on the oscillating system are investigated in detail, and the possible mechanism of action of alpha-naphthol on the chemical oscillating system is also discussed. The method has been successfully used for the determination of α-naphthol in carbaryl hydrolysates.
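
    The amplitude-based calibration this record describes is, at its core, an ordinary least-squares line fit relating amplitude change to analyte concentration. The sketch below illustrates that workflow on hypothetical data; the concentrations, amplitude changes, and the unknown-sample response are invented for illustration and are not taken from the paper.

```python
# Least-squares calibration sketch (hypothetical data): the method relies on a
# linear relation between the change in oscillation amplitude and the
# alpha-naphthol concentration.
def fit_line(x, y):
    """Ordinary least squares fit y = a*x + b; returns (slope, intercept, r)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    a = sxy / sxx
    return a, my - a * mx, sxy / (sxx * syy) ** 0.5

# Hypothetical calibration points: concentration (umol/L) vs. amplitude change.
conc = [0.05, 1.0, 10.0, 100.0, 300.0, 530.0]
delta_amp = [0.4, 2.1, 19.8, 201.0, 598.0, 1061.0]
slope, intercept, r = fit_line(conc, delta_amp)

# Invert the calibration line to estimate an unknown sample's concentration.
unknown_response = 150.0  # hypothetical perturbation amplitude
estimated_conc = (unknown_response - intercept) / slope
```

In practice each perturbation would be injected into the CSTR and the amplitude change read from the oscillation trace; the inversion step above is the routine determination once the calibration line is established.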

  9. Analytical reverse time migration: An innovation in imaging of infrastructures using ultrasonic shear waves.

    Science.gov (United States)

    Asadollahi, Aziz; Khazanovich, Lev

    2018-04-11

    The emergence of ultrasonic dry point contact (DPC) transducers that emit horizontal shear waves has enabled efficient collection of high-quality data in the context of a nondestructive evaluation of concrete structures. This offers an opportunity to improve the quality of evaluation by adapting advanced imaging techniques. Reverse time migration (RTM) is a simulation-based reconstruction technique that offers advantages over conventional methods, such as the synthetic aperture focusing technique. RTM is capable of imaging boundaries and interfaces with steep slopes and the bottom boundaries of inclusions and defects. However, this imaging technique requires a massive amount of memory and its computation cost is high. In this study, both bottlenecks of the RTM are resolved when shear transducers are used for data acquisition. An analytical approach was developed to obtain the source and receiver wavefields needed for imaging using reverse time migration. It is shown that the proposed analytical approach not only eliminates the high memory demand, but also drastically reduces the computation time from days to minutes. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. CALIBRATION OF SEMI-ANALYTIC MODELS OF GALAXY FORMATION USING PARTICLE SWARM OPTIMIZATION

    International Nuclear Information System (INIS)

    Ruiz, Andrés N.; Domínguez, Mariano J.; Yaryura, Yamila; Lambas, Diego García; Cora, Sofía A.; Martínez, Cristian A. Vega-; Gargiulo, Ignacio D.; Padilla, Nelson D.; Tecce, Tomás E.; Orsi, Álvaro; Arancibia, Alejandra M. Muñoz

    2015-01-01

    We present a fast and accurate method to select an optimal set of parameters in semi-analytic models of galaxy formation and evolution (SAMs). Our approach compares the results of a model against a set of observables applying a stochastic technique called Particle Swarm Optimization (PSO), a self-learning algorithm for localizing regions of maximum likelihood in multidimensional spaces that outperforms traditional sampling methods in terms of computational cost. We apply the PSO technique to the SAG semi-analytic model combined with merger trees extracted from a standard Lambda Cold Dark Matter N-body simulation. The calibration is performed using a combination of observed galaxy properties as constraints, including the local stellar mass function and the black hole to bulge mass relation. We test the ability of the PSO algorithm to find the best set of free parameters of the model by comparing the results with those obtained using a MCMC exploration. Both methods find the same maximum likelihood region; however, the PSO method requires one order of magnitude fewer evaluations. This new approach allows a fast estimation of the best-fitting parameter set in multidimensional spaces, providing a practical tool to test the consequences of including other astrophysical processes in SAMs.
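
    The PSO technique referenced above can be illustrated with a minimal, self-contained sketch. This is a generic particle swarm maximizing a toy two-dimensional log-likelihood, not the SAG calibration itself; the swarm size, iteration count, and the inertia and acceleration weights are illustrative assumptions.

```python
import random

# Toy log-likelihood with its peak at (1.0, -2.0); in the paper's setting this
# would be the fit of the SAM's outputs to the observational constraints.
def log_like(p):
    return -((p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2)

random.seed(42)
n_particles, n_steps, dim = 20, 200, 2
w, c1, c2 = 0.7, 1.5, 1.5            # inertia, cognitive and social weights

pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
vel = [[0.0] * dim for _ in range(n_particles)]
pbest = [p[:] for p in pos]          # each particle's best position so far
gbest = max(pbest, key=log_like)[:]  # swarm-wide best position

for _ in range(n_steps):
    for i in range(n_particles):
        for d in range(dim):
            # Velocity update: inertia plus stochastic pulls toward the
            # particle's own best and the swarm's best positions.
            vel[i][d] = (w * vel[i][d]
                         + c1 * random.random() * (pbest[i][d] - pos[i][d])
                         + c2 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if log_like(pos[i]) > log_like(pbest[i]):
            pbest[i] = pos[i][:]
            if log_like(pbest[i]) > log_like(gbest):
                gbest = pbest[i][:]
```

Each evaluation of `log_like` stands in for one full SAM run, which is why reducing the number of evaluations by an order of magnitude relative to MCMC matters in this application.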

  11. CALIBRATION OF SEMI-ANALYTIC MODELS OF GALAXY FORMATION USING PARTICLE SWARM OPTIMIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Ruiz, Andrés N.; Domínguez, Mariano J.; Yaryura, Yamila; Lambas, Diego García [Instituto de Astronomía Teórica y Experimental, CONICET-UNC, Laprida 854, X5000BGR, Córdoba (Argentina); Cora, Sofía A.; Martínez, Cristian A. Vega-; Gargiulo, Ignacio D. [Consejo Nacional de Investigaciones Científicas y Técnicas, Rivadavia 1917, C1033AAJ Buenos Aires (Argentina); Padilla, Nelson D.; Tecce, Tomás E.; Orsi, Álvaro; Arancibia, Alejandra M. Muñoz, E-mail: andresnicolas@oac.uncor.edu [Instituto de Astrofísica, Pontificia Universidad Católica de Chile, Av. Vicuña Mackenna 4860, Santiago (Chile)

    2015-03-10

    We present a fast and accurate method to select an optimal set of parameters in semi-analytic models of galaxy formation and evolution (SAMs). Our approach compares the results of a model against a set of observables applying a stochastic technique called Particle Swarm Optimization (PSO), a self-learning algorithm for localizing regions of maximum likelihood in multidimensional spaces that outperforms traditional sampling methods in terms of computational cost. We apply the PSO technique to the SAG semi-analytic model combined with merger trees extracted from a standard Lambda Cold Dark Matter N-body simulation. The calibration is performed using a combination of observed galaxy properties as constraints, including the local stellar mass function and the black hole to bulge mass relation. We test the ability of the PSO algorithm to find the best set of free parameters of the model by comparing the results with those obtained using a MCMC exploration. Both methods find the same maximum likelihood region; however, the PSO method requires one order of magnitude fewer evaluations. This new approach allows a fast estimation of the best-fitting parameter set in multidimensional spaces, providing a practical tool to test the consequences of including other astrophysical processes in SAMs.

  12. Using learning analytics to evaluate a video-based lecture series.

    Science.gov (United States)

    Lau, K H Vincent; Farooque, Pue; Leydon, Gary; Schwartz, Michael L; Sadler, R Mark; Moeller, Jeremy J

    2018-01-01

    The video-based lecture (VBL), an important component of the flipped classroom (FC) and massive open online course (MOOC) approaches to medical education, has primarily been evaluated through direct learner feedback. Evaluation may be enhanced through learner analytics (LA) - analysis of quantitative audience usage data generated by video-sharing platforms. We applied LA to an experimental series of ten VBLs on electroencephalography (EEG) interpretation, uploaded to YouTube in the model of a publicly accessible MOOC. Trends in view count, total percentage of video viewed, and audience retention (AR) (the percentage of viewers watching at a time point compared to the initial total) were examined. The pattern of average AR decline was characterized using regression analysis, revealing a uniform linear decline in viewership for each video, with no evidence of an optimal VBL length. Segments with transient increases in AR corresponded to those focused on core concepts, indicative of content requiring more detailed evaluation. We propose a model for applying LA at four levels: global, series, video, and feedback. LA may be a useful tool in evaluating a VBL series. Our proposed model combines analytics data and learner self-report for comprehensive evaluation.
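
    The retention analysis this record describes - fitting a linear decline to an audience-retention curve and flagging segments with transient increases above the trend - can be sketched as follows. The retention values and the flagging threshold are hypothetical, chosen only to illustrate the approach.

```python
# Fit a linear decline to an audience-retention (AR) curve, then flag time
# points where retention transiently rises above the fitted trend
# (candidate "core concept" segments in the record's terminology).
def fit_line(t, ar):
    """Ordinary least squares fit ar = slope*t + intercept."""
    n = len(t)
    mt, ma = sum(t) / n, sum(ar) / n
    slope = (sum((x - mt) * (y - ma) for x, y in zip(t, ar))
             / sum((x - mt) ** 2 for x in t))
    return slope, ma - slope * mt

# Hypothetical AR curve (% of the initial viewers still watching), sampled
# each minute, with a transient bump around minutes 5-6.
minutes = list(range(10))
retention = [100, 95, 90, 85, 80, 84, 82, 70, 65, 60]

slope, intercept = fit_line(minutes, retention)
bumps = [t for t, ar in zip(minutes, retention)
         if ar - (slope * t + intercept) > 3.0]  # 3-point threshold is illustrative
```

A negative `slope` reproduces the uniform linear decline the study reports, and `bumps` marks the minutes whose content would warrant closer evaluation.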

  13. Evaluation of gamma dose effect on PIN photodiode using analytical model

    Science.gov (United States)

    Jafari, H.; Feghhi, S. A. H.; Boorboor, S.

    2018-03-01

    PIN silicon photodiodes are widely used in applications that may involve radiation environments, such as space missions, medical imaging and non-destructive testing. Radiation-induced damage in these devices degrades the photodiode parameters. In this work, we have used a new approach to evaluate gamma dose effects on a commercial PIN photodiode (BPX65) based on an analytical model. In this approach, the NIEL parameter has been calculated for gamma rays from a 60Co source using GEANT4. The radiation damage mechanisms have been considered by numerically solving the Poisson and continuity equations with the appropriate boundary conditions, parameters and physical models. Defects caused by radiation in silicon have been formulated in terms of the damage coefficient for the minority carriers' lifetime. The gamma-induced degradation parameters of the silicon PIN photodiode have been analyzed in detail, and the results were compared with experimental measurements as well as with the results of the ATLAS semiconductor simulator to verify and parameterize the analytical model calculations. The results showed reasonable agreement for the BPX65 silicon photodiode irradiated by a 60Co gamma source at total doses up to 5 kGy under different reverse voltages.

  14. Analytical steady-state solutions for water-limited cropping systems using saline irrigation water

    Science.gov (United States)

    Skaggs, T. H.; Anderson, R. G.; Corwin, D. L.; Suarez, D. L.

    2014-12-01

    Due to the diminishing availability of good quality water for irrigation, it is increasingly important that irrigation and salinity management tools be able to target submaximal crop yields and support the use of marginal quality waters. In this work, we present a steady-state irrigated systems modeling framework that accounts for reduced plant water uptake due to root zone salinity. Two explicit, closed-form analytical solutions for the root zone solute concentration profile are obtained, corresponding to two alternative functional forms of the uptake reduction function. The solutions express a general relationship between irrigation water salinity, irrigation rate, crop salt tolerance, crop transpiration, and (using standard approximations) crop yield. Example applications are illustrated, including the calculation of irrigation requirements for obtaining targeted submaximal yields, and the generation of crop-water production functions for varying irrigation waters, irrigation rates, and crops. Model predictions are shown to be mostly consistent with existing models and available experimental data. Yet the new solutions possess advantages over available alternatives, including: (i) the solutions were derived from a complete physical-mathematical description of the system, rather than based on an ad hoc formulation; (ii) the analytical solutions are explicit and can be evaluated without iterative techniques; (iii) the solutions permit consideration of two common functional forms of salinity induced reductions in crop water uptake, rather than being tied to one particular representation; and (iv) the utilized modeling framework is compatible with leading transient-state numerical models.

  15. Analytical description of thermodynamic properties of steam, water and the phase interface for use in CFD

    Science.gov (United States)

    Hrubý, Jan; Duška, Michal

    2014-03-01

    We present a system of analytical equations for computation of all thermodynamic properties of dry steam and liquid water (undesaturated, saturated and metastable supersaturated) and properties of the liquid-vapor phase interface. The form of the equations is such that it enables computation of all thermodynamic properties for independent variables directly related to the balanced quantities - total mass, liquid mass, energy, momenta. This makes it suitable for the solvers of fluid dynamics equations in the conservative form. Thermodynamic properties of dry steam and liquid water are formulated in terms of special thermodynamic potentials and all properties are obtained as analytical derivatives. For the surface tension, the IAPWS formula is used. The interfacial internal energy is derived from the surface tension and it is used in the energy balance. Unlike common models, the present one provides real (contrary to perfect gas approximation) properties of steam and water and reflects the energetic effects due to the surface tension. The equations are based on re-fitting the reference formulation IAPWS-95 and selected experimental data. The mathematical structure of the equations is optimized for fast computation.

  16. Analytical description of thermodynamic properties of steam, water and the phase interface for use in CFD

    Directory of Open Access Journals (Sweden)

    Hrubý Jan

    2014-03-01

    Full Text Available We present a system of analytical equations for computation of all thermodynamic properties of dry steam and liquid water (undesaturated, saturated and metastable supersaturated) and properties of the liquid-vapor phase interface. The form of the equations is such that it enables computation of all thermodynamic properties for independent variables directly related to the balanced quantities - total mass, liquid mass, energy, momenta. This makes it suitable for the solvers of fluid dynamics equations in the conservative form. Thermodynamic properties of dry steam and liquid water are formulated in terms of special thermodynamic potentials and all properties are obtained as analytical derivatives. For the surface tension, the IAPWS formula is used. The interfacial internal energy is derived from the surface tension and it is used in the energy balance. Unlike common models, the present one provides real (contrary to perfect gas approximation) properties of steam and water and reflects the energetic effects due to the surface tension. The equations are based on re-fitting the reference formulation IAPWS-95 and selected experimental data. The mathematical structure of the equations is optimized for fast computation.

  17. Promise and peril: Dissemination of findings from studies of drugs used in pregnancy and their association with birth defects.

    Science.gov (United States)

    Patrick, Stephen W; Cooper, William O

    2015-08-01

    When and how to publish birth defects research can be complex, especially in the context of drugs used in pregnancy. Such research frequently involves multiple stakeholders, including regulatory agencies. Researchers must balance the potential peril of an unnecessarily panicked populace against the benefit of protecting the public's health. We use a case presentation and contemporary literature to highlight the potential tradeoffs that researchers must consider. We highlight important considerations, including the public health impact, examining the likelihood of causality, understanding common considerations when using large data sources, the role of peer review, and working in partnership with regulatory agencies. We suggest that plans for analyses, dissemination and risk communication are best made a priori and not post hoc. Rigorous research evaluating the impact of drugs used in pregnancy, coupled with effective dissemination strategies, has the potential to improve outcomes for mothers and their infants for generations. © 2015 Wiley Periodicals, Inc.

  18. Evaluation of feeds for melt and dilute process using an analytical hierarchy process

    International Nuclear Information System (INIS)

    Krupa, J.F.

    2000-01-01

    Westinghouse Savannah River Company was requested to evaluate whether nuclear materials other than aluminum-clad spent nuclear fuel should be considered for treatment to prepare them for disposal in the melt and dilute facility as part of the Treatment and Storage Facility currently projected for construction in the L-Reactor process area. The decision analysis process used to develop this analysis considered many variables and uncertainties, including repository requirements that are not yet finalized. The Analytical Hierarchy Process using a ratings methodology was used to rank potential feed candidates for disposition through the Melt and Dilute facility proposed for disposition of Savannah River Site aluminum-clad spent nuclear fuel. Because of the scoping nature of this analysis, the expert team convened for this purpose concentrated on technical feasibility and potential cost impacts associated with using melt and dilute versus the current disposition option. This report documents results of the decision analysis.

  19. Use of heteroligand complexes in analytical chemistry of niobium and tantalum

    International Nuclear Information System (INIS)

    Elinson, S.V.

    1975-01-01

    A review of modern precise spectrophotometric and extraction-spectrophotometric methods for analyzing Nb and Ta is presented. These methods are based on the use of multi-ligand (ternary, quaternary, mixed) complexes of these elements with organic reagents. To develop extraction-photometric methods for quantitative analysis of Ta in the presence of Nb, use is made of complexes which these elements form with polyphenols, azo compounds, metallochromic indicators and heteropolyacids. The extraction-photometric methods are based on the use of multi-ligand complexes of the ionic association type. Owing to this, the volumetric and gravimetric methods can be used here. The research on multi-ligand complexes of Nb and Ta with a number of reagents provides a basis for developing precise analytical methods for determining Nb and Ta in different materials and in the presence of many other elements without their separation.

  20. Evaluation of feeds for melt and dilute process using an analytical hierarchy process

    Energy Technology Data Exchange (ETDEWEB)

    Krupa, J.F.

    2000-03-22

    Westinghouse Savannah River Company was requested to evaluate whether nuclear materials other than aluminum-clad spent nuclear fuel should be considered for treatment to prepare them for disposal in the melt and dilute facility as part of the Treatment and Storage Facility currently projected for construction in the L-Reactor process area. The decision analysis process used to develop this analysis considered many variables and uncertainties, including repository requirements that are not yet finalized. The Analytical Hierarchy Process using a ratings methodology was used to rank potential feed candidates for disposition through the Melt and Dilute facility proposed for disposition of Savannah River Site aluminum-clad spent nuclear fuel. Because of the scoping nature of this analysis, the expert team convened for this purpose concentrated on technical feasibility and potential cost impacts associated with using melt and dilute versus the current disposition option. This report documents results of the decision analysis.

  1. Functional-analytical capabilities of GIS technology in the study of water use risks

    International Nuclear Information System (INIS)

    Nevidimova, O G; Yankovich, E P; Yankovich, K S

    2015-01-01

    Regional security aspects of economic activities are of great importance for legal regulation in environmental management. This has become a critical issue due to climate change, especially in regions where severe climate conditions have a great impact on almost all types of natural resource use. A detailed analysis of the climate and hydrological situation in Tomsk Oblast with respect to water use risks was carried out. Based on techniques developed by the authors, an informational and analytical database was created on the ArcGIS software platform, combining statistical (quantitative) and spatial characteristics of natural hazards and socio-economic factors. This system was employed to perform areal zoning according to the degree of water use risk involved.

  2. Development and verification of an analytical algorithm to predict absorbed dose distributions in ocular proton therapy using Monte Carlo simulations

    International Nuclear Information System (INIS)

    Koch, Nicholas C; Newhauser, Wayne D

    2010-01-01

    Proton beam radiotherapy is an effective and non-invasive treatment for uveal melanoma. Recent research efforts have focused on improving the dosimetric accuracy of treatment planning and overcoming the present limitation of relative analytical dose calculations. Monte Carlo algorithms have been shown to accurately predict dose per monitor unit (D/MU) values, but this has yet to be shown for analytical algorithms dedicated to ocular proton therapy, which are typically less computationally expensive than Monte Carlo algorithms. The objective of this study was to determine if an analytical method could predict absolute dose distributions and D/MU values for a variety of treatment fields like those used in ocular proton therapy. To accomplish this objective, we used a previously validated Monte Carlo model of an ocular nozzle to develop an analytical algorithm to predict three-dimensional distributions of D/MU values from pristine Bragg peaks and therapeutically useful spread-out Bragg peaks (SOBPs). Results demonstrated generally good agreement between the analytical and Monte Carlo absolute dose calculations. While agreement in the proximal region decreased for beams with less penetrating Bragg peaks compared with the open-beam condition, the difference was shown to be largely attributable to edge-scattered protons. A method for including this effect in any future analytical algorithm was proposed. Comparisons of D/MU values showed typical agreement to within 0.5%. We conclude that analytical algorithms can be employed to accurately predict absolute proton dose distributions delivered by an ocular nozzle.

  3. Reducing Vibrio load in Artemia nauplii using antimicrobial photodynamic therapy: a promising strategy to reduce antibiotic application in shrimp larviculture

    Digital Repository Service at National Institute of Oceanography (India)

    Aparna, A.; Arshad, E.; Jasmin, C.; Pai, S.S.; BrightSingh, I.S.; Mohandas, A.; Anas, A.

    by treating the cells with Rose Bengal and photosensitizing for 30 min using a halogen lamp. This resulted in the death of more than 50% of the cells within the first 10 min of exposure and the 50% reduction in the cell wall integrity after 30 min could...

  4. Opting out of the peaceful use of nuclear power in Germany. A promising special approach in the European context?

    International Nuclear Information System (INIS)

    Buedenbender, Martin

    2009-01-01

    Nuclear power is in the focus of politics and public attention in Germany not only because of the federal elections. Again and again, voices are heard which doubt the decision taken in 2000 to opt out of the use of nuclear power. The change in parliamentary majority in favor of the alliance of CDU/CSU and FDP as a result of the elections on September 27 is leading to another review of the opt-out decision, as the three parties in their platforms expressed themselves in favor of extending nuclear power plant life. This makes a stocktaking exercise of all salient arguments imperative at the present juncture. The perspective in that case should not be restricted to national aspects but include especially the influence of the European dimension of the subject. Present political positions in the 27 EU countries indicate a renaissance of nuclear power. Numerous countries, such as Italy, Sweden, Poland or the United Kingdom, revoked their historic opt-out decisions, are using nuclear power for the first time, or want to expand greatly the nuclear share in their electricity generation mix. All 3 European agencies with clear majorities advocate the extensive use of nuclear power as a long-term component of the mix of energy resources. Germany, with its decision to opt out of the use of nuclear power, is part of a minority. Being part of a European electricity market which will grow together more and more closely up to complete integration, Germany will always be supplied electricity from nuclear sources in the long run. This will be true irrespective of nuclear power plants being operated in the country. So, shutting down German nuclear power plants will not achieve the goals of nuclear opponents but merely give rise to additional challenges in power technology in an effort to ensure Germany's electricity supply. For this reason, the new German federal government should revoke the decision to opt out of the peaceful use of nuclear power. (orig.)

  5. Dabigatran etexilate for nonvalvular atrial fibrillation in real practice and promises for its use to prevent stroke

    Directory of Open Access Journals (Sweden)

    A.V. Fonyakin

    2014-01-01

    Full Text Available The capabilities of antithrombotic therapy to prevent thromboembolic events in nonvalvular atrial fibrillation (AF) have been substantially extended by the design and clinical introduction of new oral anticoagulants, one of which is dabigatran. A wealth of world clinical experience with dabigatran has confirmed its efficacy and safety, provided that all recommendations for dosage regimens are followed. The universal properties of the drug give hope that the indications for its use will be extended and will not be confined to the prevention and treatment of venous and atrial thromboses and thromboembolisms. Whether dabigatran may be used in acute myocardial infarction and coronary stenting in the presence of nonvalvular AF, left ventricular thrombosis, and cardiomyopathies is being considered today.

  6. A hedonic response to an assay made on octogenarians using cosmetics that promise to donate comfort, attractiveness and youth

    Directory of Open Access Journals (Sweden)

    Lorenzo Martini

    2017-04-01

    Full Text Available We have been formulating cosmetics for thirty years and are deeply conscious that senescence is an irreversible physiological phenomenon and that the only ways to escape decline are prevention (by using drastic moisturizers from an early age) or camouflage, in the elderly or just before the male climacteric or menopause, as famous empresses did in the XVI and XVII centuries. The purpose of this odd study is to describe a revolutionary method to investigate the "hedonic response" that only septuagenarians and octogenarians, living in a luxury rest home, may display after having used natural cosmetics, with the clear advertising that these are only remedies apt to let them feel younger, attractive and active. The volunteers were invited to respond to a simple questionnaire about satisfaction, according to Peryam's hedonic test.

  7. The Flipped MOOC: Using Gamification and Learning Analytics in MOOC Design—A Conceptual Approach

    Directory of Open Access Journals (Sweden)

    Roland Klemke

    2018-02-01

    Full Text Available Recently, research has highlighted the potential of Massive Open Online Courses (MOOCs) for education, as well as their well-known drawbacks. Several studies state that the main limitations of MOOCs are participants' low completion and high dropout rates. MOOCs also suffer from a lack of participant engagement and personalization, and, although several formats and types of MOOCs are reported in the literature, the majority of them contain a considerable amount of content that is mainly presented in video format. This is in contrast to the results reported in other educational settings, where engagement and active participation are identified as success factors. We present the results of a study involving educational experts and learning scientists that gives new and interesting insights towards the conceptualization of a new design approach, the flipped MOOC, which applies the flipped classroom approach to MOOC design and makes use of gamification and learning analytics. We found important indications, applicable to the concept of a flipped MOOC, which entails turning MOOCs from mainly content-oriented delivery machines into personalized, interactive, and engaging learning environments. Our findings support the idea that MOOCs can be enriched by the orchestration of a flipped classroom approach in combination with the support of gamification and learning analytics.

  8. Research prioritization using the Analytic Hierarchy Process: basic methods. Volume 1

    International Nuclear Information System (INIS)

    Vesely, W.E.; Shafaghi, A.; Gary, I. Jr.; Rasmuson, D.M.

    1983-08-01

    This report describes a systematic approach for prioritizing research needs and research programs. The approach is formally called the Analytic Hierarchy Process, which was developed by T.L. Saaty and is described in several of his texts referenced in the report. The Analytic Hierarchy Process, or AHP for short, has been applied to a wide variety of prioritization problems and has a good record of success as documented in Saaty's texts. The report develops specific guidelines for constructing the hierarchy and for prioritizing the research programs. Specific examples are given to illustrate the steps in the AHP. As part of the work, a computer code has been developed and the use of the code is described. The code allows the prioritizations to be done in a codified and efficient manner; sensitivity and parametric studies can also be straightforwardly performed to gain a better understanding of the prioritization results. Finally, as an important part of the work, an approach is developed which utilizes probabilistic risk analyses (PRAs) to systematically identify and prioritize research needs and research programs. When utilized in an AHP framework, the PRAs performed to date provide a powerful information source for focusing research on those areas most impacting risk and risk uncertainty.
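
    As a minimal illustration of the AHP computation underlying this approach, the sketch below ranks three hypothetical research programs from a pairwise comparison matrix. The judgments are invented, and the row-geometric-mean approximation to Saaty's principal-eigenvector method is one common simplification, not the report's own code.

```python
import math

# AHP sketch: pairwise judgments on Saaty's 1-9 scale, priorities from
# normalized row geometric means, and a consistency check.
A = [                    # A[i][j]: how strongly program i is preferred over j
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
n = len(A)

# Row geometric means approximate the principal eigenvector of A.
geo = [math.prod(row) ** (1.0 / n) for row in A]
priorities = [g / sum(geo) for g in geo]        # weights, normalized to sum to 1

# Estimate the principal eigenvalue lambda_max for the consistency check.
Aw = [sum(A[i][j] * priorities[j] for j in range(n)) for i in range(n)]
lam_max = sum(Aw[i] / priorities[i] for i in range(n)) / n
CI = (lam_max - n) / (n - 1)                    # consistency index
CR = CI / 0.58                                  # Saaty's random index for n = 3
```

A consistency ratio `CR` below 0.1 is the conventional threshold for accepting the judgments; in the report's setting the hierarchy would have several criteria levels, each with its own comparison matrix, combined by weighted aggregation.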

  9. An analytical and numerical study of solar chimney use for room natural ventilation

    Energy Technology Data Exchange (ETDEWEB)

    Bassiouny, Ramadan; Koura, Nader S.A. [Department of Mechanical Power Engineering and Energy, Minia University, Minia 61111 (Egypt)

    2008-07-01

    The solar chimney concept used for improving room natural ventilation was studied analytically and numerically. The study considered some geometrical parameters, such as the chimney inlet size and width, which are believed to have a significant effect on space ventilation. The numerical analysis was intended to predict the flow pattern in the room as well as in the chimney, which would help in optimizing the design parameters. The results were compared with available published experimental and theoretical data. There was an acceptable trend match between the present analytical results and the published data for the room air changes per hour, ACH. Further, it was noticed that the chimney width has a more significant effect on ACH than the chimney inlet size. The results showed that the average absorber temperature could be correlated to the intensity as T_w = 3.51 I^0.461, with an acceptable range of approximation error. In addition, the average air exit velocity was found to vary with the intensity as v_ex = 0.013 I^0.4. (author)
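
    The two reported correlations can be evaluated directly. A minimal sketch; the intensity value and the units (presumably I in W/m^2, T_w in degrees C, v_ex in m/s) are assumptions, since the abstract does not state them.

```python
# Correlations reported in the abstract: absorber temperature
# T_w = 3.51 * I**0.461 and average air exit velocity v_ex = 0.013 * I**0.4,
# with I the solar intensity (units assumed, not stated in the abstract).

def absorber_temperature(intensity):
    return 3.51 * intensity ** 0.461

def exit_velocity(intensity):
    return 0.013 * intensity ** 0.4

# Example evaluation at an assumed midday intensity of 700 W/m^2.
t = absorber_temperature(700.0)
v = exit_velocity(700.0)
```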

  10. MAXIMIZING SOCIAL VALUE IN THE HOTEL ONLINE ENVIRONMENT USING AN ANALYTIC HIERARCHY PROCESS

    Directory of Open Access Journals (Sweden)

    Carmen Păunescu

    2018-03-01

    Full Text Available The paper analyses the possibilities that hoteliers have to create and maximize the social value of their online platforms, in terms of their functionality and usage, in order to improve sales and increase hotels’ performance. It also discusses the opportunities that hotel managers can take to improve the hotel online decision-making strategy to convert visitors into actual customers more effectively. Although social value creation of online platforms has been well researched in the specialized literature, recent research has not examined the ways online social value can be maximized and put into effective commercial use. The paper reviews the dimensions and characteristics of the hotel online environment by integrating literature analysis and field research practices. It employs the analytic hierarchy process method to analyse key elements of the hotel online environment that can serve as a focal point for value creation. The literature review and field research conducted pinpoint three possibilities for creating online social value: (a) building online trust, (b) ensuring high quality of the online service, and (c) providing an effective online communication experience. The results give a deeper understanding of the potential areas of the hotel online environment where social value can be obtained. They prove the applicability of the analytic hierarchy process method for the evaluation and selection of strategies for online social value creation. At the same time, the paper provides new valuable insights to hoteliers, which might support their decisions to improve the business by proactively incorporating strategies for online social value maximization.

  11. Informative gene selection using Adaptive Analytic Hierarchy Process (A2HP)

    Directory of Open Access Journals (Sweden)

    Abhishek Bhola

    2017-12-01

    Full Text Available Gene expression datasets derived from microarray experiments are marked by a large number of genes and contain the gene expression values at different sample conditions/time-points. Selection of informative genes from these large datasets is an issue of major concern for various researchers and biologists. In this study, we propose a gene selection and dimensionality reduction method called Adaptive Analytic Hierarchy Process (A2HP). The traditional analytic hierarchy process is a multiple-criteria decision analysis method whose result depends upon expert knowledge or decision makers; it is mainly used to solve decision problems in different fields. A2HP, on the other hand, is a fused method that combines the outcomes of five individual gene-ranking methods: t-test, chi-square variance test, z-test, Wilcoxon test, and signal-to-noise ratio (SNR). First, the gene expression dataset is preprocessed, and the reduced set of genes obtained is fed as input to A2HP. A2HP utilizes both quantitative and qualitative factors to select the informative genes. Results demonstrate that A2HP selects a more efficient number of genes than the individual gene selection methods. The percentage reduction in the number of genes and the time complexity are taken as the performance measures for the proposed method, and it is shown that A2HP outperforms the individual gene selection methods.
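
    The fusion idea can be sketched compactly. This is a hedged illustration only: the per-gene scores below are invented, and a plain average of ranks stands in for A2HP's AHP-based weighting of the individual rankers.

```python
# Rank-fusion sketch: several per-gene scores (hypothetical t-test, SNR,
# and chi-square values) are converted to ranks and combined. The actual
# A2HP method weights the rankers via AHP; a plain average rank is used here.

def rank(scores):
    """Rank genes by descending score (rank 1 = most informative)."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    r = [0] * len(scores)
    for pos, idx in enumerate(order, start=1):
        r[idx] = pos
    return r

def fuse(rankings):
    """Average the ranks produced by each individual method."""
    n = len(rankings[0])
    return [sum(r[i] for r in rankings) / len(rankings) for i in range(n)]

# Four genes scored by three hypothetical methods.
t_scores   = [2.1, 0.3, 1.7, 0.9]
snr_scores = [1.8, 0.2, 1.9, 0.5]
chi_scores = [5.0, 1.1, 4.2, 2.0]

fused = fuse([rank(t_scores), rank(snr_scores), rank(chi_scores)])
best = min(range(len(fused)), key=lambda i: fused[i])  # lowest fused rank wins
```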

  12. Power Distribution Analysis For Electrical Usage In Province Area Using OLAP (Online Analytical Processing)

    Directory of Open Access Journals (Sweden)

    Samsinar Riza

    2018-01-01

    Full Text Available The distribution network is the part of the power grid closest to the customers of electric service providers such as PT PLN. The dispatching center of a power grid company is also the data center of the power grid, where a great amount of operating information is gathered. The valuable information contained in these data means a lot for power grid operating management. The data warehousing technique of online analytical processing (OLAP) has been used to manage and analyze this great volume of data. The specific outputs of the online analytical information system resulting from data warehouse processing with OLAP are chart and query reporting. The chart reporting consists of the load distribution chart over time, the distribution chart by area, the substation region chart, and the electric load usage chart. The results of the OLAP process show the development of the electric load distribution, provide analysis of electric power consumption loads, and offer an alternative way of presenting information related to peak load.
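
    The OLAP reporting described here boils down to grouped aggregation over load measurements. A minimal stdlib sketch with sqlite3; the table, column names, and figures are illustrative, not from the paper.

```python
# Roll-up of electric load by region, as an OLAP chart report would show.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE load_data (region TEXT, hour INTEGER, load_mw REAL)")
conn.executemany(
    "INSERT INTO load_data VALUES (?, ?, ?)",
    [("North", 9, 120.0), ("North", 18, 210.0),
     ("South", 9, 95.0),  ("South", 18, 170.0)],
)

# Aggregate total and peak load per region (a one-dimension roll-up).
rows = conn.execute(
    "SELECT region, SUM(load_mw), MAX(load_mw) "
    "FROM load_data GROUP BY region ORDER BY region"
).fetchall()
```

    A real OLAP cube extends the same GROUP BY idea to multiple dimensions (time, area, substation) at once.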

  14. Using i2b2 to Bootstrap Rural Health Analytics and Learning Networks.

    Science.gov (United States)

    Harris, Daniel R; Baus, Adam D; Harper, Tamela J; Jarrett, Traci D; Pollard, Cecil R; Talbert, Jeffery C

    2016-08-01

    We demonstrate that the open-source i2b2 (Informatics for Integrating Biology and the Bedside) data model can be used to bootstrap rural health analytics and learning networks. These networks promote communication and research initiatives by providing the infrastructure necessary for sharing data and insights across a group of healthcare and research partners. Data integration remains a crucial challenge in connecting rural healthcare sites with a common data sharing and learning network due to the lack of interoperability and standards within electronic health records. The i2b2 data model acts as a point of convergence for disparate data from multiple healthcare sites. A consistent and natural data model for healthcare data is essential for overcoming integration issues, but challenges such as those caused by weak data standardization must still be addressed. We describe our experience in the context of building the West Virginia/Kentucky Health Analytics and Learning Network, a collaborative, multi-state effort connecting rural healthcare sites.

  15. Use of evidence in a categorization task: analytic and holistic processing modes.

    Science.gov (United States)

    Greco, Alberto; Moretti, Stefania

    2017-11-01

    Category learning performance can be influenced by many contextual factors, but the effects of these factors are not the same for all learners. The present study suggests that these differences can be due to the different ways evidence is used, according to two basic modalities of processing information: analytic and holistic. In order to test the impact of the information provided, an inductive rule-based task was designed in which feature salience and comparison informativeness between examples of two categories were manipulated during the learning phases, by introducing and progressively reducing some perceptual biases. To gather data on processing modalities, we devised the Active Feature Composition task, a production task that does not require classifying new items but rather reproducing them by combining features. At the end, an explicit rating task was performed, which entailed assessing the accuracy of a set of possible categorization rules. A combined analysis of the data collected with these two different tests enabled profiling participants with regard to the kind of processing modality, the structure of representations, and the quality of categorial judgments. Results showed that although the information provided was the same for all participants, those who adopted analytic processing exploited the evidence better and performed more accurately, whereas with holistic processing categorization is perfectly possible but inaccurate. Finally, the cognitive implications of the proposed procedure, with regard to the processes and representations involved, are discussed.

  16. An analytic solution of projectile motion with the quadratic resistance law using the homotopy analysis method

    International Nuclear Information System (INIS)

    Yabushita, Kazuki; Yamashita, Mariko; Tsuboi, Kazuhiro

    2007-01-01

    We consider the problem of two-dimensional projectile motion in which the resistance acting on an object moving in air is proportional to the square of the velocity of the object (quadratic resistance law). It is well known that the quadratic resistance law is valid in the Reynolds number range 1 × 10^3 to 2 × 10^5 (for instance, for a sphere) in practical situations, such as throwing a ball. The equations of motion for this case have been considered unsolvable for a general projectile angle, although some solutions have been obtained for small projectile angles using perturbation techniques. To obtain a general analytic solution, we apply Liao's homotopy analysis method to this problem. The homotopy analysis method, unlike a perturbation technique, can be applied to a problem that does not include small parameters. We apply the homotopy analysis method not only to the governing differential equations, but also to an algebraic equation for the velocity vector to extend the radius of convergence. Ultimately, we obtain the analytic solution to this problem and investigate the validity of the solution.
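
    The governing equations with the quadratic resistance law are dvx/dt = -c|v|vx and dvy/dt = -g - c|v|vy. The paper solves them analytically via the homotopy analysis method; as a hedged companion, here is a plain numerical reference solution. The drag constant c and the launch state are invented for illustration.

```python
# Forward-Euler reference solution of projectile motion with quadratic drag.
import math

def simulate(v0, angle_deg, c=0.02, g=9.81, dt=1e-3):
    """Integrate until the projectile returns to launch height; return range."""
    a = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(a), v0 * math.sin(a)
    while True:
        speed = math.hypot(vx, vy)
        ax = -c * speed * vx          # drag opposes motion, scales with |v|*v
        ay = -g - c * speed * vy
        x += vx * dt
        y += vy * dt
        vx += ax * dt
        vy += ay * dt
        if y <= 0.0 and vy < 0.0:     # back at launch height, descending
            return x

drag_range = simulate(30.0, 45.0)
# Classical no-drag range R = v0^2 * sin(2*theta) / g for comparison.
vacuum_range = 30.0**2 * math.sin(math.radians(90.0)) / 9.81
```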

  17. Reducing Vibrio load in Artemia nauplii using antimicrobial photodynamic therapy: a promising strategy to reduce antibiotic application in shrimp larviculture

    Science.gov (United States)

    Asok, Aparna; Arshad, Esha; Jasmin, C.; Somnath Pai, S.; Bright Singh, I. S.; Mohandas, A.; Anas, Abdulaziz

    2012-01-01

    Summary We propose antimicrobial photodynamic therapy (aPDT) as an alternative strategy to reduce the use of antibiotics in shrimp larviculture systems. The growth of a multiple-antibiotic-resistant Vibrio harveyi strain was effectively controlled by treating the cells with Rose Bengal and photosensitizing for 30 min using a halogen lamp. This resulted in the death of > 50% of the cells within the first 10 min of exposure, and the 50% reduction in cell wall integrity after 30 min could be attributed to the destruction of the outer membrane protein of V. harveyi by reactive oxygen intermediates produced during the photosensitization. Further, mesocosm experiments with V. harveyi and Artemia nauplii demonstrated that in 30 min the aPDT could kill 78.9% and 91.2% of the heterotrophic bacterial and Vibrio populations, respectively. In conclusion, the study demonstrated that aPDT, with its rapid action and as yet unreported possibilities for resistance development, could be a propitious strategy to reduce the use of antibiotics in shrimp larviculture systems and thereby avoid their hazardous effects on human health and the ecosystem at large. PMID:21951316

  18. Development and evaluation of analytical techniques for total chlorine in used oils and oil fuels

    International Nuclear Information System (INIS)

    Gaskill, A. Jr.; Estes, E.D.; Hardison, D.L.; Friedman, P.H.

    1990-01-01

    A current EPA regulation prohibits the sale of used oils and oil fuels for burning in nonindustrial boilers. This paper discusses how analytical techniques for determining total chlorine were evaluated to provide regulatory agencies and the regulated community with appropriate chlorine test methods. The techniques evaluated included oxygen bomb combustion followed by chemical titration or ion chromatography, instrumental microcoulometry, field test kits, an instrumental furnace/specific ion electrode determinator, a device based on the Beilstein reaction, and x-ray fluorescence spectrometry. These techniques were subjected to interlaboratory testing to estimate their precision, accuracy, and sensitivity. Virgin and used crankcase oils, hydraulic and metalworking oils, oil fuels, and oil fuel blends with used oils were tested. The bomb techniques, one of the test kits, microcoulometry, and all but one x-ray analyzer were found to be suitable for this application. The chlorine furnace and the Beilstein device were found to be inapplicable at the levels of interest.

  19. Analytic solution for American strangle options using Laplace-Carson transforms

    Science.gov (United States)

    Kang, Myungjoo; Jeon, Junkee; Han, Heejae; Lee, Somin

    2017-06-01

    A strangle is an important strategy for options trading when the trader believes there will be a large movement in the underlying asset but is uncertain which way the movement will be. In this paper, we derive an analytic formula for the price of American strangle options. American strangle options can be mathematically formulated as free boundary problems involving two early exercise boundaries. By using the Laplace-Carson transform (LCT), we can derive the nonlinear system of equations satisfied by the transformed values of the two free boundaries. We then solve this nonlinear system using Newton's method and finally obtain the free boundaries and option values using numerical Laplace inversion techniques. We also derive the Greeks for American strangle options as well as the value of perpetual American strangle options. Furthermore, we present various graphs of the free boundaries and option values as the parameters change.
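
    The transformed free-boundary conditions form a small nonlinear system F(x) = 0 solved by Newton's method. A generic two-dimensional Newton sketch with a finite-difference Jacobian; the test system below is illustrative, not the option-pricing equations themselves.

```python
# Newton's method for a 2-D nonlinear system, with a numerical Jacobian.

def newton2(f, x0, tol=1e-10, h=1e-7, max_iter=50):
    x, y = x0
    for _ in range(max_iter):
        f1, f2 = f(x, y)
        if abs(f1) < tol and abs(f2) < tol:
            return x, y
        # Finite-difference Jacobian entries J = [[a, b], [c, d]].
        a = (f(x + h, y)[0] - f1) / h
        b = (f(x, y + h)[0] - f1) / h
        c = (f(x + h, y)[1] - f2) / h
        d = (f(x, y + h)[1] - f2) / h
        det = a * d - b * c
        # Solve J * delta = F via Cramer's rule and step x -= delta.
        x -= (f1 * d - f2 * b) / det
        y -= (a * f2 - c * f1) / det
    return x, y

# Example system: x^2 + y^2 = 4 and x*y = 1, started near one root.
sol_x, sol_y = newton2(lambda x, y: (x * x + y * y - 4.0, x * y - 1.0),
                       (2.0, 0.5))
```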

  20. Acid in perchloroethylene scrubber solutions used in HTGR fuel preparation processes. Analytical chemistry studies

    International Nuclear Information System (INIS)

    Lee, D.A.

    1979-02-01

    Acids and corrosion products in used perchloroethylene scrubber solutions collected from HTGR fuel preparation processes have been analyzed by several analytical methods to determine the source and possible remedy of the corrosion caused by these solutions. Hydrochloric acid was found to be concentrated on the carbon particles suspended in the perchloroethylene. Filtration of carbon from the scrubber solutions removed the source of acid corrosion in the process equipment. Corrosion products chemisorbed on the carbon particles were identified. Filtered perchloroethylene from used scrubber solutions contained practically no acid. It is recommended that carbon particles be separated from the scrubber solutions immediately after the scrubbing process to remove the source of acid, and that an inhibitor be used to prevent the hydrolysis of perchloroethylene and the formation of acids.

  1. EVALUATION OF SUPPLIERS USING THE ANALYTIC HIERARCHY PROCESS (AHP): AN APPLICATION IN A TEXTILE FIRM

    Directory of Open Access Journals (Sweden)

    AHMET ÖZTÜRK

    2013-06-01

    Full Text Available Supplier selection is a complex multi-criteria problem which includes both qualitative and quantitative criteria. In order to select the suppliers, it is necessary to make a tradeoff between these criteria, some of which may conflict. Different approaches have been suggested in the literature to solve the supplier selection problem. Saaty's analytic hierarchy process (AHP) is an especially useful approach for this problem because of its inherent capability to handle qualitative and quantitative criteria. The main purpose of this study is to solve the supplier selection problem of a textile firm by using AHP. The proposed model consists of a hierarchical network of connections among five alternatives, seven main criteria, and thirteen sub-criteria. Twenty-one pairwise comparison matrices were obtained using a focus group methodology. The results are presented and sensitivity analyses are performed. Finally, some suggestions and important clues for management are presented.

  2. Analytical synthesis for four-bar mechanisms used in a pseudo-equatorial solar tracker

    Directory of Open Access Journals (Sweden)

    Juan Manuel González Mendoza

    2013-09-01

    Full Text Available Photovoltaic energy production systems generate electricity without emitting pollutants into the atmosphere and do so from a free, unlimited resource. The highest level of energy conversion from photovoltaic panels can be obtained by placing them perpendicular to the sun’s rays falling on their surface; this is done by installing solar tracking systems. This work proposes the use of two four-bar mechanisms as the driving force for a solar tracker, designed by analytical synthesis. This procedure is aimed at optimising the transmission angle, increasing the mechanical advantage and decreasing the driving torque. A mathematical model was used to verify the synthesis results, and a prototype of the solar tracker was built.

  3. Analytical and experimental investigation of the coaxial plasma gun for use as a particle accelerator

    Science.gov (United States)

    Shriver, E. L.

    1972-01-01

    The coaxial plasma accelerator for use as a projectile accelerator is discussed. The accelerator is described physically and analytically by solving the circuit equations and by solving for the magnetic pressures formed by the j × B forces on the plasma. It is shown that the plasma density must be increased if the accelerator is to be used as a projectile accelerator. Three different approaches to increasing the plasma density are discussed. When a magnetic field containment scheme was used to increase the plasma density, glass beads of 0.66 millimeter diameter were accelerated to velocities of 7 to 8 kilometers per second. Glass beads of smaller diameter were accelerated to more than twice this velocity.

  4. SELECTION OF BUSINESS STRATEGIES FOR QUALITY IMPROVEMENT USING FUZZY ANALYTICAL HIERARCHY PROCESS

    Directory of Open Access Journals (Sweden)

    Prasun Das

    2010-12-01

    Full Text Available Fuzzy linguistic concepts are often used to enhance the traditional analytic hierarchy process (AHP) in capturing the fuzziness and subjectiveness of decision makers' judgments. In this paper, a fuzzy AHP methodology is adopted for the selection of strategies for business improvement in an Indian industry as a decision-making problem. Owing to their simplicity and effectiveness, triangular fuzzy numbers are adopted as a reference to indicate the influence strength of each element in the hierarchy structure. The confidence and optimism levels of multiple decision makers are captured using α-cut based fuzzy number methods. This fuzzy set theory based multi-attribute decision-making method is found to be quite useful and effective in an industrial environment.
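
    The α-cut of a triangular fuzzy number (l, m, u) is the crisp interval [l + α(m - l), u - α(u - m)]; raising α from 0 to 1 models rising decision-maker confidence. A minimal sketch with invented numbers:

```python
# Alpha-cut of a triangular fuzzy number (l, m, u).

def alpha_cut(tfn, alpha):
    l, m, u = tfn
    return (l + alpha * (m - l), u - alpha * (u - m))

# A fuzzy judgment of "moderately more important", roughly (2, 3, 4)
# on Saaty's scale (the value is illustrative).
interval_low  = alpha_cut((2.0, 3.0, 4.0), 0.0)   # alpha = 0: full uncertainty
interval_high = alpha_cut((2.0, 3.0, 4.0), 1.0)   # alpha = 1: crisp judgment
```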

  5. Evaluation of feeds for melt and dilute process using an Analytical Hierarchy Process

    International Nuclear Information System (INIS)

    Krupa, J.F.

    2000-01-01

    WSRC was requested to evaluate whether nuclear materials other than aluminum-clad spent nuclear fuel should be considered for treatment in the melt and dilute facility, to prepare them for disposal, as part of the Treatment and Storage Facility (TSF) currently projected for construction in the L-Reactor process area. The Analytical Hierarchy Process with a ratings methodology was used to rank potential feed candidates for disposition through the Melt and Dilute facility proposed for disposition of Savannah River Site aluminum-clad spent nuclear fuel. Because of the scoping nature of this analysis, the expert team convened for this purpose concentrated on technical feasibility and potential cost impacts associated with using melt and dilute versus the current disposition option.

  6. Forecasting Significant Societal Events Using The Embers Streaming Predictive Analytics System.

    Science.gov (United States)

    Doyle, Andy; Katz, Graham; Summers, Kristen; Ackermann, Chris; Zavorin, Ilya; Lim, Zunsik; Muthiah, Sathappan; Butler, Patrick; Self, Nathan; Zhao, Liang; Lu, Chang-Tien; Khandpur, Rupinder Paul; Fayed, Youssef; Ramakrishnan, Naren

    2014-12-01

    Developed under the Intelligence Advanced Research Projects Activity (IARPA) Open Source Indicators program, Early Model Based Event Recognition using Surrogates (EMBERS) is a large-scale big data analytics system for forecasting significant societal events, such as civil unrest events, on the basis of continuous, automated analysis of large volumes of publicly available data. It has been operational since November 2012 and delivers approximately 50 predictions each day for countries of Latin America. EMBERS is built on a streaming, scalable, loosely coupled, shared-nothing architecture using ZeroMQ as its messaging backbone and JSON as its wire data format. It is deployed on Amazon Web Services using an entirely automated deployment process. We describe the architecture of the system, some of the design tradeoffs encountered during development, and specifics of the machine learning models underlying EMBERS. We also present a detailed prospective evaluation of EMBERS in forecasting significant societal events over the past 2 years.

  7. Comparison of Heat Insulations for Cryogenic Tankers Using Analytical and Numerical Analysis

    Directory of Open Access Journals (Sweden)

    Ramón Miralbés Buil

    2013-01-01

    Full Text Available This paper presents a methodology for the design of the heat insulation used in cryogenic tankers. This insulation usually comprises a combination of vacuum and perlite or vacuum and superinsulation. Concretely, a methodology to obtain the temperatures, heat fluxes, and so forth using analytical tools has been established, based on the equivalence with an electric circuit, alongside numerical tools using finite elements. The results obtained with both methods are then compared. In addition, the influence of the outer finish of the external part, due to the effect of solar radiation, is analyzed, and the equations to determine the maximum time available to transport the cryogenic liquid have been established. All these aspects are applied to a specific commercial cryogenic vehicle.
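
    The electric-circuit analogy treats each insulation layer as a thermal resistance; for a cylindrical shell, R = ln(r_out/r_in) / (2·pi·k·L), and the heat leak is Q = ΔT / ΣR. A hedged sketch: the layer radii, conductivities, and temperatures below are invented, not the paper's data.

```python
# Series thermal-resistance network for a layered cylindrical tank wall.
import math

def cylinder_resistance(r_in, r_out, k, length):
    """Conductive resistance of a cylindrical shell, K/W."""
    return math.log(r_out / r_in) / (2.0 * math.pi * k * length)

length = 10.0                      # tank length, m (assumed)
layers = [                         # (r_in m, r_out m, k in W/(m*K)) -- assumed
    (1.00, 1.01, 16.0),            # inner steel shell
    (1.01, 1.11, 0.001),           # vacuum + perlite (effective conductivity)
    (1.11, 1.12, 16.0),            # outer steel shell
]

r_total = sum(cylinder_resistance(ri, ro, k, length) for ri, ro, k in layers)
q = (300.0 - 77.0) / r_total       # W: ambient 300 K down to LN2 at 77 K
```

    As in an electric circuit, the largest resistance (the evacuated perlite layer) dominates, so the steel shells barely affect the heat leak.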

  8. On the selection of optimized carbon nano tube synthesis method using analytic hierarchy process

    International Nuclear Information System (INIS)

    Besharati, M. K.; Afaghi Khatibi, A.; Akbari, M.

    2008-01-01

    Evidence from the early and late industrializers shows that technology, as the commercial application of scientific knowledge, has been a major driver of industrial and economic development. International technology transfer is now recognized as having played an important role in the development of the most successful late industrializers of the second half of the twentieth century. Our society stands to be significantly influenced by carbon nanotubes, shaped by nanotube applications in every aspect, just as silicon-based technology still shapes society today. Nanotubes can be formed in various structures using several different processing methods. In this paper, the synthesis methods used to produce nanotubes on industrial or laboratory scales are discussed and compared. A technical feasibility study is conducted using a multi-criteria decision-making model, namely the Analytic Hierarchy Process. The article ends with a discussion of selecting the best method for transferring carbon nanotube technology to Iran.

  9. SPET reconstruction with a non-uniform attenuation coefficient using an analytical regularizing iterative method

    International Nuclear Information System (INIS)

    Soussaline, F.; LeCoq, C.; Raynaud, C.; Kellershohn, C.

    1982-09-01

    The aim of this study is to evaluate the potential of the RIM technique when used in brain studies. The analytical Regularizing Iterative Method (RIM) is designed to provide fast and accurate reconstruction of tomographic images when non-uniform attenuation is to be accounted for. As indicated by phantom studies, this method improves the contrast and the signal-to-noise ratio compared with those obtained with the FBP (Filtered Back Projection) technique. Preliminary results obtained in brain studies using AMPI-123 (isopropyl-amphetamine I-123) are very encouraging in terms of quantitative regional cellular activity. However, the clinical usefulness of this mathematically accurate reconstruction procedure remains to be demonstrated in our institution, by comparing quantitative data in heart or liver studies where control values can be obtained.

  10. Factors Affecting the Location of Road Emergency Bases in Iran Using Analytical Hierarchy Process (AHP).

    Science.gov (United States)

    Bahadori, Mohammadkarim; Hajebrahimi, Ahmad; Alimohammadzadeh, Khalil; Ravangard, Ramin; Hosseini, Seyed Mojtaba

    2017-10-01

    To identify and prioritize the factors affecting the location of road emergency bases in Iran using the Analytical Hierarchy Process (AHP). This was a mixed-methods (quantitative-qualitative) study conducted in 2016. The participants included professionals and experts in the field of pre-hospital and road emergency services working in the Health Deputy of the Iran Ministry of Health and Medical Education, selected using a purposive sampling method. First, the factors affecting the location of road emergency bases in Iran were identified through a literature review and interviews with the experts. Then, the identified factors were scored and prioritized from the studied professionals' and experts' viewpoints using the analytic hierarchy process (AHP) technique and its related pairwise questionnaire. The collected data were analyzed using MAXQDA 10.0 software for the answers given to the open question and Expert Choice 10.0 software to determine the weights and priorities of the identified factors. The results showed that eight factors were effective in locating the road emergency bases in Iran from the viewpoints of the studied professionals and experts: respectively, distance from the next base; region population; topography and geographical situation of the region; volume of road traffic; existence of amenities such as water, electricity, and gas, and proximity to a village; accident-prone sites; university ownership of the base site; and proximity to a toll-house. Among these eight factors, "distance from the next base" and "region population" were the most important, with great differences from the other factors.

  11. Evaluation of drought stress tolerance in promising lines of chickpea (Cicer arietinum L. using drought resistance indices

    Directory of Open Access Journals (Sweden)

    Akbar Shabani

    2018-06-01

    Full Text Available Introduction Chickpea (Cicer arietinum L.) is an annual grain legume or “pulse crop” that is the second most important legume after soybean in the world and is cultivated in 60 countries. Legumes, especially chickpea, are among the most important stress-tolerant crops in arid and semi-arid countries of western Asia such as Iran. Chickpea can grow in poor soils and under unfavorable environmental conditions. Drought is an important factor influencing chickpea production and quality. Since the area of cultivation is largely under dryland conditions, the aim of research is to obtain tolerant genotypes. The objective of the current study was to evaluate the genetic variation and drought resistance of advanced chickpea genotypes. Materials and methods To investigate genetic variation and drought resistance, 64 advanced genotypes were evaluated in a simple lattice design (LD) with two replications under normal and drought stress conditions at the Dryland Agricultural Research Institute in Kermanshah during the 2013-2014 cropping season. Plots consisted of four rows 4 m in length, 30 cm apart. Seeds were sown in rows at 10 cm spacing, and the seeding rate was 33 seeds per m2 for all plots. At the maturity stage, after removing border effects from each plot, grain yield was measured. Statistical analysis was performed using the SAS, SPSS and STATISTICA packages. Drought resistance indices such as mean productivity (MP), geometric mean productivity (GMP), harmonic mean (HAM), stress tolerance index (STI), stress susceptibility index (SSI), yield index (YI), K1 and K2 were calculated based on yield under both conditions. We also used the stress tolerance score (STS) method to select genotypes according to all indices. Results and discussion The study of correlations between Yp, Ys and the drought resistance indices showed that Yp and Ys were positively and significantly correlated with MP, GMP, STI, YI, HAM, K1 and K2; thus these indices were the most suitable drought tolerance criteria for screening chickpea.
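
    The indices named above have standard definitions in the drought-tolerance literature, which can be computed directly from yield under normal (Yp) and stress (Ys) conditions. A hedged sketch: these are the commonly cited formulas, not necessarily the exact variants used in the paper, and the yields are invented.

```python
# Common drought-resistance indices from yield under normal (yp) and
# stress (ys) conditions, against trial means.
import math

def indices(yp, ys, mean_yp, mean_ys):
    si = 1.0 - mean_ys / mean_yp          # stress intensity for SSI
    return {
        "MP":  (yp + ys) / 2.0,           # mean productivity
        "GMP": math.sqrt(yp * ys),        # geometric mean productivity
        "HAM": 2.0 * yp * ys / (yp + ys), # harmonic mean
        "STI": yp * ys / mean_yp ** 2,    # stress tolerance index
        "SSI": (1.0 - ys / yp) / si,      # stress susceptibility index
        "YI":  ys / mean_ys,              # yield index
    }

# One hypothetical genotype against trial means (kg/ha, illustrative).
vals = indices(yp=1800.0, ys=1200.0, mean_yp=1500.0, mean_ys=1000.0)
```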

  12. The use of intraoperative computed tomography navigation in pituitary surgery promises a better intraoperative orientation in special cases

    Directory of Open Access Journals (Sweden)

    Stefan Linsler

    2016-01-01

    Full Text Available Objective: The safety of endoscopic skull base surgery can be enhanced by accurate navigation in preoperative computed tomography (CT) and magnetic resonance imaging (MRI). Here, we report our initial experience of real-time intraoperative CT-guided navigation surgery for pituitary tumors in childhood. Materials and Methods: We report the case of a 15-year-old girl with a huge growth hormone-secreting pituitary adenoma with supra- and perisellar extension; furthermore, the skull base was infiltrated. In this case, we performed an endonasal transsphenoidal approach for debulking the adenoma and for chiasma decompression. We used MRI neuronavigation (Medtronic Stealth Air System), which was registered via an intraoperative CT scan (Siemens CT Somatom). Preexisting MRI studies (navigation protocol) were fused with the intraoperative CT scans to enable three-dimensional navigation based on MR and CT imaging data. Intraoperatively, we performed a further CT scan for resection control. Results: The intraoperative accuracy of the neuronavigation was excellent, with an adjustment of <1 mm. The navigation was very helpful for orientation on the destroyed skull base in the sphenoid sinus. After opening the sellar region and debulking the tumor, we performed a CT scan for resection control, because the extent of resection was not reliably evaluable in this huge infiltrating adenoma. Thereby, we were able to demonstrate sufficient decompression of the chiasma and complete resection of the medial part of the adenoma in the intraoperative CT images. Conclusions: The use of intraoperative CT/MRI-guided neuronavigation for transsphenoidal surgery is a time-effective, safe, and technically beneficial technique for special cases.

  13. Search for promising compositions for developing new multiphase casting alloys based on Al-Cu-Mg matrix using thermodynamic calculations and mathematic simulation

    Science.gov (United States)

    Zolotorevskii, V. S.; Pozdnyakov, A. V.; Churyumov, A. Yu.

    2012-11-01

    A calculation-experimental study was carried out to improve the approach of searching for new alloying systems for the development of new casting alloys, using mathematical simulation methods in combination with thermodynamic calculations. The results show the high effectiveness of the applied methods. The real possibility of selecting promising compositions with the required set of casting and mechanical properties is exemplified by alloys with thermally hardened Al-Cu and Al-Cu-Mg matrices, as well as poorly soluble additives that form eutectic components, using mainly computational methods and a minimum number of experiments.

  14. Green supply chain management strategy selection using analytic network process: case study at PT XYZ

    Science.gov (United States)

    Adelina, W.; Kusumastuti, R. D.

    2017-01-01

    This study concerns business strategy selection for green supply chain management (GSCM) at PT XYZ using the Analytic Network Process (ANP). GSCM is initiated as a response to reduce the environmental impacts of industrial activities. The purposes of this study are to identify criteria and sub-criteria for selecting a GSCM strategy and to analyse a suitable GSCM strategy for PT XYZ. The study proposes an ANP network with 6 criteria and 29 sub-criteria, obtained from the literature and experts’ judgements. One of the six criteria contains the GSCM strategy options, namely the risk-based, efficiency-based, innovation-based, and closed-loop strategies. ANP solves the complex GSCM strategy selection through a more structured process that considers green perspectives from experts. The result indicates that the innovation-based strategy is the most suitable green supply chain management strategy for PT XYZ.

  15. Selecting the best rayon in customer’s perspective using fuzzy analytic hierarchy process

    Science.gov (United States)

    Sonjaya, E. G.; Paulus, E.; Hidayat, A.

    2018-03-01

    Annually, the best Rayon selection is conducted by the assessment team of PT.PLN (Persero) Cirebon with the goal of increasing the motivation of company members to provide improved service for customers. However, there is a multiple-criteria decision-making problem in this case: the importance intensity of each criterion in the selection is often assessed subjectively. To address this, the Fuzzy Analytic Hierarchy Process is used to cover the deficiency of the AHP scale, which is expressed in 'crisp' numbers; a fuzzy logic approach is therefore well suited to handling the uncertainty involved. The fuzzy approach, in particular the application of triangular fuzzy numbers to the AHP scale, is expected to minimize the effect of subjective input and thus produce a more objective result. This research was therefore conducted to help the management or assessment team select the best Rayon more objectively, in accordance with the company's criteria.

  16. Methods used by Elsam for monitoring precision and accuracy of analytical results

    Energy Technology Data Exchange (ETDEWEB)

    Hinnerskov Jensen, J [Soenderjyllands Hoejspaendingsvaerk, Faelleskemikerne, Aabenraa (Denmark)

    1996-12-01

    Performing round robins at regular intervals is the primary method used by Elsam for monitoring the precision and accuracy of analytical results. The first round robin was started in 1974, and today 5 round robins are running. These are focused on: boiler water and steam, lubricating oils, coal, ion chromatography and dissolved gases in transformer oils. Besides the power plant laboratories in Elsam, the participants are power plant laboratories from the rest of Denmark, industrial and commercial laboratories in Denmark, and finally foreign laboratories. The calculated standard deviations or reproducibilities are compared with acceptable values. These values originate from ISO, ASTM and the like, or from our own experience. Besides providing the laboratories with a tool to check their momentary performance, the round robins are very suitable for evaluating systematic developments on a long-term basis. By splitting up the uncertainty according to methods, sample preparation/analysis, etc., knowledge can be extracted from the round robins for use in many other situations. (au)
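A common way to score participants in such round robins is the proficiency-testing z-score, which compares each laboratory's result against an assigned value and a target standard deviation. A sketch with hypothetical data; this illustrates the generic convention, not Elsam's actual procedure or acceptance limits:

```python
import statistics

def z_scores(results, target_sd):
    """Round-robin z-scores against the median as assigned value.
    By the usual proficiency-testing convention, |z| <= 2 is satisfactory,
    2 < |z| < 3 questionable, |z| >= 3 an action signal; the acceptable
    values from ISO, ASTM or in-house experience set target_sd."""
    assigned = statistics.median(results.values())
    return {lab: (x - assigned) / target_sd for lab, x in results.items()}

# Hypothetical boiler-water silica results (ug/kg) from five laboratories
results = {"lab_A": 20.1, "lab_B": 19.8, "lab_C": 20.4,
           "lab_D": 23.5, "lab_E": 20.0}
for lab, z in z_scores(results, target_sd=0.5).items():
    print(f"{lab}: z = {z:+.1f}")
```

Using the median as the assigned value makes the score robust to a single outlying laboratory (lab_D above), which is exactly what long-term round-robin monitoring needs.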

  17. 2D XXZ model ground state properties using an analytic Lanczos expansion

    International Nuclear Information System (INIS)

    Witte, N.S.; Hollenberg, L.C.L.; Weihong Zheng

    1997-01-01

    A formalism was developed for calculating arbitrary expectation values for any extensive lattice Hamiltonian system using a new analytic Lanczos expansion, or plaquette expansion, and a recently proved exact theorem for ground state energies. The ground state energy, staggered magnetisation and the excited state gap of the 2D anisotropic antiferromagnetic Heisenberg model are then calculated using this expansion for a range of anisotropy parameters and compared to other moment-based techniques, such as the t-expansion, spin-wave theory and series expansion methods. It was found that far from the isotropic point all moment methods give essentially very similar results, but near the isotropic point the plaquette expansion is generally better than the others. 20 refs., 6 tabs

  18. The comparative evaluation of expanded national immunization policies in Korea using an analytic hierarchy process.

    Science.gov (United States)

    Shin, Taeksoo; Kim, Chun-Bae; Ahn, Yang-Heui; Kim, Hyo-Youl; Cha, Byung Ho; Uh, Young; Lee, Joo-Heon; Hyun, Sook-Jung; Lee, Dong-Han; Go, Un-Yeong

    2009-01-29

    The purpose of this paper is to propose new evaluation criteria and an analytic hierarchy process (AHP) model to assess the expanded national immunization programs (ENIPs) and to evaluate two alternative health care policies: under the first, private clinics and hospitals would offer free vaccination services to children; under the second, public health centers would offer these services. Our model to evaluate ENIPs was developed using brainstorming, Delphi techniques, and the AHP. We first used brainstorming and Delphi techniques, as well as literature reviews, to determine 25 criteria with which to evaluate the national immunization policy; we then proposed a hierarchical structure of the AHP model to assess ENIPs. By applying the proposed AHP model to the assessment of ENIPs for Korean immunization policies, we show that free vaccination services should be provided by private clinics and hospitals rather than public health centers.

  19. Assessing electronic health record systems in emergency departments: Using a decision analytic Bayesian model.

    Science.gov (United States)

    Ben-Assuli, Ofir; Leshno, Moshe

    2016-09-01

    In the last decade, health providers have implemented information systems to improve accuracy in medical diagnosis and decision-making. This article evaluates the impact of an electronic health record on emergency department physicians' diagnosis and admission decisions. A decision-analytic model using a decision tree was constructed for the admission decision process to assess the added value of medical information retrieved from the electronic health record. Using a Bayesian statistical model, this method was evaluated on two coronary artery disease scenarios. The results show that cases of coronary artery disease were better diagnosed when the electronic health record was consulted, leading to more informed admission decisions. Furthermore, the value of medical information required for a specific admission decision in emergency departments could be quantified. The findings support the notion that physicians and patient healthcare can benefit from implementing electronic health record systems in emergency departments. © The Author(s) 2015.
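The core of such a decision-analytic model is a Bayesian update of the disease probability from an EHR finding, followed by an expected-cost comparison at the admit/discharge decision node. A minimal sketch with hypothetical sensitivities, specificities and costs, not the paper's actual parameters:

```python
def posterior(prior, sens, spec, test_positive):
    """Bayes update of disease probability for one binary finding."""
    if test_positive:
        num = sens * prior
        den = num + (1 - spec) * (1 - prior)
    else:
        num = (1 - sens) * prior
        den = num + spec * (1 - prior)
    return num / den

def best_action(p_cad, cost_admit=1.0, cost_miss=20.0):
    """Expected-cost comparison at the decision node: admitting always
    costs cost_admit; discharging a true CAD case costs cost_miss."""
    exp_admit = cost_admit
    exp_discharge = p_cad * cost_miss
    return ("admit", exp_admit) if exp_admit < exp_discharge \
           else ("discharge", exp_discharge)

# Without the EHR the physician only knows the prior; with it, a prior
# abnormal stress test (hypothetical sens/spec) is also available.
prior = 0.04
print(best_action(prior))
p_with_ehr = posterior(prior, sens=0.85, spec=0.80, test_positive=True)
print(best_action(p_with_ehr))
```

The gap between the two expected costs is one way to quantify the value of the retrieved EHR information for a specific admission decision, in the spirit of the paper.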

  20. A review on recent developments for biomolecule separation at analytical scale using microfluidic devices.

    Science.gov (United States)

    Tetala, Kishore K R; Vijayalakshmi, M A

    2016-02-04

    Microfluidic devices, with their inherent advantages such as the ability to handle 10^(-9) to 10^(-18) L volumes, multiplexing of microchannels, rapid analysis and on-chip detection, are proving to be efficient systems in various fields of the life sciences. This review highlights articles published since 2010 that report the use of microfluidic devices to separate biomolecules (DNA, RNA and proteins) using chromatography principles (size, charge, hydrophobicity and affinity) along with microchip capillary electrophoresis, isotachophoresis, etc. A detailed overview of stationary phase materials and the approaches to incorporate them within the microchannels of microchips is provided, as well as a brief overview of chemical methods to immobilize ligand(s). Furthermore, we review research articles that deal with microfluidic devices as analytical tools for biomolecule (DNA, RNA and protein) separation. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Priority Determination of Underwater Tourism Site Development in Gorontalo Province using Analytical Hierarchy Process (AHP)

    Science.gov (United States)

    Rohandi, M.; Tuloli, M. Y.; Jassin, R. T.

    2018-02-01

    This research aims to determine the development priority of underwater tourism sites in Gorontalo province using the Analytical Hierarchy Process (AHP), a decision support system (DSS) method for multi-attribute decision making (MADM). The method used 5 criteria and 28 alternatives to determine the best development priority among underwater tourism sites in Gorontalo province. Based on the AHP calculation, the top development priority is Pulau Cinta, with a total AHP score of 0.489 (48.9%). The DSS gives decision makers a reliable, fast, time-saving and low-cost way to identify the best underwater tourism site to develop.
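The AHP weights behind such a ranking come from the principal eigenvector of a pairwise comparison matrix, checked with Saaty's consistency ratio. A minimal sketch on a hypothetical 3x3 criteria matrix; the study's real model has 5 criteria and 28 alternatives:

```python
import numpy as np

# Toy pairwise comparison matrix on Saaty's 1-9 scale (reciprocal,
# illustrative values only).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)           # principal eigenvalue lambda_max
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                          # normalized priority vector

m = A.shape[0]
ci = (eigvals[k].real - m) / (m - 1)  # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[m]   # Saaty's random index
cr = ci / ri                          # judgements acceptable when CR < 0.10
print("weights:", w, "CR:", cr)
```

The same eigenvector computation is repeated for each criterion's alternative matrix, and the final score of each site is the criteria-weighted sum of its local priorities.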

  2. IT vendor selection model by using structural equation model & analytical hierarchy process

    Science.gov (United States)

    Maitra, Sarit; Dominic, P. D. D.

    2012-11-01

    Selecting and evaluating the right vendors is imperative for an organization's competitiveness in the global marketplace. Improper selection and evaluation of potential vendors can degrade an organization's supply chain performance. Numerous studies have demonstrated that firms consider multiple criteria when selecting key vendors. This research develops a new hybrid model for the vendor selection process to support better decision making. The proposed model provides a suitable tool to assist decision makers and managers in making the right decisions and selecting the most suitable vendor. The paper proposes a hybrid model based on the Structural Equation Model (SEM) and the Analytical Hierarchy Process (AHP) for long-term strategic vendor selection problems. The five-step framework of the model was designed after a thorough literature study. The proposed hybrid model will be applied to a real-life case study to assess its effectiveness. In addition, a what-if analysis technique will be used for model validation.

  3. Kinetic calculations for miniature neutron source reactor using analytical and numerical techniques

    International Nuclear Information System (INIS)

    Ampomah-Amoako, E.

    2008-06-01

    The analytical methods (step change in reactivity and ramp change in reactivity) as well as the numerical methods (fixed-point iteration and Runge-Kutta-Gill) were used to simulate the initial build-up of neutrons in a miniature neutron source reactor with and without the temperature feedback effect. The methods were modified to include the photoneutron concentration. PARET 7.3 was used to simulate the transient behaviour of Ghana Research Reactor-1. The PARET code was capable of simulating the transients for 2.1 mk and 4 mk insertions of reactivity, with peak powers of 49.87 kW and 92.34 kW, respectively. The PARET code, however, failed to simulate the 6.71 mk reactivity insertion predicted by Akaho et al. through TEMPFED. (au)

  4. Automated novel high-accuracy miniaturized positioning system for use in analytical instrumentation

    Science.gov (United States)

    Siomos, Konstadinos; Kaliakatsos, John; Apostolakis, Manolis; Lianakis, John; Duenow, Peter

    1996-01-01

    The development of three-dimensional automated positioning devices (micro-robots) for applications in analytical instrumentation, clinical chemical diagnostics and advanced laser optics depends strongly on the ability of such a device: firstly, to be positioned automatically, with high accuracy and reliability, by means of user-friendly interface techniques; secondly, to be compact; and thirdly, to operate under vacuum conditions, free of most of the problems connected with conventional micropositioners using stepping-motor gear techniques. The objective of this paper is to develop and construct a mechanically compact, computer-based micropositioning system for coordinated motion in the X-Y-Z directions with: (1) a positioning accuracy of less than 1 micrometer (the end-position accuracy of the system is controlled by a hard/software assembly using a self-constructed optical encoder); (2) a heat-free propulsion mechanism for vacuum operation; and (3) synchronized X-Y motion.

  5. Rapid Biosynthesis of AgNPs Using Soil Bacterium Azotobacter vinelandii With Promising Antioxidant and Antibacterial Activities for Biomedical Applications

    Science.gov (United States)

    Karunakaran, Gopalu; Jagathambal, Matheswaran; Gusev, Alexander; Torres, Juan Antonio Lopez; Kolesnikov, Evgeny; Kuznetsov, Denis

    2017-07-01

    Silver nanoparticles (AgNPs) are applied in various fields from electronics to biomedical applications as a result of their high surface-to-volume ratio. Even though different approaches are available for the synthesis of AgNPs, a nontoxic synthesis method has not yet been developed. Thus, this study focused on developing an easy and ecofriendly approach to synthesize AgNPs using Azotobacter vinelandii culture extracts. The biosynthesized nanoparticles were characterized by ultraviolet-visible (UV-Vis) spectroscopy, X-ray diffraction (XRD), Fourier transform infrared (FTIR) spectroscopy, energy-dispersive spectroscopy, particle size distribution (PSD), and transmission electron microscopy (TEM). UV absorption at 435 nm showed the formation of AgNPs. The XRD pattern showed a face-centered cubic structure with broad peaks at 28.2°, 32.6°, 46.6°, 55.2°, 57.9°, and 67.8°. The FTIR confirmed the involvement of various functional groups in the biosynthesis of AgNPs. The PSD and TEM analyses showed spherical, well-distributed nanoparticles with an average size of 20-70 nm. The elemental studies confirmed the existence of pure AgNPs. The bacterial extract containing the extracellular enzyme nitrate reductase converted silver nitrate into AgNPs. AgNPs significantly inhibited the growth of pathogenic bacteria such as Streptomyces fradiae (National Collection of Industrial Microorganisms (NCIM) 2419), Staphylococcus aureus (NCIM 2127), Escherichia coli (NCIM 2065), and Serratia marcescens (NCIM 2919). In addition, biosynthesized AgNPs were found to possess strong antioxidant activity. Thus, the results of this study revealed that biosynthesized AgNPs could serve as a lead in the development of nanomedicine.

  6. Prioritizing of effective factors on development of medicinal plants cultivation using analytic network process

    Directory of Open Access Journals (Sweden)

    Ghorbanali Rassam

    2014-07-01

    Full Text Available For the overall development of medicinal plant cultivation in Iran, the various factors affecting it need to be identified, and a proper method for identifying the most effective factors is essential. This research was conducted to prioritize the effective criteria for the development of medicinal plant cultivation in North Khorasan province, Iran, using the analytic network process (ANP). Multi-criteria decision making (MCDM) is a viable approach for factor selection, and the ANP was used as the MCDM tool. For this purpose, a list of effective factors was offered to an expert group; pairwise comparison questionnaires were then distributed among relevant researchers and local producer experts of the province to obtain their opinions on the priority of the criteria and sub-criteria. The questionnaires were analyzed using the Super Decisions software. We illustrate the use of the ANP by ranking the main effective factors, namely economic factors, educational-extension services, cultural-social factors and supportive policies, for the development of medicinal plants. The main objective of the present study was to develop the ANP as a decision-making tool for prioritizing factors affecting the development of medicinal plant cultivation. The results showed that the ANP methodology was well suited to tackling the complex interrelations involved in factor selection in this case. They also revealed that, among the factors, support for the cultivation of medicinal plants, building infrastructure for marketing support, having educated farmers and easy access to production inputs have the greatest impact on the development of medicinal plant cultivation.

  7. Simple analytical technique for liquid scintillation counting of environmental carbon-14 using gel suspension method

    International Nuclear Information System (INIS)

    Okai, Tomio; Wakabayashi, Genichiro; Nagao, Kenjiro; Matoba, Masaru; Ohura, Hirotaka; Momoshima, Noriyuki; Kawamura, Hidehisa

    2000-01-01

    A simple analytical technique for liquid scintillation counting of environmental 14C was developed. A commercially available gelling agent, N-lauroyl-L-glutamic-α,γ-dibutylamide, was used for gel formation of the samples (gel suspension method) and for the subsequent liquid scintillation counting of 14C in the form of CaCO3. Our sample preparation procedure is much simpler than the conventional methods and requires no special equipment. The self-absorption, stability and reproducibility of gel suspension samples were investigated in order to evaluate the characteristics of the gel suspension method for 14C activity measurement. The self-absorption factor is about 70% and decreases slightly as the CaCO3 weight increases; this is considered to be mainly due to the absorption of β-rays and scintillation light by the CaCO3 sample itself. No change in the counting rate of a gel suspension sample was observed for more than 2 years after sample preparation. Four samples were used to check the reproducibility of the sample preparation method, and the same values of the counting rate of 14C activity were obtained within the counting error. No change in the counting rate was observed for the 're-gelated' sample. These results show that the gel suspension method is appropriate for 14C activity measurement by liquid scintillation counting and useful for long-term preservation of samples for repeated measurement. The above analytical technique was applied to actual environmental samples in Fukuoka prefecture, Japan. The results obtained were comparable with those of other researchers and appear to be reasonable. Therefore, the newly developed technique is useful for the routine monitoring of environmental 14C. (author)

  8. PARAMO: a PARAllel predictive MOdeling platform for healthcare analytic research using electronic health records.

    Science.gov (United States)

    Ng, Kenney; Ghoting, Amol; Steinhubl, Steven R; Stewart, Walter F; Malin, Bradley; Sun, Jimeng

    2014-04-01

    Healthcare analytics research increasingly involves the construction of predictive models for disease targets across varying patient cohorts using electronic health records (EHRs). To facilitate this process, it is critical to support a pipeline of tasks: (1) cohort construction, (2) feature construction, (3) cross-validation, (4) feature selection, and (5) classification. To develop an appropriate model, it is necessary to compare and refine models derived from a diversity of cohorts, patient-specific features, and statistical frameworks. The goal of this work is to develop and evaluate a predictive modeling platform that can be used to simplify and expedite this process for health data. To support this goal, we developed a PARAllel predictive MOdeling (PARAMO) platform which (1) constructs a dependency graph of tasks from specifications of predictive modeling pipelines, (2) schedules the tasks in a topological ordering of the graph, and (3) executes those tasks in parallel. We implemented this platform using Map-Reduce to enable independent tasks to run in parallel in a cluster computing environment. Different task scheduling preferences are also supported. We assess the performance of PARAMO on various workloads using three datasets derived from the EHR systems in place at Geisinger Health System and Vanderbilt University Medical Center and an anonymous longitudinal claims database. We demonstrate significant gains in computational efficiency against a standard approach. In particular, PARAMO can build 800 different models on a 300,000-patient data set in 3 h in parallel, compared to 9 days if run sequentially. This work demonstrates that an efficient parallel predictive modeling platform can be developed for EHR data. This platform can facilitate large-scale modeling endeavors and speed up the research workflow and reuse of health information. This platform is only a first step and provides the foundation for our ultimate goal of building analytic pipelines
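The scheduling idea, building a dependency graph of pipeline tasks, taking them in topological order, and executing independent tasks in parallel, can be sketched in a few lines. This toy version uses Python's graphlib and a thread pool rather than PARAMO's actual Map-Reduce implementation, and the task names are illustrative stand-ins for the five pipeline stages:

```python
from graphlib import TopologicalSorter
from concurrent.futures import ThreadPoolExecutor

# Dependency graph of pipeline tasks (task: set of prerequisites).
graph = {
    "features":  {"cohort"},
    "cv_split":  {"features"},
    "fs_model1": {"cv_split"},
    "fs_model2": {"cv_split"},
    "clf1":      {"fs_model1"},
    "clf2":      {"fs_model2"},
}

log = []

def run(task):
    log.append(task)          # stand-in for real work (feature extraction etc.)
    return task

ts = TopologicalSorter(graph)
ts.prepare()
with ThreadPoolExecutor(max_workers=4) as pool:
    while ts.is_active():
        ready = list(ts.get_ready())   # all tasks whose prerequisites are done
        list(pool.map(run, ready))     # execute independent tasks in parallel
        for t in ready:
            ts.done(t)
print(log)
```

Once the two feature-selection tasks are unblocked they run concurrently, which is the batch-level parallelism that lets hundreds of model variants share the early, expensive cohort and feature stages.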

  9. Exploring maintenance policy selection using the Analytic Hierarchy Process; An application for naval ships

    International Nuclear Information System (INIS)

    Goossens, Adriaan J.M.; Basten, Rob J.I.

    2015-01-01

    In this paper we investigate maintenance policy selection (MPS) through the use of the Analytic Hierarchy Process (AHP). A maintenance policy is a policy that dictates which parameter triggers a maintenance action. In practice, selecting the right maintenance policy appears to be a difficult decision. We investigate MPS for naval ships, but our results have wider applicability. For our study we cooperate with the owner and operator of the ships, as well as with a shipbuilder and an original equipment manufacturer of naval ships. We apply a structured five step approach to obtain the relevant criteria that may make one policy preferable over another. The criteria are drawn from both literature and a series of interviews at several navy related companies and are structured into a hierarchy of criteria usable with the AHP. Additionally, we organize three workshops at the three different companies to test the AHP-based MPS approach in practice. We conclude that the AHP is well suited for maintenance policy selection in this broad setting, and that it provides a structured and detailed approach for MPS. Adding to that, it facilitates discussions during and after the sessions, creating a better understanding of the policy selection process. - Highlights: • We use the Analytic Hierarchy Process (AHP) for maintenance policy selection (MPS). • Using both interviews and case studies from the literature, we construct a hierarchy. • In sessions at 3 companies, we find that 1 hierarchy can be used for multiple assets. • The AHP creates a better understanding of the maintenance policy selection process. • Our work is on naval ships, but our approach and findings have wider applicability

  10. Enantioselectivity of mass spectrometry: challenges and promises.

    Science.gov (United States)

    Awad, Hanan; El-Aneed, Anas

    2013-01-01

    With the fast-growing market of pure-enantiomer drugs and bioactive molecules, new chiral-selective analytical tools have been developed, including the use of mass spectrometry (MS). Even though MS is one of the best analytical tools and has efficiently been used in several pharmaceutical and biological applications, MS is traditionally considered a "chiral-blind" technique. This limitation is due to the inability of MS to differentiate between the two enantiomers of a chiral molecule based merely on their masses. Several approaches have been explored to assess the potential role of MS in chiral analysis. The first approach depends on the use of MS-hyphenated techniques utilizing fast and sensitive chiral separation tools such as liquid chromatography (LC), gas chromatography (GC), and capillary electrophoresis (CE) coupled to an MS detector. More recently, several alternative separation techniques have been evaluated, such as supercritical fluid chromatography (SFC) and capillary electrochromatography (CEC); the latter is a hybrid technique that combines the efficiency of CE with the selectivity of LC. The second approach is based on using the MS instrument alone for chiral recognition. This method depends on the behavioral differences between enantiomers towards a foreign molecule and the ability of MS to monitor such differences. These behavioral differences can be divided into three types: (i) differences in the enantiomeric affinity for association with the chiral selector, (ii) differences in the enantiomeric exchange rate with a foreign reagent, and (iii) differences in the complex MS dissociation behaviors of the enantiomers. Most recently, ion mobility spectrometry was introduced to qualitatively and quantitatively evaluate chiral compounds. This article provides an overview of the role of MS in chiral analysis by discussing MS-based methodologies and presenting the challenges and promises associated with each approach. © 2013 Wiley Periodicals, Inc.

  11. Prioritizing the client trust factors in electronic banking using analytic hierarchy process

    Directory of Open Access Journals (Sweden)

    Hossein vazifedust

    2014-04-01

    Full Text Available This paper prioritizes the trust factors among electronic banking clients of an Iranian bank, Parsian Bank. The study first analyzes the literature and interviews with electronic banking experts and academicians, and identifies client trust as the most important factor for the development of electronic banking. It also determines the factors associated with trust, comprising individual factors, banking factors and infrastructural factors. The sample population consists of 25 experts: academicians, managers and bank officers, and clients of electronic banking. The necessary data were collected through interviews and questionnaires and analyzed using the analytic hierarchy process (AHP). The research findings indicate that attitudinal factors, telecommunication infrastructure and cultural factors were the most influential factors, while customer orientation and ease of access were the least influential.

  12. Polynomial approach method to solve the neutron point kinetics equations with use of the analytic continuation

    Energy Technology Data Exchange (ETDEWEB)

    Tumelero, Fernanda; Petersen, Claudio Zen; Goncalves, Glenio Aguiar [Universidade Federal de Pelotas, Capao do Leao, RS (Brazil). Programa de Pos Graduacao em Modelagem Matematica; Schramm, Marcelo [Universidade Federal do Rio Grande do Sul, Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica

    2016-12-15

    In this work, we report a solution of the neutron point kinetics equations applying the Polynomial Approach Method. The main idea is to expand the neutron density and the delayed neutron precursors as a power series, considering the reactivity as an arbitrary function of time in a relatively short time interval around an ordinary point. In the first interval one applies the initial conditions, and analytic continuation is used to determine the solutions of the subsequent intervals. A genuine error control is developed based on an analogy with the remainder theorem. For illustration, we also report simulations for different approximation orders (linear, quadratic and cubic). The results obtained by numerical simulations for the linear approximation are compared with results in the literature.
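For one delayed-neutron group with constant reactivity over each short interval, the power-series expansion reduces to a simple coefficient recurrence, and analytic continuation amounts to restarting the series from each interval's end values. A sketch with illustrative kinetics parameters (not those of the cited work, and at higher order than the paper's linear/quadratic/cubic cases):

```python
# One-group point kinetics: dn/dt = ((rho-beta)/Lam) n + lam C,
#                           dC/dt = (beta/Lam) n - lam C.
# Parameters are illustrative only.
beta, lam, Lam = 0.0065, 0.08, 1e-4   # delayed fraction, decay const, generation time

def series_step(n0, c0, rho, h, order=8):
    """Advance one interval h via the Taylor-series recurrence
    a_{k+1} = [((rho-beta)/Lam) a_k + lam b_k] / (k+1), similarly b."""
    a, b = [n0], [c0]
    for k in range(order):
        a.append((((rho - beta) / Lam) * a[k] + lam * b[k]) / (k + 1))
        b.append(((beta / Lam) * a[k] - lam * b[k]) / (k + 1))
    n = sum(ak * h**k for k, ak in enumerate(a))
    c = sum(bk * h**k for k, bk in enumerate(b))
    return n, c

n, c = 1.0, beta / (lam * Lam)        # start critical, precursors in equilibrium
rho = 0.001                           # hypothetical positive step insertion
for _ in range(1000):                 # analytic continuation: 1000 steps of 1 ms
    n, c = series_step(n, c, rho, h=1e-3)
print(f"relative power after 1 s: {n:.3f}")
```

After the prompt jump to roughly beta/(beta-rho) the power rises on the stable period, so the 1 s value lands just under 1.2 for these parameters; shrinking h or raising the truncation order tightens the error, which is what the paper's remainder-based error control automates.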

  13. Analytic cubic and quartic force fields using density-functional theory

    Energy Technology Data Exchange (ETDEWEB)

    Ringholm, Magnus; Gao, Bin; Thorvaldsen, Andreas J.; Ruud, Kenneth [Centre for Theoretical and Computational Chemistry (CTCC), Department of Chemistry, University of Tromsø—The Arctic University of Norway, 9037 Tromsø (Norway); Jonsson, Dan [Centre for Theoretical and Computational Chemistry (CTCC), Department of Chemistry, University of Tromsø—The Arctic University of Norway, 9037 Tromsø (Norway); High Performance Computing Group, University of Tromsø—The Arctic University of Norway, 9037 Tromsø (Norway); Bast, Radovan [Theoretical Chemistry and Biology, School of Biotechnology, Royal Institute of Technology, AlbaNova University Center, S-10691 Stockholm, Sweden and PDC Center for High Performance Computing, Royal Institute of Technology, S-10044 Stockholm (Sweden); Ekström, Ulf; Helgaker, Trygve [Center for Theoretical and Computational Chemistry (CTCC), Department of Chemistry, University of Oslo, P.O. Box 1033, Blindern, 0315 Oslo (Norway)

    2014-01-21

    We present the first analytic implementation of cubic and quartic force constants at the level of Kohn–Sham density-functional theory. The implementation is based on an open-ended formalism for the evaluation of energy derivatives in an atomic-orbital basis. The implementation relies on the availability of open-ended codes for evaluation of one- and two-electron integrals differentiated with respect to nuclear displacements as well as automatic differentiation of the exchange–correlation kernels. We use generalized second-order vibrational perturbation theory to calculate the fundamental frequencies of methane, ethane, benzene, and aniline, comparing B3LYP, BLYP, and Hartree–Fock results. The Hartree–Fock anharmonic corrections agree well with the B3LYP corrections when calculated at the B3LYP geometry and from B3LYP normal coordinates, suggesting that the inclusion of electron correlation is not essential for the reliable calculation of cubic and quartic force constants.

  14. Using fuzzy analytical hierarchy process (AHP to evaluate web development platform

    Directory of Open Access Journals (Sweden)

    Ahmad Sarfaraz

    2012-01-01

    Full Text Available Web development plays an important role in business plans and people's lives. One of the key decisions on which both the short-term and long-term success of a project depends is choosing the right development platform. Its criticality can be judged by the fact that once a platform is chosen, one has to live with it throughout the software development life cycle. The entire shape of the project depends on the language, operating system, tools, frameworks, etc. chosen, in short, on the web development platform. In addition, choosing the right platform is a multi-criteria decision making (MCDM) problem. We propose a fuzzy analytical hierarchy process model to solve this MCDM problem, tapping the real-life modeling potential of fuzzy logic and conjugating it with the commonly used and powerful AHP modeling method.
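One common realization of fuzzy AHP replaces Saaty's crisp 1-9 judgements with triangular fuzzy numbers (l, m, u) and derives weights via Buckley's fuzzy geometric mean; whether this paper uses Buckley's method or another variant (e.g. Chang's extent analysis) is not stated, so the following is a generic sketch on a hypothetical 3-criterion matrix:

```python
import math

# Toy reciprocal matrix of triangular fuzzy numbers (l, m, u) comparing
# three hypothetical platform criteria (values are illustrative).
M = [
    [(1, 1, 1),         (2, 3, 4),       (4, 5, 6)],
    [(1/4, 1/3, 1/2),   (1, 1, 1),       (1, 2, 3)],
    [(1/6, 1/5, 1/4),   (1/3, 1/2, 1),   (1, 1, 1)],
]

def geo_mean(row):
    """Component-wise fuzzy geometric mean of one row."""
    k = len(row)
    return tuple(math.prod(t[i] for t in row) ** (1 / k) for i in range(3))

g = [geo_mean(row) for row in M]                     # fuzzy geometric means
total = tuple(sum(t[i] for t in g) for i in range(3))
# fuzzy weight g_i (x) total^(-1): bounds reverse under the reciprocal
fuzzy_w = [(t[0] / total[2], t[1] / total[1], t[2] / total[0]) for t in g]
crisp = [sum(t) / 3 for t in fuzzy_w]                # centroid defuzzification
w = [c / sum(crisp) for c in crisp]                  # normalized crisp weights
print(w)
```

The spread u - l of each judgement carries the expert's uncertainty through the computation, which is exactly the "crisp-number deficiency" of plain AHP that the abstract describes.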

  15. Analytical methods used for the authentication of food of animal origin.

    Science.gov (United States)

    Abbas, Ouissam; Zadravec, Manuela; Baeten, Vincent; Mikuš, Tomislav; Lešić, Tina; Vulić, Ana; Prpić, Jelena; Jemeršić, Lorena; Pleadin, Jelka

    2018-04-25

    Adulteration can have serious consequences for human health and affects market growth by destroying consumer confidence; therefore, the authentication of food is important for food processors, retailers and consumers, but also for regulatory authorities. However, the complex nature of food and an increase in the types of adulterants make their detection difficult, so that food authentication often poses a challenge. This review focuses on analytical approaches to the authentication of food of animal origin, with an emphasis on the determination of specific ingredients, geographical origin and adulteration by substitution. It provides a current overview of the application of target approaches, used when the compound of interest is known, and of non-target approaches for screening purposes. The papers cited herein mainly concern milk, cheese, meat and honey. Moreover, the advantages, disadvantages and challenges regarding the use of both approaches in official food control as well as in the food industry are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Use of decision analytic methods in nuclear safety. An international survey

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J.; Pulkkinen, U. [VTT Automation, Espoo (Finland). Industrial Automation

    1996-12-01

    This report reviews applications of formal decision analysis methods in resolving nuclear safety related issues. The review is based on selected published reports and a questionnaire sent to the members of the Principal Working Group 5 on risk analysis (PWG5) of OECD/NEA/CSNI. In the report, decision analysis methodology is briefly described. The applications discussed in this review are related to probabilistic safety goals and safety criteria, operational safety management, nuclear waste management and emergency management. The experiences from the application of decision analysis methodology have been mainly positive. The advantages provided by decision analytical thinking are the structured view of the problem under consideration and the explicit statements on uncertainties, values and preferences. The decision analysis methodology is mature enough to be applied to the solution of nuclear safety issues. Although the applications have been mainly research oriented, it can be expected that practical use of the methodology will become more common in the future. (orig.) (27 refs.).

  17. Use of the analytical tree technique to develop a radiological protection program

    International Nuclear Information System (INIS)

    Domenech N, H.; Jova S, L.

    1996-01-01

    The results obtained by the Cuban Center for Radiological Protection and Hygiene in using an analytical tree technique to develop its general operational radiation protection program are presented. By the application of this method, factors such as the organization of the radiation protection services, the provision of administrative requirements, the existing general laboratory requirements, the availability of resources and the current documentation were evaluated. The main components considered were: complete normative and regulatory documentation; automatic radiological protection data management; the scope of 'on-the-job' and radiological protection training for the personnel; prior radiological appraisal of the safety performance of the work; and the application of dose constraints for the personnel and the public. The detailed development of the program made it possible to identify the basic aims to be achieved in its maintenance and improvement. (authors). 3 refs

  18. Use of decision analytic methods in nuclear safety. An international survey

    International Nuclear Information System (INIS)

    Holmberg, J.; Pulkkinen, U.

    1996-12-01

    This report reviews applications of formal decision analysis methods in resolving nuclear safety related issues. The review is based on selected published reports and a questionnaire sent to the members of the Principal Working Group 5 on risk analysis (PWG5) of OECD/NEA/CSNI. In the report, decision analysis methodology is briefly described. The applications discussed in this review are related to probabilistic safety goals and safety criteria, operational safety management, nuclear waste management and emergency management. The experiences from the application of decision analysis methodology have been mainly positive. The advantages provided by decision analytical thinking are the structured view of the problem under consideration and the explicit statements on uncertainties, values and preferences. The decision analysis methodology is mature enough to be applied to the solution of nuclear safety issues. Although the applications have been mainly research oriented, it can be expected that practical use of the methodology will become more common in the future. (orig.) (27 refs.)

  19. Technological, economic and sustainability evaluation of power plants using the analytic hierarchy process

    International Nuclear Information System (INIS)

    Chatzimouratidis, Athanasios I.; Pilavachi, Petros A.

    2009-01-01

    Complexity of power plant evaluation is steadily rising, as more criteria are involved in the overall assessment while evaluation data change rapidly. Apart from evaluating several aspects of power plants separately, a multicriteria analysis based on hierarchically structured criteria is necessary, so as to address the overall assessment of power plants according to their technological, economic and sustainability aspects. For this reason, in this paper, ten types of power plant are evaluated using nine end node criteria properly structured under the Analytical Hierarchy Process. Moreover, pairwise comparisons allow for accurate subjective criteria weighting. According to the scenario based on the subjective criteria weighting, emphasis is laid on sustainability, driving renewable energy power plants to the top of the overall ranking, while nuclear and fossil fuel power plants rank in the last five positions. The contribution of each end node criterion to each power plant and the performance of each power plant per end node criterion are presented for all types of power plant and end node criteria. (author)

  20. Evaluation methodology for advanced heat exchanger concepts using the analytical hierarchy process

    International Nuclear Information System (INIS)

    Sabharwall, Piyush; Kim, Eung Soo; Patterson, Mike

    2012-01-01

    This study describes how the major alternatives and criteria being developed for the heat exchangers for next generation nuclear reactors are evaluated using the analytical hierarchy process (AHP). This evaluation was conducted as an aid in developing and selecting heat exchangers for integrating power production and process heat applications with next generation nuclear reactors. The basic setup for selecting the most appropriate heat exchanger option was established with evaluation goals, alternatives, and criteria. The two potential candidates explored in this study were shell-and-tube (helical coiled) and printed circuit heat exchangers. Based on study results, the shell-and-tube (helical coiled) heat exchanger is recommended for a demonstration reactor in the near term, mainly because of its reliability.

  1. Flow modeling in a porous cylinder with regressing walls using a semi-analytical approach

    Directory of Open Access Journals (Sweden)

    M Azimi

    2016-10-01

    Full Text Available In this paper, the mathematical modeling of the flow in a porous cylinder, with a focus on applications to solid rocket motors, is presented. As usual, the cylindrical propellant grain of a solid rocket motor is modeled as a long tube with one end closed at the headwall, while the other remains open. The cylindrical wall is assumed to be permeable so as to simulate propellant burning and normal gas injection. First, the problem description and formulation are presented. The Navier-Stokes equations for the viscous flow in a porous cylinder with regressing walls are reduced to a nonlinear ODE by using a similarity transformation in time and space. The Differential Transformation Method (DTM) has then been successfully applied as an approximate analytical method. Finally, the results are presented for various cases.
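As a minimal illustration of how the Differential Transformation Method works — applied here to a toy linear initial-value problem, not to the paper's nonlinear similarity equation — DTM replaces a differential equation by a recurrence on the Taylor coefficients of the solution:

```python
# DTM sketch for y'(t) = y(t), y(0) = 1 (assumed toy problem, for illustration).
# Transforming term by term gives the recurrence (k + 1) * Y(k + 1) = Y(k),
# with Y(0) = y(0); the solution is recovered as y(t) = sum_k Y(k) * t**k.

import math

def dtm_exponential(n_terms: int, t: float) -> float:
    """Sum the truncated DTM series for y' = y, y(0) = 1, at the point t."""
    Y = [0.0] * n_terms
    Y[0] = 1.0                      # initial condition y(0) = 1
    for k in range(n_terms - 1):
        Y[k + 1] = Y[k] / (k + 1)   # DTM recurrence
    return sum(Y[k] * t ** k for k in range(n_terms))

approx = dtm_exponential(20, 1.0)
print(approx, math.exp(1.0))        # the 20-term series matches e to machine precision
```

For nonlinear terms such as y·y', DTM uses a discrete convolution of the coefficient arrays, but the overall recurrence-based structure is the same.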

  2. Evaluation and selection of energy technologies using integrated graph theory and analytic hierarchy process methods

    Directory of Open Access Journals (Sweden)

    P. B. Lanjewar

    2016-06-01

    Full Text Available The evaluation and selection of energy technologies involve a large number of attributes whose selection and weighting are decided in accordance with the social, environmental, technical and economic framework. In the present work an integrated multiple attribute decision making methodology is developed by combining graph theory and analytic hierarchy process methods to deal with the evaluation and selection of energy technologies. The energy technology selection attributes digraph enables a quick visual appraisal of the energy technology selection attributes and their interrelationships. The preference index provides a total objective score for comparison of energy technology alternatives. Application of the matrix permanent offers a better appreciation of the considered attributes and helps to analyze the different alternatives from a combinatorial viewpoint. The AHP is used to assign relative weights to the attributes. Four examples of evaluation and selection of energy technologies are considered in order to demonstrate and validate the proposed method.

  3. Multiobjective Optimization in Combinatorial Wind Farms System Integration and Resistive SFCL Using Analytical Hierarchy Process

    DEFF Research Database (Denmark)

    Moghadasi, Amirhasan; Sarwat, Arif; Guerrero, Josep M.

    2016-01-01

    This paper presents a positive approach for low voltage ride-through (LVRT) improvement of the permanent magnet synchronous generator (PMSG) based on a large wind power plant (WPP) of 50 MW. The proposed method utilizes the conventional current control strategy to provide the reactive power...... requirement and retain the active power production during and after the fault for grid code compliance. Besides that, a resistive superconducting fault current limiter (RSFCL) as an additional self-healing support is applied outside the WPP to further increase the rated active power of the installation...... on the extreme load reduction is effectively demonstrated. A large WPP has a complicated structure using several components, and the inclusion of the RSFCL makes this layout more problematic for optimal performance of the system. Hence, the most widely used decision-making technique based on the analytic hierarchy...

  4. Analytical modeling of threshold voltage for Cylindrical Gate All Around (CGAA) MOSFET using center potential

    Directory of Open Access Journals (Sweden)

    K.P. Pradhan

    2015-12-01

    Full Text Available In this paper, an analytical threshold voltage model is proposed for a cylindrical gate-all-around (CGAA) MOSFET by solving the 2-D Poisson's equation in the cylindrical coordinate system. A comparison is made between the center and the surface potential models of the CGAA MOSFET. This paper claims that the calculation of threshold voltage using the center potential is more accurate than the calculation from the surface potential. The effects of device parameters such as the drain bias (VDS), oxide thickness (tox), and channel thickness (r) on the threshold voltage are also studied in this paper. The model is verified with the 3D numerical device simulator Sentaurus from Synopsys Inc.

  5. Multielement determination in Cuban red mangrove samples using nuclear and related analytical techniques

    International Nuclear Information System (INIS)

    Estevez Alvarez, J.R.; Aguiar Lambert, D.; Montero Alvarez, A.; Pupo Gonzalez, I.; Padilla Alvarez, R.; Gonzalez Garcia, H.; Ramirez Sasco, M.

    1998-01-01

    In the present work the contents of Al, K, Ca, Mn, Fe, Ni, Cu, Zn, Sr, Cd and Pb in red mangroves (Rhizophora mangle) from different Cuban regions are determined using Energy Dispersive X-Ray Fluorescence (Emission-Transmission (ET) and I/C methods), Atomic Absorption Spectrophotometry (AAS), and Polarography (Anodic Stripping Voltammetry). Biological Certified Reference Materials (CRM) are employed for the tracing of the curves of the relative I/C method and for the evaluation of the accuracy of the analytical results. The reliability of the results is also checked by statistical means. The standard deviations and detection limits of each method are reported. Finally, the values obtained for the concentrations of the different elements in each studied ecosystem are presented; a detailed discussion of their significance will be given in a further paper

  6. Studies on Pt–Mo phases using analytical techniques with high resolution

    Energy Technology Data Exchange (ETDEWEB)

    Topic, M., E-mail: mtopic@tlabs.ac.za [iThemba LABS, National Research Foundation, P.O. Box 722, Somerset West 7129 (South Africa); Khumalo, Z. [iThemba LABS, National Research Foundation, P.O. Box 722, Somerset West 7129 (South Africa); University of Cape Town, Physics Department, Private Bag X3, Rondebosch 7701 (South Africa); Pineda-Vargas, C.A. [iThemba LABS, National Research Foundation, P.O. Box 722, Somerset West 7129 (South Africa); Faculty of Health and Wellness Sciences, CPUT, Belville (South Africa)

    2014-01-01

    A Pt–Mo coated system annealed at 1050 °C for 24 h was investigated using several high-resolution analytical techniques (SEM/EDX, μ-PIXE, RBS and XRD). These techniques provide structural and compositional data throughout the material depth and probing area. The results depend on the applied beam, its energy and size. They contribute to a better understanding of the effects of thermal annealing on the solid-state phase transformation and morphological changes in Pt–Mo coatings. The results indicate the presence of Pt- and Mo-solid solutions and two Pt–Mo phases (PtMo and Pt{sub 2}Mo{sub 3}), changes in the coating morphology, such as increased surface roughness and the formation of a "lace morphology", as well as an increase in coating thickness.

  7. Decision Support System for Determining Scholarship Selection using an Analytical Hierarchy Process

    Science.gov (United States)

    Puspitasari, T. D.; Sari, E. O.; Destarianto, P.; Riskiawan, H. Y.

    2018-01-01

    A Decision Support System is a computer program application that analyzes data and presents it so that users can make decisions more easily. Determining scholarship selection, in a case study of a senior high school in East Java, was not easy. An application was needed to solve the problem, to improve the accuracy of targeting prospective beneficiaries among poor students and to speed up the screening process. This research builds a system using the Analytical Hierarchy Process (AHP), a method that decomposes a complex and unstructured problem into its component groups, organizes the groups into a hierarchical order, inputs numerical values in place of human perception when comparing relative importance, and ultimately, through synthesis, determines the elements that have the highest priority. The accuracy of the system in this research is 90%.
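The AHP steps described above (pairwise comparison, weight derivation, priority synthesis) can be sketched in a few lines. The 3×3 pairwise-comparison matrix below is hypothetical illustration data, not taken from the study; the weights are derived with the standard geometric-mean approximation to the principal eigenvector, and Saaty's consistency ratio is checked.

```python
# Minimal AHP sketch: pairwise-comparison matrix -> priority weights -> consistency.
# The matrix entries (Saaty 1-9 scale) are assumed example judgments on three
# hypothetical criteria, e.g. family income vs. grades vs. distance to school.

def ahp_weights(A):
    """Priority vector via the geometric-mean (approximate eigenvector) method."""
    n = len(A)
    gmeans = []
    for row in A:
        prod = 1.0
        for x in row:
            prod *= x
        gmeans.append(prod ** (1.0 / n))
    total = sum(gmeans)
    return [g / total for g in gmeans]

def consistency_ratio(A, w):
    """CR = CI / RI; CR < 0.1 is conventionally considered acceptable."""
    n = len(A)
    # lambda_max estimated from the components of A @ w divided by w
    lam = sum(sum(A[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random indices (small n only)
    return ci / ri

A = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]

w = ahp_weights(A)
cr = consistency_ratio(A, w)
print([round(x, 3) for x in w], round(cr, 3))  # weights sum to 1; CR well below 0.1
```

In a full application, the same procedure is repeated for the alternatives under each criterion, and the final ranking is the weight-weighted sum of the alternative scores.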

  8. Theoretical and Experimental Study of Optical Coherence Tomography (OCT) Signals Using an Analytical Transport Model

    International Nuclear Information System (INIS)

    Vazquez Villa, A.; Delgado Atencio, J. A.; Vazquez y Montiel, S.; Cunill Rodriguez, M.; Martinez Rodriguez, A. E.; Ramos, J. Castro; Villanueva, A.

    2010-01-01

    Optical coherence tomography (OCT) is a non-invasive, low-coherence interferometric technique that provides cross-sectional images of turbid media. OCT is based on the classical Michelson interferometer, where the mirror of the reference arm oscillates and the signal arm contains a biological sample. In this work, we analyzed the heterodyne optical signal theoretically, adopting the so-called extended Huygens-Fresnel principle (EHFP). We used simulated OCT images with known optical properties to test an algorithm we developed to recover the scattering coefficient, and we recovered the scattering coefficient with a relative error of less than 5% for noisy signals. In addition, we applied this algorithm to OCT images from phantoms of known optical properties; in this case the curves were indistinguishable. A review of the validity of the analytical model applied to our system remains to be done.
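As a hedged sketch of what "recovering the scattering coefficient" from an OCT depth profile can look like — using a simple single-scattering exponential model and a log-linear least-squares fit, not the paper's EHFP-based algorithm — one can fit the decay of a synthetic noisy A-scan:

```python
# Under a single-scattering model the mean OCT signal decays as
#   I(z) = I0 * exp(-2 * mu_s * z),
# so ln(I) is linear in depth z with slope -2 * mu_s.
# mu_s_true, depths, and the 2% noise level are assumed illustration values.

import math
import random

random.seed(0)
mu_s_true = 2.0                                  # scattering coefficient, mm^-1
z = [0.05 * i for i in range(1, 41)]             # depths 0.05 .. 2.0 mm
I = [math.exp(-2 * mu_s_true * zi) * (1 + 0.02 * random.gauss(0, 1)) for zi in z]

# least-squares slope of ln(I) versus z gives an estimate of -2 * mu_s
y = [math.log(v) for v in I]
n = len(z)
zbar, ybar = sum(z) / n, sum(y) / n
slope = (sum((zi - zbar) * (yi - ybar) for zi, yi in zip(z, y))
         / sum((zi - zbar) ** 2 for zi in z))
mu_s_est = -slope / 2
print(mu_s_est)   # close to the true value of 2.0 mm^-1
```

Real OCT fitting must additionally handle the confocal point-spread function and sensitivity roll-off, which the EHFP treatment in the paper accounts for.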

  9. Analytical modeling for fractional multi-dimensional diffusion equations by using Laplace transform

    Directory of Open Access Journals (Sweden)

    Devendra Kumar

    2015-01-01

    Full Text Available In this paper, we propose a simple numerical algorithm for solving multi-dimensional diffusion equations of fractional order, which describe density dynamics in a material undergoing diffusion, by using the homotopy analysis transform method. The fractional derivative is described in the Caputo sense. The homotopy analysis transform method is an innovative adjustment of the Laplace transform method and makes the calculation much simpler. The technique does not rely on a small parameter, unlike the classical perturbation method. The scheme gives an analytical solution in the form of a convergent series with easily computable components, requiring no linearization or small perturbation. The numerical solutions obtained by the proposed method indicate that the approach is easy to implement and computationally very attractive.

  10. Recent Developments in the Speciation and Determination of Mercury Using Various Analytical Techniques

    Directory of Open Access Journals (Sweden)

    Lakshmi Narayana Suvarapu

    2015-01-01

    Full Text Available This paper reviews the speciation and determination of mercury by various analytical techniques such as atomic absorption spectrometry, voltammetry, inductively coupled plasma techniques, spectrophotometry, spectrofluorometry, high performance liquid chromatography, and gas chromatography. Approximately 126 research papers on the speciation and determination of mercury by various analytical techniques published in international journals since 2013 are reviewed.

  11. Hair elemental analysis for forensic science using nuclear and related analytical methods

    Czech Academy of Sciences Publication Activity Database

    Kučera, Jan; Kameník, Jan; Havránek, Vladimír

    2018-01-01

    Roč. 7, č. 3 (2018), s. 65-74 ISSN 2468-1709 R&D Projects: GA ČR(CZ) GBP108/12/G108; GA MŠk LM2015056 Institutional support: RVO:61389005 Keywords : hair * forensic analysis * neutron activation analysis * particle induced X-ray emission Subject RIV: CB - Analytical Chemistry, Separation OBOR OECD: Analytical chemistry

  12. Development and Validation of a Learning Analytics Framework: Two Case Studies Using Support Vector Machines

    Science.gov (United States)

    Ifenthaler, Dirk; Widanapathirana, Chathuranga

    2014-01-01

    Interest in collecting and mining large sets of educational data on student background and performance to conduct research on learning and instruction has developed as an area generally referred to as learning analytics. Higher education leaders are recognizing the value of learning analytics for improving not only learning and teaching but also…

  13. Determining passive cooling limits in CPV using an analytical thermal model

    Science.gov (United States)

    Gualdi, Federico; Arenas, Osvaldo; Vossier, Alexis; Dollet, Alain; Aimez, Vincent; Arès, Richard

    2013-09-01

    We propose an original thermal analytical model aiming to predict the practical limits of passive cooling systems for high concentration photovoltaic modules. The analytical model is described and validated by comparison with a commercial 3D finite element model. The limiting performances of flat plate cooling systems in natural convection are then derived and discussed.

  14. In Situ Analytical Characterization of Contaminated Sites Using Nuclear Spectrometry Techniques. Review of Methodologies and Measurements

    International Nuclear Information System (INIS)

    2017-01-01

    Past and current human activities can result in the contamination of sites by radionuclides and heavy metals. The sources of contamination are various. The most important sources for radionuclide release include global fallout from nuclear testing, nuclear and radiological accidents, waste production from nuclear facilities, and activities involving naturally occurring radioactive material (NORM). Contamination of the environment by heavy metals mainly originates from industrial applications and mineralogical background concentration. Contamination of sites by radionuclides and heavy metals can present a risk to people and the environment. Therefore, the estimation of the contamination level and the identification of the source constitute important information for the national authorities with the responsibility to protect people and the environment from adverse health effects. In situ analytical techniques based on nuclear spectrometry are important tools for the characterization of contaminated sites. Much progress has been made in the design and implementation of portable systems for efficient and effective monitoring of radioactivity and heavy metals in the environment directly on-site. Accordingly, the IAEA organized a Technical Meeting to review the current status and trends of various applications of in situ nuclear spectrometry techniques for analytical characterization of contaminated sites and to support Member States in their national environmental monitoring programmes applying portable instrumentation. This publication represents a comprehensive review of the in situ gamma ray spectrometry and field portable X ray fluorescence analysis techniques for the characterization of contaminated sites. It includes papers on the use of these techniques, which provide useful background information for conducting similar studies, in the following Member States: Argentina, Australia, Brazil, Czech Republic, Egypt, France, Greece, Hungary, Italy, Lithuania

  15. Manufacturing plant location selection in logistics network using Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Ping-Yu Chang

    2015-11-01

    Full Text Available Purpose: In recent years, numerous companies have moved their manufacturing plants to China to capitalize on lower costs and taxes. Plant location has a strong impact on cost, stocks, and the logistics network, but location selection in a company is usually based on the subjective preferences of high-ranking managers. Such a decision-making process might result in selecting a location with a lower fixed cost but a higher operational cost. Therefore, this research adapts real data from an electronics company to develop a framework that incorporates both quantitative and qualitative factors for selecting new plant locations. Design/methodology/approach: In-depth interviews were conducted with 12 high-ranking managers (7 department managers, 2 vice-presidents, 1 senior engineer, and 2 plant managers) in the departments of construction, finance, planning, production, and warehousing to determine the important factors. A questionnaire survey was then conducted to compare the factors, which were analyzed using the Analytic Hierarchy Process (AHP). Findings: Results show that the best location chosen by the developed framework coincides well with the company's primary production base. The results were presented to the company's high-ranking managers, confirming the accuracy of the framework. The positive responses of the managers indicate the usefulness of implementing the proposed model in practice, which adds to the value of this research. Practical implications: The proposed framework can save the numerous time-consuming meetings called to reconcile opinions and conflicts between different departments in location selection. Originality/value: This paper adapts the Analytic Hierarchy Process (AHP) to incorporate quantitative and qualitative factors, obtained through in-depth interviews with high-ranking managers in a company, into the location decision.

  16. Analytical Propagation of Uncertainty in Life Cycle Assessment Using Matrix Formulation

    DEFF Research Database (Denmark)

    Imbeault-Tétreault, Hugues; Jolliet, Olivier; Deschênes, Louise

    2013-01-01

    Uncertainty assessment is not a regular step in LCA. An analytical approach based on Taylor series expansion constitutes an effective means to overcome the drawbacks of the Monte Carlo method. This project aimed to test the approach on a real case study, and the resulting analytical uncertainty was compared with Monte Carlo results. The sensitivity and contribution of input parameters to output uncertainty were also analytically calculated. This article outlines an uncertainty analysis of the comparison between two case study scenarios. We conclude that the analytical method provides a good approximation...... This article shows the importance of the analytical method in uncertainty calculation, which could lead to a more complete uncertainty analysis in LCA practice....
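The idea of first-order Taylor-series uncertainty propagation can be shown on a toy multiplicative model (the paper applies it to the full LCA matrix formulation; the model, means, and spreads below are assumed illustration values). For a product/quotient model, the first-order relative variances of the inputs simply add, and the result can be checked against Monte Carlo:

```python
# Toy model: impact = a * b / c. First-order Taylor propagation gives
#   (sigma_f / f)^2  ≈  (sigma_a/a)^2 + (sigma_b/b)^2 + (sigma_c/c)^2,
# which we compare with a Monte Carlo estimate.

import math
import random

a, b, c = 10.0, 0.5, 2.0           # hypothetical parameter means
ra, rb, rc = 0.10, 0.05, 0.08      # hypothetical relative standard deviations

# Analytical first-order propagation: relative variances add
analytic_rel_sd = math.sqrt(ra**2 + rb**2 + rc**2)

# Monte Carlo reference
random.seed(1)
N = 200_000
samples = [random.gauss(a, ra * a) * random.gauss(b, rb * b) / random.gauss(c, rc * c)
           for _ in range(N)]
mean = sum(samples) / N
mc_rel_sd = math.sqrt(sum((s - mean) ** 2 for s in samples) / (N - 1)) / mean

print(round(analytic_rel_sd, 4), round(mc_rel_sd, 4))  # the two estimates agree closely
```

The analytical route also yields each input's contribution to the output variance directly (here, the individual squared relative deviations), which is what the abstract refers to as the analytically calculated sensitivity and contribution of input parameters.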

  17. Analyticity without Differentiability

    Science.gov (United States)

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…

  18. Understanding Business Analytics

    Science.gov (United States)

    2015-01-05

    Analytics have been used in organizations for a variety of reasons for quite some time, ranging from the simple (generating and understanding business analytics... process. How well these two components are orchestrated will determine the level of success an organization has in...

  19. Analytical characterization of polymers used in conservation and restoration by ATR-FTIR spectroscopy.

    Science.gov (United States)

    Chércoles Asensio, Ruth; San Andrés Moya, Margarita; de la Roja, José Manuel; Gómez, Marisa

    2009-12-01

    In the last few decades many new polymers have been synthesized that are now being used in cultural heritage conservation. The physical and chemical properties and the long-term behaviors of these new polymers are determined by the chemical composition of the starting materials used in their synthesis along with the nature of the substances added to facilitate their production. The practical applications of these polymers depend on their composition and form (foam, film, sheets, pressure-sensitive adhesives, heat-seal adhesives, etc.). Some materials are used in restoration works and others for the exhibition, storage and transport of works of art. In all cases, it is absolutely necessary to know their compositions. Furthermore, many different materials that are manufactured for other objectives are also used for conservation and restoration. The technical information about the materials provided by the manufacturer is usually incomplete, so it is necessary to analytically characterize such materials. FTIR spectrometry is widely used for polymer identification, and, more recently, ATR-FTIR has been shown to give excellent results. This paper reports the ATR-FTIR analysis of samples of polymeric materials used in the conservation of artworks. These samples were examined directly in the solid material without sample preparation.

  20. Analytical solutions of linked fault tree probabilistic risk assessments using binary decision diagrams with emphasis on nuclear safety applications

    International Nuclear Information System (INIS)

    Nusbaumer, O. P. M.

    2007-01-01

    approximated results to exact BDD results. The comparison shows that the classical approach produces accurate results for internal event assessments, but fails for external event assessments and Level 2 PRA, where the probability values are typically much higher. The analytical quantification of large linked fault tree models using BDDs requires complex algorithms and programming techniques, which have been evaluated for the first time on a full-scope PRA model in this study. This study demonstrates the feasibility of implementing BDDs for the analytical quantification of large fault tree models as found in the nuclear industry. The implementation of BDDs turns out to be the most promising approach for analytical fault tree solving. This important insight should be kept in focus when considering the increasing demand for PRA-related applications, such as risk-informed decision making in modern industries and services. (author)
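Why exact BDD-style quantification matters at high probabilities can be shown on a tiny fault tree (an assumed 2-out-of-3 example, not the full-scope PRA model of the study). A BDD encodes the Shannon decomposition of the top-event function, which yields the exact probability; the classical rare-event approximation over minimal cut sets overestimates it when basic-event probabilities are large:

```python
# Fault tree: TOP = 2-out-of-3 of basic events A, B, C,
# i.e. TOP = (A and B) or (A and C) or (B and C).
# Probabilities are deliberately high, as in Level 2 PRA or external events.

events = ["A", "B", "C"]
p = {"A": 0.3, "B": 0.4, "C": 0.5}

def top(assign):
    A, B, C = assign["A"], assign["B"], assign["C"]
    return (A and B) or (A and C) or (B and C)

def exact(i=0, assign=None):
    """Shannon decomposition (the recursion a BDD encodes):
    P(f) = p_i * P(f | x_i = 1) + (1 - p_i) * P(f | x_i = 0)."""
    assign = assign or {}
    if i == len(events):
        return 1.0 if top(assign) else 0.0
    e = events[i]
    return (p[e] * exact(i + 1, {**assign, e: True})
            + (1 - p[e]) * exact(i + 1, {**assign, e: False}))

# Classical rare-event approximation: sum of minimal-cut-set probabilities
approx = p["A"] * p["B"] + p["A"] * p["C"] + p["B"] * p["C"]

print(exact(), approx)   # exact = 0.35, rare-event approximation = 0.47
```

A real BDD package adds variable ordering and node sharing so that the decomposition scales to the thousands of basic events in a linked PRA model, but the quantification principle is this recursion.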

  1. Preliminary analytical study on the feasibility of using reinforced concrete pile foundations for renewable energy storage by compressed air energy storage technology

    Science.gov (United States)

    Tulebekova, S.; Saliyev, D.; Zhang, D.; Kim, J. R.; Karabay, A.; Turlybek, A.; Kazybayeva, L.

    2017-11-01

    Compressed air energy storage technology is one of the promising methods that offer high reliability, economic feasibility and low environmental impact. Current applications of the technology are mainly limited to energy storage for power plants using large-scale underground caverns. This paper explores the possibility of making use of reinforced concrete pile foundations to store renewable energy generated from solar panels or windmills attached to building structures. The energy will be stored as compressed air inside pile foundations with hollow sections. Given the relatively small volume of storage provided by the foundation, the required storage pressure is expected to be higher than that in a large-scale underground cavern. The high air pressure, typically associated with a large temperature increase and combined with structural loads, will place the pile foundation in a complicated loading condition, which might raise issues of structural and geotechnical safety. This paper presents a preliminary analytical study of the performance of the pile foundation subjected to high pressure, large temperature increase and structural loads. Finite element analyses of pile foundation models, built from selected prototype structures, have been conducted. The analytical study identifies the maximum stresses in the concrete of the pile foundation under combined pressure, temperature change and structural loads. Recommendations are made for the use of reinforced concrete pile foundations for renewable energy storage.
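A back-of-the-envelope estimate shows why the small storage volume forces high pressures. Using the ideal-gas isothermal exergy of stored compressed air (the volume and pressure below are assumed illustration values, not from the study):

```python
# Maximum work recoverable by isothermal expansion of air stored at pressure p
# in volume V down to ambient pressure p0 (ideal gas, net of atmospheric work):
#   W = p*V*ln(p/p0) - p0*V*(p/p0 - 1)

import math

def isothermal_storage_energy_J(p_pa: float, v_m3: float,
                                p0_pa: float = 101_325.0) -> float:
    r = p_pa / p0_pa
    return p_pa * v_m3 * math.log(r) - p0_pa * v_m3 * (r - 1.0)

V = 1.0        # hypothetical hollow-pile air volume, m^3
p = 10e6       # hypothetical storage pressure, 10 MPa

E_kwh = isothermal_storage_energy_J(p, V) / 3.6e6
print(E_kwh)   # on the order of 10 kWh per m^3 at 10 MPa
```

Roughly 10 kWh per cubic metre at 10 MPa illustrates the point: a useful amount of building-scale storage requires pressures far above those of large caverns, which is precisely the combined pressure-temperature-structural loading condition the paper analyzes.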

  2. No Impact of the Analytical Method Used for Determining Cystatin C on Estimating Glomerular Filtration Rate in Children.

    Science.gov (United States)

    Alberer, Martin; Hoefele, Julia; Benz, Marcus R; Bökenkamp, Arend; Weber, Lutz T

    2017-01-01

    Measurement of inulin clearance is considered to be the gold standard for determining kidney function in children, but this method is time consuming and expensive. The glomerular filtration rate (GFR) is on the other hand easier to calculate by using various creatinine- and/or cystatin C (Cys C)-based formulas. However, for the determination of serum creatinine (Scr) and Cys C, different and non-interchangeable analytical methods exist. Given the fact that different analytical methods for the determination of creatinine and Cys C were used in order to validate existing GFR formulas, clinicians should be aware of the type used in their local laboratory. In this study, we compared GFR results calculated on the basis of different GFR formulas and either used Scr and Cys C values as determined by the analytical method originally employed for validation or values obtained by an alternative analytical method to evaluate any possible effects on the performance. Cys C values determined by means of an immunoturbidimetric assay were used for calculating the GFR using equations in which this analytical method had originally been used for validation. Additionally, these same values were then used in other GFR formulas that had originally been validated using a nephelometric immunoassay for determining Cys C. The effect of using either the compatible or the possibly incompatible analytical method for determining Cys C in the calculation of GFR was assessed in comparison with the GFR measured by creatinine clearance (CrCl). Unexpectedly, using GFR equations that employed Cys C values derived from a possibly incompatible analytical method did not result in a significant difference concerning the classification of patients as having normal or reduced GFR compared to the classification obtained on the basis of CrCl. Sensitivity and specificity were adequate. On the other hand, formulas using Cys C values derived from a compatible analytical method partly showed insufficient

  3. Markov-CA model using analytical hierarchy process and multiregression technique

    International Nuclear Information System (INIS)

    Omar, N Q; Sanusi, S A M; Hussin, W M W; Samat, N; Mohammed, K S

    2014-01-01

    The unprecedented increase in population and the rapid rate of urbanisation have led to extensive land-use changes. Cellular automata (CA) are increasingly used to simulate a variety of urban dynamics. This paper introduces a new CA based on an integrated model combining multiple regression and multi-criteria evaluation to improve the representation of the CA transition rule. The multi-criteria evaluation is implemented by utilising data on the environmental and socioeconomic factors in the study area to produce suitability maps (SMs) using the analytical hierarchy process, a well-known method. Suitability maps were generated for the period from 1984 to 2010 under different decision-making scenarios, which conditioned the next step of CA generation. The suitability maps are compared in order to find the best maps based on the coefficient of determination (R²). This comparison can help stakeholders make better decisions. The best suitability map then provides a predefined transition rule for the final step of the CA model. The approach used in this study highlights a mechanism for monitoring and evaluating land-use and land-cover changes in Kirkuk city, Iraq, owing to changes in the structures of governments, wars, and an economic blockade over the past decades. The present study asserts the high applicability and flexibility of the Markov-CA model. The results show that the model and its interrelated concepts perform rather well

  4. Quantitative microanalysis in the analytical electron microscope using an HPGe x-ray detector

    International Nuclear Information System (INIS)

    Grogger, W.

    1994-01-01

    Energy-dispersive x-ray spectrometry (EDX) is a routine method for determining the chemical composition of a sample in the analytical electron microscope. High-purity germanium (HPGe) x-ray detectors have been commercially available for EDX for some years. This type of detector offers several advantages over the commonly used Si(Li) detector: better energy resolution, better detector efficiency for high-energy lines (> 30 keV), and better stability against external influences. Quantitative analysis requires sensitivity factors (k-factors), which relate the measured intensity to the concentration of a specific element. These k-factors can be calculated or determined experimentally. For a precise quantitative analysis of light elements, measured k-factors are absolutely necessary. In this study, k-factors were measured with an HPGe detector using standards. The accuracy of the k-factors was verified using examples of practical relevance. Additionally, some special features of the HPGe detector were examined (escape lines, icing of the detector, artifacts), leading to a better understanding of EDX spectrometry with an HPGe detector. (author)

  5. A behavior analytic analogue of learning to use synonyms, syntax, and parts of speech.

    Science.gov (United States)

    Chase, Philip N; Ellenwood, David W; Madden, Gregory

    2008-01-01

    Matching-to-sample and sequence training procedures were used to develop responding to stimulus classes that were considered analogous to 3 aspects of verbal behavior: identifying synonyms and parts of speech, and using syntax. Matching-to-sample procedures were used to train 12 paired associates from among 24 stimuli. These pairs were analogous to synonyms. Then, sequence characteristics were trained to 6 of the stimuli. The result was the formation of 3 classes of 4 stimuli, with the classes controlling a sequence response analogous to a simple ordering syntax: first, second, and third. Matching-to-sample procedures were then used to add 4 stimuli to each class. These stimuli, without explicit sequence training, also began to control the same sequence responding as the other members of their class. Thus, three 8-member functionally equivalent sequence classes were formed. These classes were considered to be analogous to parts of speech. Further testing revealed three 8-member equivalence classes and 512 different sequences of first, second, and third. The study indicated that behavior analytic procedures may be used to produce some generative aspects of verbal behavior related to simple syntax and semantics.

  6. Modeling of Coaxial Slot Waveguides Using Analytical and Numerical Approaches: Revisited

    Directory of Open Access Journals (Sweden)

    Kok Yeow You

    2012-01-01

    We review analytical and numerical methods for coaxial slot waveguides. The theories, background, and physical principles related to the frequency-domain electromagnetic equations for coaxial waveguides are reassessed. The accuracies of various types of admittance and impedance equations and of numerical simulations are compared, and the fringing field at the aperture sensor, represented by a lumped-capacitance circuit, is evaluated. The accuracy and limitations of the analytical equations are explained in detail, and the reasons for the replacement of analytical methods by numerical methods are outlined.

  7. Analytical solutions for Dirac and Klein-Gordon equations using Backlund transformations

    Energy Technology Data Exchange (ETDEWEB)

    Zabadal, Jorge R.; Borges, Volnei, E-mail: jorge.zabadal@ufrgs.br, E-mail: borges@ufrgs.br [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Dept. de Engenharia Mecanica; Ribeiro, Vinicius G., E-mail: vinicius_ribeiro@uniritter.edu.br [Centro Universitario Ritter dos Reis (UNIRITTER), Porto Alegre, RS (Brazil); Santos, Marcio, E-mail: marciophd@gmail.com [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Centro de Estudos Interdisciplinares

    2015-07-01

    This work presents a new analytical method for solving Klein-Gordon type equations via Backlund transformations. The method consists in mapping the Klein-Gordon model into a first-order system of partial differential equations containing a generalized velocity field instead of the Dirac matrices. This system is a tensor model for quantum field theory whose solution space is wider than that of the Dirac model in its original form. Thus, after finding analytical expressions for the wave functions, the Maxwell field can be readily obtained from the Dirac equations, furnishing a self-consistent field solution for the Maxwell-Dirac system. Analytical and numerical results are reported. (author)

  8. Freedom: A Promise of Possibility.

    Science.gov (United States)

    Bunkers, Sandra Schmidt

    2015-10-01

    The idea of freedom as a promise of possibility is explored in this column. The core concepts from a research study on considering tomorrow (Bunkers, 1998) coupled with humanbecoming community change processes (Parse, 2003) are used to illuminate this notion. The importance of intentionality in human freedom is discussed from both a human science and a natural science perspective. © The Author(s) 2015.

  9. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics value, volume, velocity, variety, veracity, and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data, such as various omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data, and electronic health records. We underline the challenging issues of big data privacy and security and, with regard to the big data characteristics, give some directions on choosing suitable and promising open-source distributed data-processing software platforms.

  10. Using Distributed Data over HBase in Big Data Analytics Platform for Clinical Services

    Directory of Open Access Journals (Sweden)

    Dillon Chrimes

    2017-01-01

    Big data analytics (BDA) is important for reducing healthcare costs. However, data aggregation, maintenance, integration, translation, analysis, and security/privacy pose many challenges. The study objective, to establish an interactive BDA platform with simulated patient data using open-source software technologies, was achieved by constructing a platform framework with the Hadoop Distributed File System (HDFS) using HBase (a key-value NoSQL database). Distributed data structures were generated from benchmarked hospital-specific metadata of nine billion patient records. At optimized iteration, HDFS ingestion of HFiles into HBase store files showed sustained availability over hundreds of iterations; however, completing MapReduce to HBase required a week for 10 TB and a month for three billion (30 TB) indexed patient records, respectively. Inconsistencies found in MapReduce limited the capacity to generate and replicate data efficiently. Apache Spark and Drill showed high performance, with high usability for technical support but poor usability for clinical services. Representing the hospital system's patient-centric data in HBase was challenging, as not all data profiles were fully integrated with the complex patient-to-hospital relationships. Nevertheless, we recommend using HBase to keep patient data secure while querying entire hospital volumes in a simplified clinical event model across clinical services.
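
    As a sketch of what a "simplified clinical event model" keyed for HBase might look like, the helper below composes a patient-centric row key. The key layout (hash salt | patient | timestamp | event type) is an assumption for illustration, not the schema used in the study; the salt prefix is a common technique for spreading sequential patient IDs across HBase regions.

```python
import hashlib

def event_row_key(patient_id: str, event_time_iso: str, event_type: str) -> bytes:
    """Compose an HBase row key for a clinical event (hypothetical layout).

    A two-character hash prefix (salt) avoids region hotspotting when
    patient IDs are sequential; the remaining fields keep one patient's
    events adjacent and time-ordered, so range scans can retrieve a
    patient's history efficiently.
    """
    salt = hashlib.md5(patient_id.encode("utf-8")).hexdigest()[:2]
    return f"{salt}|{patient_id}|{event_time_iso}|{event_type}".encode("utf-8")
```

    Keys like this trade global time-ordering for per-patient locality, which suits the patient-centric queries the abstract describes.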

  11. Using Distributed Data over HBase in Big Data Analytics Platform for Clinical Services.

    Science.gov (United States)

    Chrimes, Dillon; Zamani, Hamid

    2017-01-01

    Big data analytics (BDA) is important for reducing healthcare costs. However, data aggregation, maintenance, integration, translation, analysis, and security/privacy pose many challenges. The study objective, to establish an interactive BDA platform with simulated patient data using open-source software technologies, was achieved by constructing a platform framework with the Hadoop Distributed File System (HDFS) using HBase (a key-value NoSQL database). Distributed data structures were generated from benchmarked hospital-specific metadata of nine billion patient records. At optimized iteration, HDFS ingestion of HFiles into HBase store files showed sustained availability over hundreds of iterations; however, completing MapReduce to HBase required a week (for 10 TB) and a month for three billion (30 TB) indexed patient records, respectively. Inconsistencies found in MapReduce limited the capacity to generate and replicate data efficiently. Apache Spark and Drill showed high performance, with high usability for technical support but poor usability for clinical services. Representing the hospital system's patient-centric data in HBase was challenging, as not all data profiles were fully integrated with the complex patient-to-hospital relationships. Nevertheless, we recommend using HBase to keep patient data secure while querying entire hospital volumes in a simplified clinical event model across clinical services.

  12. MUNICIPAL LANDFILL SITE SELECTION FOR ISFAHAN CITY BY USE OF FUZZY LOGIC AND ANALYTIC HIERARCHY PROCESS

    Directory of Open Access Journals (Sweden)

    A. Afzali

    2011-09-01

    Selecting the most suitable landfill site can help avoid adverse ecological and socio-economic effects. Industrial and economic development, along with population growth in Isfahan city, generates a tremendous amount of solid waste within the region. Factors such as the scarcity of land, the life span of the landfill, and environmental considerations warrant scientific and fundamental studies in assessing the suitability of a landfill site. The analysis of spatial data and the consideration of regulations and accepted criteria are among the important elements of site selection. The present study presents a multi-criteria evaluation method using GIS techniques for landfill site suitability evaluation. The Analytic Hierarchy Process (AHP) was used to weight the information layers. Using the fuzzy logic method (classification of suitable areas on a 0 to 255 byte scale), the information layers related to topography, soil, water table, sensitive ecosystems, land use, and geology were superposed. Only after omission of inappropriate areas was the suitability of the remaining areas examined. The application of the method to Isfahan city shows that approximately 5% of the south-east and north-east parts of the study area score above 220 on the byte scale and are suitable for landfill establishment.
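
    The AHP weighting step used here (and in several other records in this list) can be sketched in a few lines: layer priorities are the principal eigenvector of a pairwise-comparison matrix, and Saaty's consistency ratio checks whether the expert judgments are acceptably coherent. The 3x3 example matrix below is invented for illustration.

```python
def ahp_weights(M, iters=200):
    """Priority weights from a pairwise-comparison matrix via power
    iteration (principal eigenvector), plus Saaty's consistency ratio."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    # lambda_max estimated as the average of (M w)_i / w_i
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)                       # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]   # Saaty's random index
    return w, ci / ri                              # weights, consistency ratio

# Invented example: layer A strongly preferred to B, and B to C.
M = [[1, 3, 5],
     [1/3, 1, 3],
     [1/5, 1/3, 1]]
weights, cr = ahp_weights(M)  # cr < 0.1 is the usual acceptability threshold
```

    The resulting weights would then scale the fuzzy (0-255) suitability layers before superposition.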

  13. Use of the analytic hierarchy process for medication decision-making in type 2 diabetes.

    Directory of Open Access Journals (Sweden)

    Nisa M Maruthur

    To investigate the feasibility and utility of the Analytic Hierarchy Process (AHP) for medication decision-making in type 2 diabetes. We conducted an AHP with nine diabetes experts using structured interviews to rank add-on therapies (to metformin) for type 2 diabetes. During the AHP, participants compared treatment alternatives relative to eight outcomes (hemoglobin A1c lowering and seven potential harms) and judged the relative importance of the different outcomes. The AHP model and instrument were pre-tested and pilot-tested prior to use. Results were discussed, and an evaluation of the AHP was conducted during a group session. We conducted the quantitative analysis using Expert Choice software with the ideal mode to determine the priority of treatment alternatives. Participants judged exenatide to be the best add-on therapy, followed by sitagliptin, sulfonylureas, and then pioglitazone. Maximizing benefit was judged 21% more important than minimizing harm. Minimizing severe hypoglycemia was judged the most important harm to avoid. Exenatide remained the best overall alternative even when minimizing harms was prioritized completely over maximizing benefits. Participants reported that the AHP improved transparency, consistency, and understanding of others' perspectives, and agreed that the results reflected the views of the group. The AHP is feasible and useful for making decisions about diabetes medications. Future studies incorporating stakeholder preferences should evaluate other decision contexts, objectives, and treatments.

  14. Using Distributed Data over HBase in Big Data Analytics Platform for Clinical Services

    Science.gov (United States)

    Zamani, Hamid

    2017-01-01

    Big data analytics (BDA) is important for reducing healthcare costs. However, data aggregation, maintenance, integration, translation, analysis, and security/privacy pose many challenges. The study objective, to establish an interactive BDA platform with simulated patient data using open-source software technologies, was achieved by constructing a platform framework with the Hadoop Distributed File System (HDFS) using HBase (a key-value NoSQL database). Distributed data structures were generated from benchmarked hospital-specific metadata of nine billion patient records. At optimized iteration, HDFS ingestion of HFiles into HBase store files showed sustained availability over hundreds of iterations; however, completing MapReduce to HBase required a week (for 10 TB) and a month for three billion (30 TB) indexed patient records, respectively. Inconsistencies found in MapReduce limited the capacity to generate and replicate data efficiently. Apache Spark and Drill showed high performance, with high usability for technical support but poor usability for clinical services. Representing the hospital system's patient-centric data in HBase was challenging, as not all data profiles were fully integrated with the complex patient-to-hospital relationships. Nevertheless, we recommend using HBase to keep patient data secure while querying entire hospital volumes in a simplified clinical event model across clinical services. PMID:29375652

  15. Analytical evidences of the use of iron-gall ink as a pigment on miniature paintings

    Science.gov (United States)

    Aceto, Maurizio; Calà, Elisa

    2017-12-01

    Iron-gall ink (IGI) has been used by scribes for writing since at least the 4th century CE. Another typical use of this ink was for drawing: many Old Masters created beautiful sketches in brown-black hues. Despite its widespread use for drawing lines, IGI seems to have been rarely used for painting: the number of identifications on manuscripts is at present very low. This could be partially due to a lack of reliable diagnostic information. In this work we tried to better define the possibility of identifying IGI as a pigment on illuminated manuscripts, evaluating the pros and cons of three different techniques: UV-visible diffuse reflectance spectrophotometry with optic fibres (FORS), Raman spectroscopy, and XRF spectrometry. For in situ non-invasive analysis, Raman spectroscopy has the best diagnostic power, but FORS seems to provide the best compromise between selectivity and ease of application. Moreover, new analytical evidence was obtained on the particular use of IGI by ancient illuminators: a non-invasive and micro-invasive diagnostic survey of Western manuscripts datable to the 6th-16th centuries showed that, apart from its widespread use as an ink for writing and drawing, IGI was also widely used as a pigment. The large number of identifications obtained allows us to hypothesise that this pigment was used throughout medieval Europe up to at least the Renaissance, where its use in drawing is already documented. The occurrence of IGI in miniature paintings older than the 6th century or more recent than the 16th century cannot be excluded, nor can its use beyond Europe; further measurements could widen the time range and the geographic area. Nevertheless, the present study sheds new light on the use of this colourant throughout the period of medieval and Renaissance miniature painting.

  16. Accurate quantification of endogenous androgenic steroids in cattle's meat by gas chromatography mass spectrometry using a surrogate analyte approach

    Energy Technology Data Exchange (ETDEWEB)

    Ahmadkhaniha, Reza; Shafiee, Abbas [Department of Medicinal Chemistry, Faculty of Pharmacy and Pharmaceutical Sciences Research Center, Tehran University of Medical Sciences, Tehran 14174 (Iran, Islamic Republic of); Rastkari, Noushin [Center for Environmental Research, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Kobarfard, Farzad [Department of Medicinal Chemistry, School of Pharmacy, Shaheed Beheshti University of Medical Sciences, Tavaneer Ave., Valieasr St., Tehran (Iran, Islamic Republic of)], E-mail: farzadkf@yahoo.com

    2009-01-05

    The determination of endogenous steroids in complex matrices such as cattle meat is a challenging task. Since endogenous steroids are always present in animal tissues, no analyte-free matrix is available for constructing the standard calibration line, which is crucial for accurate quantification, especially at trace levels. Although some methods have been proposed to solve this problem, none offers a complete solution. To this aim, a new quantification strategy, named the 'surrogate analyte approach', was developed in this study; it is based on using isotope-labeled standards instead of the natural form of the endogenous steroids to prepare the calibration line. In comparison with the other methods currently in use for the quantitation of endogenous steroids, this approach provides improved simplicity and speed for analysis on a routine basis. Its accuracy is better than that of other methods at low concentrations and comparable to standard addition at medium and high concentrations. The method was also found to be valid according to the ICH criteria for bioanalytical methods, and could be a promising approach in the field of compound residue analysis.
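
    The surrogate analyte idea reduces to ordinary calibration arithmetic: a line is fitted to the responses of the isotope-labeled standard (spiked into the real matrix, which contains none of the labeled form), and the natural analyte's response is then read off that line. The concentrations and responses below are invented for illustration.

```python
def fit_line(x, y):
    """Least-squares slope and intercept for a calibration line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Isotope-labeled surrogate: spiked concentration (ng/g) vs. instrument
# response (relative to an internal standard). Values are made up.
conc = [0.0, 1.0, 2.0, 5.0, 10.0]
resp = [0.02, 1.05, 1.98, 5.10, 9.95]
slope, intercept = fit_line(conc, resp)

# Quantify the NATURAL analyte from its own measured response:
sample_conc = (4.0 - intercept) / slope
```

    Because the labeled surrogate behaves like the endogenous analyte but is absent from the blank matrix, the calibration line can be built without an analyte-free matrix, which is the core of the approach.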

  17. Scaling in situ cosmogenic nuclide production rates using analytical approximations to atmospheric cosmic-ray fluxes

    Science.gov (United States)

    Lifton, Nathaniel; Sato, Tatsuhiko; Dunai, Tibor J.

    2014-01-01

    Several models have been proposed for scaling in situ cosmogenic nuclide production rates from the relatively few sites where they have been measured to other sites of interest. Two main types of models are recognized: (1) those based on data from nuclear disintegrations in photographic emulsions combined with various neutron detectors, and (2) those based largely on neutron monitor data. However, stubborn discrepancies between these model types have led to frequent confusion when calculating surface exposure ages from production rates derived from the models. To help resolve these discrepancies and identify the sources of potential biases in each model, we have developed a new scaling model based on analytical approximations to modeled fluxes of the main atmospheric cosmic-ray particles responsible for in situ cosmogenic nuclide production. Both the analytical formulations and the Monte Carlo model fluxes on which they are based agree well with measured atmospheric fluxes of neutrons, protons, and muons, indicating they can serve as a robust estimate of the atmospheric cosmic-ray flux based on first principles. We also use updated records for quantifying temporal and spatial variability in geomagnetic and solar modulation effects on the fluxes. A key advantage of this new model (herein termed LSD) over previous Monte Carlo models of cosmogenic nuclide production is that it allows for faster estimation of scaling factors based on time-varying geomagnetic and solar inputs. Comparing scaling predictions derived from the LSD model with those of previously published models suggests that potential sources of bias in the latter can be largely attributed to two factors: different energy responses of the secondary neutron detectors used in developing the models, and different geomagnetic parameterizations. Given that the LSD model generates flux spectra for each cosmic-ray particle of interest, it is also relatively straightforward to generate nuclide-specific scaling

  18. Metal artifact reduction in x-ray computed tomography by using analytical DBP-type algorithm

    Science.gov (United States)

    Wang, Zhen; Kudo, Hiroyuki

    2012-03-01

    This paper investigates the common problem of metal artifacts in X-ray computed tomography (CT). Such artifacts may render the reconstructed image non-diagnostic: beam-hardening correction is inaccurate for highly attenuating objects, and a satisfactory image cannot be reconstructed from projections with missing or distorted data. In the traditional analytical metal artifact reduction (MAR) method, the metallic part of the projection data is first subtracted from the originally obtained projection, the subtracted part is then completed using various interpolation methods, and finally the image is reconstructed from the interpolated projection with the filtered back-projection (FBP) algorithm. The interpolation error introduced in the second step can embody unrealistic assumptions about the missing data, leading to DC-shift artifacts in the reconstructed images. We propose a differentiated back-projection (DBP) type MAR method that replaces the FBP algorithm with the DBP algorithm in the third step. In the FBP algorithm, the interpolated projection is filtered at each projection view angle before back-projection, so the interpolation error propagates through the whole projection. The DBP algorithm, by contrast, allows filtering after back-projection along the Hilbert filtering direction, so the influence of the interpolation error is reduced and the quality of the reconstructed images can be expected to improve. In other words, choosing the DBP algorithm instead of the FBP algorithm means that projection data less contaminated by interpolation error are used in reconstruction. A simulation study using a phantom was performed to evaluate the proposed method.

  19. A CRITICAL STUDY AND COMPARISON OF MANUFACTURING SIMULATION SOFTWARES USING ANALYTIC HIERARCHY PROCESS

    Directory of Open Access Journals (Sweden)

    ASHU GUPTA

    2010-03-01

    In a period of continuous change in the global business environment, organizations large and small are finding it increasingly difficult to deal with, and adjust to, the demands for such change. Simulation is a powerful tool that allows designers to imagine new systems and enables them to both quantify and observe behavior. The market currently offers a variety of simulation software packages. Some are less expensive than others; some are generic and can be used in a wide variety of application areas, while others are more specific; some have powerful modeling features, while others provide only basic ones. Modeling approaches and strategies differ between packages. Companies seek advice about the desirable features of software for manufacturing simulation, depending on the purpose of its use. Because of this, the importance of an adequate approach to simulation software evaluation and comparison is apparent. This paper presents a critical evaluation of four widely used manufacturing simulators: NX-IDEAS, Star-CD, Micro Saint Sharp, and ProModel. Following a review of research into simulation software evaluation, an evaluation and comparison of the above simulators is performed. The paper illustrates and assesses the role the Analytic Hierarchy Process (AHP) played in simulation software evaluation and selection. The main purpose of this evaluation and comparison is to discover the suitability of certain types of simulators for particular purposes.

  20. [Clinical Application of Analytical and Medical Instruments Mainly Using MS Techniques].

    Science.gov (United States)

    Tanaka, Koichi

    2016-02-01

    Analytical instruments for clinical use are commonly required to confirm disease-related compounds and forms with the highest possible sensitivity, quantitative performance, and specificity, with minimal invasiveness, within a short time, easily, and at low cost. Technical innovations in mass spectrometry (MS) have led to techniques that meet such requirements. Besides confirming known substances, MS can also serve, in ways not fully known to the public, as a tool for discovering unknown phenomena and compounds, for example in clarifying the mechanisms of human diseases. The human body has approximately 100 thousand types of protein, and there may be more than several million types of proteins and their metabolites. Most of them have yet to be discovered, and their discovery may give birth to new academic fields and lead to the clarification of diseases, the development of new medicines, etc. For example, using the MS system developed under "Contribution to drug discovery and diagnosis by next generation of advanced mass spectrometry system," one of the 30 projects of the "Funding Program for World-Leading Innovative R&D on Science and Technology" (FIRST program), and other individual basic technologies, we succeeded in discovering new candidate disease biomarkers for Alzheimer's disease, cancer, etc. Further contributions of MS to clinical medicine can be expected through the development and improvement of new techniques, efforts to verify discoveries, and communication with the medical front.

  1. Assessment of shrimp farming impact on groundwater quality using analytical hierarchy process

    Science.gov (United States)

    Anggie, Bernadietta; Subiyanto, Arief, Ulfah Mediaty; Djuniadi

    2018-03-01

    The expansion of shrimp farming affects groundwater quality. Conventional assessment of the impact of shrimp farming on groundwater quality has limited accuracy. This paper presents an implementation of the Analytical Hierarchy Process (AHP) for assessing the impact of shrimp farming on groundwater quality. The data used are impact data for shrimp farming in one region of Indonesia from 2006-2016. Eight criteria, divided into 49 sub-criteria, were used. Weighting by AHP was performed to determine the importance of the criteria and sub-criteria, and the final priority classes of shrimp farming impact were obtained from the calculated criteria and sub-criteria weights. Validation was done by comparing the priority classes of shrimp farming impact with water quality conditions. The results show that 50% of the total area fell into the moderate priority class, 37% into the low priority class, and 13% into the high priority class. The validation shows that the impact assessment for shrimp farming matched the groundwater quality conditions with high accuracy. This study shows that an AHP-based assessment attains higher accuracy for shrimp farming impact and can be used as a basis for fisheries planning to deal with the impacts that have been generated.

  2. Dynamical nuclear safeguard investigations in nuclear materials using Analytic Pair Values

    International Nuclear Information System (INIS)

    Woo, Tae-Ho

    2011-01-01

    Highlights: → The quantification of the safeguard is performed to enhance operation safety. → Newly introduced maximum pair values with multiplications are obtained by the AHP method. → Dynamical simulations are performed from the energy policy perspective. → Comparisons using the NSP are possible. → A better operation skill is developed. - Abstract: The operation of nuclear power plants (NPPs) has been investigated from the viewpoint of safeguard assessment. The risk of terrorist attack on NPPs is one of the critical points for secure plant operation. The basic event of the related incidents is quantified by random sampling using a Monte Carlo method. The Analytic Hierarchy Process (AHP) is applied, leading to maximum pair values with multiplications that are decided by reactor characteristics. The matrix-form analysis is compared across five NPP types of interest. Over a life cycle of 60 years, the range of secure operation lies between 0.020628 and 0.0212986 in relative numbers; that is, the highest value in the range of secure power operation is about 1.043 times larger than the lowest one in this study. The consistency, represented by the Consistency Index (C.I.) and the Consistency Ratio (C.R.), is highest in the 24th and 54th years. Finally, a nuclear safeguard protocol (NSP) for safe operation is successfully constructed.

  3. 237Np analytical method using 239Np tracers and application to a contaminated nuclear disposal facility

    Energy Technology Data Exchange (ETDEWEB)

    Snow, Mathew S.; Morrison, Samuel S.; Clark, Sue B.; Olson, John E.; Watrous, Matthew G.

    2017-06-01

    Environmental 237Np analyses are challenged by low 237Np concentrations and the lack of an available yield tracer; we report a rapid, inexpensive 237Np analytical approach employing the short-lived 239Np (t1/2 = 2.3 days) as a chemical yield tracer, followed by 237Np quantification using inductively coupled plasma-mass spectrometry. The 239Np tracer is obtained via separation from a 243Am stock solution and standardized using gamma spectrometry immediately prior to sample processing. Rapid digestions using a commercial, 900 watt “Walmart” microwave and Parr microwave vessels give 99.8 ± 0.1% digestion yields, while chromatographic separations enable Np/U separation factors on the order of 10⁶ and total Np yields of 95 ± 4% (2σ). Application of this method to legacy soil samples surrounding a radioactive disposal facility (the Subsurface Disposal Area at Idaho National Laboratory) reveals low-level 237Np contamination within 600 meters of the site, with maximum 237Np concentrations on the order of 10³ times greater than nuclear weapons testing fallout levels.
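
    The tracer arithmetic behind the yield correction is simple: the ratio of recovered to added 239Np activity (both gamma-standardized) gives the chemical yield, which corrects the ICP-MS 237Np result for losses during digestion and separation. A minimal sketch with invented numbers:

```python
def yield_corrected_np237(np237_measured_ng: float,
                          tracer_added_bq: float,
                          tracer_recovered_bq: float) -> float:
    """Correct an ICP-MS 237Np mass for chemical losses using the
    recovery of the 239Np yield tracer (example values are invented)."""
    chem_yield = tracer_recovered_bq / tracer_added_bq  # e.g. ~0.95
    return np237_measured_ng / chem_yield
```

    The same correction logic applies to any isotope-dilution or yield-tracer measurement; only the standardization of the tracer (here, by gamma spectrometry) is method-specific.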

  4. A Model for the Development of Hospital Beds Using Fuzzy Analytical Hierarchy Process (Fuzzy AHP).

    Science.gov (United States)

    Ravangard, Ramin; Bahadori, Mohammadkarim; Raadabadi, Mehdi; Teymourzadeh, Ehsan; Alimomohammadzadeh, Khalil; Mehrabian, Fardin

    2017-11-01

    This study aimed to identify and prioritize factors affecting the development of military hospital beds and to provide a model using the fuzzy analytical hierarchy process (fuzzy AHP). This applied study was conducted in 2016 in Iran using a mixed method. The sample included experts in the field of the military health care system. The MAXQDA 10.0 and Expert Choice 10.0 software were used for analyzing the collected data. Geographic situation, demographic status, economic status, health status, health care centers and organizations, financial and human resources, laws, regulations and by-laws, and the military nature of service recipients had effects on the development of military hospital beds. The military nature of service recipients (S=0.249) and economic status (S=0.040) received the highest and lowest priorities, respectively. Providing direct health care services to the military forces, in order to maintain their dignity and given their role in crises, as well as maintaining the security of the armed forces and meeting the hospital beds per capita required by the existing laws, regulations and by-laws, are of utmost importance.

  5. Determinants of Customers’ Satisfaction in the Nigerian Aviation Industry Using the Analytic Hierarchy Process (AHP) Model

    Directory of Open Access Journals (Sweden)

    B. E A. Oghojafor

    2014-08-01

    Full Text Available The aviation industry in Africa’s most populous nation has been experiencing explosive growth in recent years, with older domestic operators fighting competing new players. The expansion has given Nigerians a wider choice of airlines, many of them flying new or recently refurbished aircraft, which has helped reverse the country’s air safety situation in the wake of a spate of crashes six years ago. This paper applied the Analytic Hierarchy Process to identify the determinants of customers’ satisfaction in the Nigerian aviation industry. To achieve this aim, a sample of 100 customers was drawn from among air passengers at the Muritala Mohammed Airport 2 in Lagos, Nigeria, using convenience sampling and snowballing techniques. A quantitative approach was used to analyse the data, employing descriptive statistics and Expert Choice 2000, a software package designed to analyse AHP data. Findings show that customers of the aviation industry derive their satisfaction when operators respond quickly to their requests and provide information relating to their flights. Although there is little relative preference in terms of customers’ satisfaction regarding the services provided by aviation operators in Nigeria, customers’ satisfaction derives essentially from how the operators handle their ticketing and reservation services.

  6. 237Np analytical method using 239Np tracers and application to a contaminated nuclear disposal facility.

    Science.gov (United States)

    Snow, Mathew S; Morrison, Samuel S; Clark, Sue B; Olson, John E; Watrous, Matthew G

    2017-06-01

    Environmental 237Np analyses are challenged by low 237Np concentrations and lack of an available yield tracer; we report a rapid, inexpensive 237Np analytical approach employing the short lived 239Np (t1/2 = 2.3 days) as a chemical yield tracer followed by 237Np quantification using inductively coupled plasma-mass spectrometry. 239Np tracer is obtained via separation from a 243Am stock solution and standardized using gamma spectrometry immediately prior to sample processing. Rapid digestions using a commercial, 900 W "Walmart" microwave and Parr microwave vessels result in 99.8 ± 0.1% digestion yields, while chromatographic separations enable Np/U separation factors on the order of 10⁶ and total Np yields of 95 ± 4% (2σ). Application of this method to legacy soil samples surrounding a radioactive disposal facility (the Subsurface Disposal Area at Idaho National Laboratory) reveals the presence of low level 237Np contamination within 600 m of this site, with maximum 237Np concentrations on the order of 10³ times greater than nuclear weapons testing fallout levels. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Optimization of offshore wind turbine support structures using analytical gradient-based method

    OpenAIRE

    Chew, Kok Hon; Tai, Kang; Ng, E.Y.K.; Muskulus, Michael

    2015-01-01

    Design optimization of the offshore wind turbine support structure is an expensive task due to the highly constrained, non-convex and non-linear nature of the design problem. This report presents an analytical gradient-based method to solve this problem efficiently and effectively. The design sensitivities of the objective and constraint functions are evaluated analytically while the optimization of the structure is performed, subject to sizing, eigenfrequency, extreme load an...

  8. Evaluation and selection of in-situ leaching mining method using analytic hierarchy process

    International Nuclear Information System (INIS)

    Zhao Heyong; Tan Kaixuan; Liu Huizhen

    2007-01-01

    According to the complicated conditions and main influencing factors of in-situ leaching mining, a model and procedure are established for the evaluation and selection of in-situ leaching mining methods based on the analytic hierarchy process. Taking a uranium mine in Xinjiang, China as an example, the application of this model is presented. The results of the analyses and calculations indicate that acid leaching is the optimum option. (authors)

  9. Big Data and Predictive Analytics in Health Care.

    Science.gov (United States)

    Dhar, Vasant

    2014-09-01

    Predictive analytics show great promise in health care but face some serious hurdles for widespread adoption. I discuss the state of the art of predictive health-care analytics using the clinical arena as an example and discuss how the outputs of predictive systems could be made actionable through differentiated processes that encourage prevention. Such systems have the potential to minimize health risk at the population and individual levels through more personalized health-care delivery.

  10. Examinations on Applications of Manual Calculation Programs on Lung Cancer Radiation Therapy Using Analytical Anisotropic Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung Min; Kim, Dae Sup; Hong, Dong Ki; Back, Geum Mun; Kwak, Jung Won [Dept. of Radiation Oncology, , Seoul (Korea, Republic of)

    2012-03-15

    MU verification programs based on the Pencil Beam Convolution (PBC) algorithm produce MU errors when applied to lung radiation treatment plans calculated with the Analytical Anisotropic Algorithm (AAA). In this study, we investigated methods for verifying treatment plans calculated with AAA. Using the Eclipse treatment planning system (Version 8.9, Varian, USA), we calculated each of 57 fields from 7 lung Stereotactic Body Radiation Therapy (SBRT) cases with both the PBC algorithm and AAA, and compared the resulting MU with the MU obtained from manual calculation programs. We analyzed the relationship between the errors and four variables that can affect them: field size, lung path distance of the beam, tumor path distance of the beam, and effective depth. Errors were 0.2 ± 1.0% for the PBC algorithm and 3.5 ± 2.8% for AAA. Among the four variables, the error increased with lung path distance (correlation coefficient 0.648, P = 0.000), from which we derived the MU correction factor A.E = L.P × 0.00903 + 0.02048. Applying this factor to the manual calculation program reduced the errors from 3.5 ± 2.8% to within 0.4 ± 2.0%. This study shows that manual calculation errors increase with the lung path distance of the beam, and that the MU of AAA plans can be verified by the simple method of this MU correction factor.

  11. Examinations on Applications of Manual Calculation Programs on Lung Cancer Radiation Therapy Using Analytical Anisotropic Algorithm

    International Nuclear Information System (INIS)

    Kim, Jung Min; Kim, Dae Sup; Hong, Dong Ki; Back, Geum Mun; Kwak, Jung Won

    2012-01-01

    MU verification programs based on the Pencil Beam Convolution (PBC) algorithm produce MU errors when applied to lung radiation treatment plans calculated with the Analytical Anisotropic Algorithm (AAA). In this study, we investigated methods for verifying treatment plans calculated with AAA. Using the Eclipse treatment planning system (Version 8.9, Varian, USA), we calculated each of 57 fields from 7 lung Stereotactic Body Radiation Therapy (SBRT) cases with both the PBC algorithm and AAA, and compared the resulting MU with the MU obtained from manual calculation programs. We analyzed the relationship between the errors and four variables that can affect them: field size, lung path distance of the beam, tumor path distance of the beam, and effective depth. Errors were 0.2 ± 1.0% for the PBC algorithm and 3.5 ± 2.8% for AAA. Among the four variables, the error increased with lung path distance (correlation coefficient 0.648, P = 0.000), from which we derived the MU correction factor A.E = L.P × 0.00903 + 0.02048. Applying this factor to the manual calculation program reduced the errors from 3.5 ± 2.8% to within 0.4 ± 2.0%. This study shows that manual calculation errors increase with the lung path distance of the beam, and that the MU of AAA plans can be verified by the simple method of this MU correction factor.
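The correction described in this abstract can be sketched as below. The fitted relation A.E = L.P × 0.00903 + 0.02048 is quoted from the abstract; treating A.E as a fractional error, taking L.P in centimeters, and applying the correction as a simple division are our assumptions for illustration:

```python
# Applying the MU correction factor from the abstract (illustrative use).

def anticipated_error(lung_path_cm):
    """Fitted anticipated error (A.E) as a function of lung path (L.P),
    using the coefficients quoted in the abstract."""
    return lung_path_cm * 0.00903 + 0.02048

def corrected_mu(manual_mu, lung_path_cm):
    """Scale the manually calculated MU down by the anticipated error
    (our assumption about how the factor is applied)."""
    return manual_mu / (1.0 + anticipated_error(lung_path_cm))

ae = anticipated_error(10.0)   # ~0.111 for an assumed 10 cm lung path
mu = corrected_mu(100.0, 10.0)
```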

  12. Using analytic element models to delineate drinking water source protection areas.

    Science.gov (United States)

    Raymond, Heather A; Bondoc, Michael; McGinnis, John; Metropulos, Kathy; Heider, Pat; Reed, Allison; Saines, Steve

    2006-01-01

    Since 1999, Ohio EPA hydrogeologists have used two analytic element models (AEMs), the proprietary software GFLOW and U.S. EPA's WhAEM, to delineate protection areas for 535 public water systems. Both models now use the GFLOW2001 solution engine, integrate well with Geographic Information System (GIS) technology, have a user-friendly graphical interface, are capable of simulating a variety of complex hydrogeologic settings, and do not rely upon a model grid. These features simplify the modeling process and enable AEMs to bridge the gap between existing simplistic delineation methods and more complex numerical models. Ohio EPA hydrogeologists demonstrated that WhAEM2000 and GFLOW2000 were capable of producing capture zones similar to more widely accepted models by applying the AEMs to eight sites that had been previously delineated using other methods. After the Ohio EPA delineated protection areas using AEMs, more simplistic delineation methods used by other states (volumetric equation and arbitrary fixed radii) were applied to the same water systems to compare the differences between various methods. GIS software and two-tailed paired t-tests were used to quantify the differences in protection areas and analyze the data. The results of this analysis demonstrate that AEMs typically produce significantly different protection areas than the most simplistic delineation methods, in terms of total area and shape. If the volumetric equation had been used instead of AEMs, Ohio would not have protected 265 km2 of critical upgradient area and would have overprotected 269 km2 of primarily downgradient land. Since an increasing number of land-use restrictions are being tied to drinking water protection areas, this analysis has broad policy implications.
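The "volumetric equation" compared above is commonly the fixed cylinder of aquifer that supplies the well over the time of travel; a sketch with illustrative parameter values (the formula is the standard wellhead-protection volumetric method, not code from the paper):

```python
import math

# Volumetric fixed-radius delineation: r = sqrt(Q*t / (pi * n * H)).

def volumetric_radius_m(q_m3_per_day, t_days, porosity, thickness_m):
    """Radius of the cylinder of aquifer that supplies the well over
    the time of travel t, assuming uniform porosity and thickness."""
    return math.sqrt(q_m3_per_day * t_days / (math.pi * porosity * thickness_m))

# e.g. 500 m3/day pumping, 5-year time of travel, 25% porosity, 20 m aquifer
r = volumetric_radius_m(500.0, 5 * 365.0, 0.25, 20.0)
```

Because the result is a circle centered on the well, the method cannot distinguish upgradient from downgradient land, which is the shape difference the abstract quantifies against the AEM capture zones.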

  13. Patients’ perception of quality service delivery of public hospitals in Nigeria using analytical hierarchy process

    Directory of Open Access Journals (Sweden)

    Emmanuel Olateju Oyatoye

    2016-07-01

    Full Text Available Introduction: Patients are now more aware and discerning, reflecting the belief that a high level of quality translates into patient satisfaction. This is critical for healthcare providers, as they deal with life. This recognition by both service providers and service receivers led the government to establish service commission units (SERVICOM) in each governmental agency, including hospitals, in Nigeria to monitor the quality of service delivery. However, the extent to which patients’ perceptions of health services are recognized by health care providers remains unclear, despite the SERVICOM units in public institutions in Nigeria. Method: A cross-sectional analytical study was performed on 400 patients who received health services at four different public hospitals in Ogun state, Nigeria, using convenience sampling, since not every patient of the selected hospitals could be included. The selection of these hospitals was based on the zones in the state (Egba, Ijebu, Remo and Yewa areas of Ogun state). The instrument was a valid and reliable analytical hierarchy process based questionnaire containing five service quality dimensions. Data were analyzed using SPSS, Expert Choice and Microsoft Excel software to determine the perception of patients towards service quality delivery, with pairwise comparison judgments consistent at less than 10%. Results: The results showed the composite priorities of the patients’ perception with respect to the determinants of the quality of services delivered in public hospitals in Nigeria. The most important factor to patients was the reliability dimension, with composite priority 0.24 (24%), followed by the responsiveness dimension with 0.22, the assurance dimension with 0.21, the tangibility dimension with 0.21, and, as the least important determinant, the empathy dimension with 0.1101. Conclusion: Based on the results, the

  14. Relativistic quantum mechanic calculation of photoionization cross-section of hydrogenic and non-hydrogenic states using analytical potentials

    International Nuclear Information System (INIS)

    Rodriguez, R.; Gil, J.M.; Rubiano, J.G.; Florido, R.; Martel, P.; Minguez, E.

    2005-01-01

    Photoionization is a process of special importance in many areas of physics. Numerical methods must be used to obtain photoionization cross-sections for non-hydrogenic levels. The atomic data required to calculate them are extensive, so self-consistent calculations increase computing time considerably. Analytical potentials are a useful alternative because they avoid the iterative procedures typical of self-consistent models. In this work, we present a relativistic quantum calculation of photoionization cross-sections for isolated ions based on an analytical potential to obtain the required atomic data, valid for both hydrogenic and non-hydrogenic ions. Comparisons are made between our results and others obtained using either widely used analytical expressions for the cross-sections or more sophisticated calculations.

  15. Use of robotic systems for radiochemical sample changing and for analytical sample preparation

    International Nuclear Information System (INIS)

    Delmastro, J.R.; Hartenstein, S.D.; Wade, M.A.

    1989-01-01

    Two uses of the Perkin-Elmer (PE) robotic system will be presented. In the first, a PE robot functions as an automatic sample changer for up to five low energy photon spectrometry (LEPS) detectors operated with a Nuclear Data ND 6700 system. The entire system, including the robot, is controlled by an IBM PC-AT using software written in compiled BASIC. Problems associated with the development of the system and modifications to the robot will be presented. In the second, an evaluation study was performed to assess the abilities of the PE robotic system for performing complex analytical sample preparation procedures. For this study, a robotic system based upon the PE robot and auxiliary devices was constructed and programmed to perform the preparation of final product samples (UO3) for accountability and impurity specification analyses. These procedures require sample dissolution, dilution, and liquid-liquid extraction steps. The results of an in-depth evaluation of all system components will be presented

  16. An analytical approach to characterize morbidity profile dissimilarity between distinct cohorts using electronic medical records.

    Science.gov (United States)

    Schildcrout, Jonathan S; Basford, Melissa A; Pulley, Jill M; Masys, Daniel R; Roden, Dan M; Wang, Deede; Chute, Christopher G; Kullo, Iftikhar J; Carrell, David; Peissig, Peggy; Kho, Abel; Denny, Joshua C

    2010-12-01

    We describe a two-stage analytical approach for characterizing morbidity profile dissimilarity among patient cohorts using electronic medical records. We capture morbidities using the International Statistical Classification of Diseases and Related Health Problems (ICD-9) codes. In the first stage of the approach separate logistic regression analyses for ICD-9 sections (e.g., "hypertensive disease" or "appendicitis") are conducted, and the odds ratios that describe adjusted differences in prevalence between two cohorts are displayed graphically. In the second stage, the results from ICD-9 section analyses are combined into a general morbidity dissimilarity index (MDI). For illustration, we examine nine cohorts of patients representing six phenotypes (or controls) derived from five institutions, each a participant in the electronic MEdical REcords and GEnomics (eMERGE) network. The phenotypes studied include type II diabetes and type II diabetes controls, peripheral arterial disease and peripheral arterial disease controls, normal cardiac conduction as measured by electrocardiography, and senile cataracts. Copyright © 2010 Elsevier Inc. All rights reserved.
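The two-stage idea can be sketched in miniature. The paper fits adjusted logistic regressions per ICD-9 section; the simplification below uses unadjusted 2×2 odds ratios and averages the absolute log odds ratios into a single dissimilarity number, with made-up counts:

```python
import math

# Toy two-stage morbidity comparison between cohorts A and B.

def odds_ratio(cases_a, n_a, cases_b, n_b):
    """Unadjusted odds ratio of carrying a code in cohort A vs cohort B."""
    odds_a = cases_a / (n_a - cases_a)
    odds_b = cases_b / (n_b - cases_b)
    return odds_a / odds_b

def morbidity_dissimilarity(section_counts):
    """Stage two: average |log OR| over ICD-9 sections -- a simplified
    stand-in for the paper's morbidity dissimilarity index (MDI)."""
    logs = [abs(math.log(odds_ratio(*row))) for row in section_counts]
    return sum(logs) / len(logs)

sections = [
    (30, 100, 30, 100),   # identical prevalence -> log OR = 0
    (40, 100, 20, 100),   # e.g. "hypertensive disease": OR ~ 2.67
]
mdi = morbidity_dissimilarity(sections)
```

An MDI of zero would mean identical per-section prevalence; larger values mean the cohorts' morbidity profiles diverge.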

  17. Analytical scale purification of zirconia colloidal suspension using field programmed sedimentation field flow fractionation.

    Science.gov (United States)

    Van-Quynh, Alexandra; Blanchart, Philippe; Battu, Serge; Clédat, Dominique; Cardot, Philippe

    2006-03-03

    Sedimentation field flow fractionation was used to obtain purified fractions from a polydisperse zirconia colloidal suspension, for the potential purpose of hybrid coating of optical materials. The zirconia particle size ranged from 50/70 nm to 1000 nm. The suspension exhibited a log-Gaussian particle size distribution (in mass or volume) and a 115% polydispersity index (P.I.). Time dependent eluted fractions of the original zirconia colloidal suspension were collected. The particle size distribution of each fraction was determined with scanning electron microscopy and a Coulter sub-micron particle sizer (CSPS). These orthogonal techniques generated similar data. From fraction average elution times and granulometry measurements, it was shown that zirconia colloids are eluted according to the Brownian elution mode. The four collected fractions have a Gaussian-like distribution and respective average sizes and polydispersity indices of 153 nm (P.I. = 34.7%), 188 nm (P.I. = 27.9%), 228 nm (P.I. = 22.6%), and 276 nm (P.I. = 22.3%). These data demonstrate the strong size selectivity of SdFFF operated with a programmed field of exponential profile for sorting particles in the sub-micron range. Using this technique, the analytical-scale production of zirconia of given average size and reduced polydispersity is possible.
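The polydispersity index quoted above is treated below as the coefficient of variation of the size distribution (100 × σ/mean); that definition, and the sizes used, are assumptions for illustration:

```python
import math

# Mean size and polydispersity index (as coefficient of variation)
# from a small, made-up set of particle sizes in nanometres.

def polydispersity_index(sizes_nm):
    n = len(sizes_nm)
    mean = sum(sizes_nm) / n
    var = sum((x - mean) ** 2 for x in sizes_nm) / n   # population variance
    return 100.0 * math.sqrt(var) / mean, mean

pi_pct, mean_nm = polydispersity_index([120, 140, 150, 160, 180])
```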

  18. Reconstruction of binary geological images using analytical edge and object models

    Science.gov (United States)

    Abdollahifard, Mohammad J.; Ahmadi, Sadegh

    2016-04-01

    Reconstruction of fields using partial measurements is of vital importance in different applications in geosciences. Solving such an ill-posed problem requires a well-chosen model. In recent years, training images (TI) are widely employed as strong prior models for solving these problems. However, in the absence of enough evidence it is difficult to find an adequate TI which is capable of describing the field behavior properly. In this paper a very simple and general model is introduced which is applicable to a fairly wide range of binary images without any modifications. The model is motivated by the fact that nearly all binary images are composed of simple linear edges in micro-scale. The analytic essence of this model allows us to formulate the template matching problem as a convex optimization problem having efficient and fast solutions. The model has the potential to incorporate the qualitative and quantitative information provided by geologists. The image reconstruction problem is also formulated as an optimization problem and solved using an iterative greedy approach. The proposed method is capable of recovering the image unknown values with accuracies about 90% given samples representing as few as 2% of the original image.
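The paper's method matches analytical edge templates via convex optimization; as a toy illustration of the iterative greedy reconstruction idea only, the sketch below fills unknown binary pixels from the majority of their known 4-neighbours, most-constrained pixels first:

```python
# Greedy fill of a partially observed binary field (None = unobserved).

def greedy_fill(img):
    h, w = len(img), len(img[0])

    def known_neighbours(i, j):
        out = []
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < h and 0 <= nj < w and img[ni][nj] is not None:
                out.append(img[ni][nj])
        return out

    while True:
        unknown = [(i, j) for i in range(h) for j in range(w)
                   if img[i][j] is None]
        if not unknown:
            return img
        # fill the pixel with the most known neighbours first
        i, j = max(unknown, key=lambda p: len(known_neighbours(*p)))
        nb = known_neighbours(i, j)
        img[i][j] = 1 if nb and sum(nb) * 2 >= len(nb) else 0

field = [[1, 1, None],
         [None, None, 0],
         [0, None, 0]]
out = greedy_fill(field)
```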

  19. Chemical characterization of materials relevant to nuclear technology using neutron and proton based nuclear analytical methods

    International Nuclear Information System (INIS)

    Acharya, R.

    2014-01-01

    Nuclear analytical techniques (NATs), utilizing neutron and proton based nuclear reactions and subsequent measurement of gamma rays, are capable of chemical characterization of various materials at major to trace concentration levels. The present article deals with the recent developments and applications of conventional and k0-based internal monostandard (i) neutron activation analysis (NAA) and (ii) prompt gamma ray NAA (PGNAA) methods as well as (iii) in situ current normalized particle induced gamma ray emission (PIGE). The materials that have been analyzed by NAA and PGNAA include (i) nuclear reactor structural materials like zircaloys, stainless steels, Ni alloys, high purity aluminium and graphite and (ii) uranium oxide, U-Th mixed oxides, uranium ores and minerals. The internal monostandard NAA (IM-NAA) method with in situ detection efficiency was used to analyze large and non-standard geometry samples, and standard-less compositional characterization was carried out for zircaloys and stainless steels. PIGE methods using proton beams were standardized for the quantification of low-Z elements (Li to Ti) and applied to the compositional analysis of borosilicate glass and lithium titanate (Li2TiO3) samples and the quantification of total B and its isotopic composition (10B/11B) in boron based neutron absorbers like B4C. (author)

  20. On the use of the analytic hierarchy process in the aggregation of expert judgments

    International Nuclear Information System (INIS)

    Zio, E.

    1996-01-01

    Expert judgments are involved in many aspects of scientific research, either formally or informally. In order to combine the different opinions elicited, simple aggregation methods have often been used with the result that expert biases, interexpert dependencies and other factors which might affect the judgments of the experts are often ignored. A more comprehensive approach, based on the analytic hierarchy process, is proposed in this paper to account for the large variety of factors influencing the experts. A structured hierarchy is constructed to decompose the overall problem in the elementary factors that can influence the expert's judgements. The importance of the different elements of the hierarchy is then assessed by pairwise comparison. The overall approach is simple, presents a systematic character and offers a good degree of flexibility. The approach provides the decision maker with a tool to quantitatively measure the significance of the judgments provided by the different experts involved in the elicitation. The resulting values can be used as weights in an aggregation scheme such as, for example, the simple weighted averaging scheme. Two applications of the approach are presented with reference to case studies of formal expert judgment elicitation previously analyzed in literature: the elicitation of the pressure increment in the containment building of the Sequoyah nuclear power plant following reactor vessel breach, and the prediction of the future changes in precipitation in the vicinity of Yucca Mountain
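The final step described above, feeding the AHP-derived importance values into a simple weighted averaging scheme, can be sketched as follows (the expert weights and elicited estimates are illustrative):

```python
# Combining expert estimates with AHP-derived importance weights.

def weighted_average(estimates, weights):
    """Simple weighted averaging scheme: each expert's estimate is
    weighted by the significance measure produced by the AHP."""
    assert len(estimates) == len(weights)
    total = sum(weights)                      # normalize, in case the
    return sum(e * w for e, w in zip(estimates, weights)) / total  # weights do not sum to 1

# three experts' estimates of some elicited quantity, with AHP weights
combined = weighted_average([0.20, 0.35, 0.30], [0.5, 0.3, 0.2])
```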

  1. Validation of analytical methods for the stability studies of naproxen suppositories for infant and adult use

    International Nuclear Information System (INIS)

    Rodriguez Hernandez, Yaslenis; Suarez Perez, Yania; Garcia Pulpeiro, Oscar

    2011-01-01

    Analytical validation studies were performed in this paper, with a view to using the methods in stability studies of future formulations of naproxen suppositories for children and adults. The most influential factors in naproxen stability were determined: the greatest degradation occurred in acidic media, in oxidative media and under the action of light. A high-performance liquid chromatography based method was evaluated, which proved adequate for quantifying naproxen in suppositories and was selective against degradation products. The quantification limit was 3,480 μg, so the method was valid for these studies. Additionally, the parameters specificity for stability, detection limit and quantification limit were evaluated for the direct semi-aqueous acid-base method, which had formerly been validated for quality control and showed satisfactory results. Nevertheless, since volumetric methods are not regarded as stability indicators, this method will be used along with the chromatographic methods of choice, thin-layer chromatography and high-performance liquid chromatography, to determine the degradation products

  2. Evaluation of alternative fuels for the Greek road transport sector using the analytic hierarchy process

    International Nuclear Information System (INIS)

    Tsita, Katerina G.; Pilavachi, Petros A.

    2012-01-01

    This paper evaluates alternative fuels for the Greek road transport sector, using the Analytic Hierarchy Process. Seven different alternatives of fuel mode are considered in this paper: internal combustion engine (ICE) and its combination with petroleum and 1st and 2nd generation biofuels blends, fuel cells, hybrid vehicles, plug-in hybrids and electric vehicles. The evaluation of alternative fuel modes is performed according to cost and policy aspects. In order to evaluate each alternative fuel, one base scenario and ten alternative scenarios with different weight factors selection per criterion are presented. After deciding the alternative fuels’ scoring against each criterion and the criteria weights, their synthesis gives the overall score and ranking for all ten alternative scenarios. It is concluded that ICE blended with 1st and 2nd generation biofuels are the most suitable alternative fuels for the Greek road transport sector. - Highlights: ► Alternative fuels for the Greek road transport sector have been evaluated. ► The method of the AHP was used. ► Seven different alternatives of fuel mode are considered. ► The evaluation is performed according to cost and policy aspects. ► The ICE with 1st and 2nd generation biofuels are the most suitable fuels.

  3. Application of multi attribute failure mode analysis of milk production using analytical hierarchy process method

    Science.gov (United States)

    Rucitra, A. L.

    2018-03-01

    Pusat Koperasi Induk Susu (PKIS) Sekar Tanjung, East Java is one of the modern dairy industries producing Ultra High Temperature (UHT) milk. A problem that often occurs in the production process at PKIS Sekar Tanjung is a mismatch between the production process and the predetermined standard. The purpose of applying the Analytical Hierarchy Process (AHP) was to identify the most likely cause of failure in the milk production process. The Multi Attribute Failure Mode Analysis (MAFMA) method was used to eliminate or reduce the possibility of failure by analyzing its causes. This method integrates the severity, occurrence, detection, and expected cost criteria, obtained from an in-depth interview with the head of the production department as an expert. The AHP approach was used to formulate the priority ranking of the causes of failure in the milk production process. At level 1, severity has the highest weight, 0.41 or 41%, compared to the other criteria. At level 2, identifying failure in the UHT milk production process, the most likely cause was an average mixing temperature of more than 70 °C, higher than the standard temperature (≤70 °C). This failure cause has a contributing weight of 0.47, or 47%, across all criteria. Therefore, this study suggested that the company control the mixing temperature to minimise or eliminate failure in this process.

  4. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    Science.gov (United States)

    Crovelli, R.A.

    1988-01-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. ?? 1988 International Association for Mathematical Geology.
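The contrast between an analytic methodology and Monte Carlo simulation can be illustrated for the aggregation requirement above: for independent play estimates, the aggregate mean and variance follow directly from the per-play moments, which a simulation only approximates. The play parameters below are made up:

```python
import random

# Aggregating independent play estimates: analytic vs Monte Carlo.

plays = [(10.0, 2.0), (25.0, 5.0), (5.0, 1.0)]  # (mean, std) per play

# Analytic aggregation: means add, and for independent plays so do variances.
mean_total = sum(m for m, s in plays)
var_total = sum(s * s for m, s in plays)

# Monte Carlo check (normal marginals assumed purely for simplicity)
random.seed(0)
draws = [sum(random.gauss(m, s) for m, s in plays) for _ in range(20000)]
mc_mean = sum(draws) / len(draws)
```

The analytic route gives the aggregate moments exactly and instantly; the simulation needs many draws to approach them, which is the speed argument the abstract makes for microcomputer use.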

  5. Elementary mechanics using Matlab a modern course combining analytical and numerical techniques

    CERN Document Server

    Malthe-Sørenssen, Anders

    2015-01-01

    This book – specifically developed as a novel textbook on elementary classical mechanics – shows how analytical and numerical methods can be seamlessly integrated to solve physics problems. This approach allows students to solve more advanced and applied problems at an earlier stage and equips them to deal with real-world examples well beyond the typical special cases treated in standard textbooks. Another advantage of this approach is that students are brought closer to the way physics is actually discovered and applied, as they are introduced right from the start to a more exploratory way of understanding phenomena and of developing their physical concepts. While not a requirement, it is advantageous for the reader to have some prior knowledge of scientific programming with a scripting-type language. This edition of the book uses Matlab, and a chapter devoted to the basics of scientific programming with Matlab is included. A parallel edition using Python instead of Matlab is also available. Last but not...
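In the spirit of the course's combined analytical-plus-numerical approach (sketched here in Python rather than Matlab), free fall can be integrated with the Euler method and checked against the closed-form solution; step size and duration are illustrative:

```python
# Euler integration of free fall, compared with the analytical result.

g = 9.81          # m/s^2
dt = 0.001        # s, time step
n_steps = 2000    # integrate for 2 s

y, v = 0.0, 0.0   # start at rest at y = 0
for _ in range(n_steps):
    y += v * dt   # dy/dt = v
    v += -g * dt  # dv/dt = -g

t_end = n_steps * dt
y_exact = -0.5 * g * t_end ** 2   # analytical solution y(t) = -g t^2 / 2
```

The small residual between `y` and `y_exact` is the first-order discretization error of the Euler scheme, which shrinks as `dt` is reduced.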

  6. Elementary mechanics using Python a modern course combining analytical and numerical techniques

    CERN Document Server

    Malthe-Sørenssen, Anders

    2015-01-01

    This book – specifically developed as a novel textbook on elementary classical mechanics – shows how analytical and numerical methods can be seamlessly integrated to solve physics problems. This approach allows students to solve more advanced and applied problems at an earlier stage and equips them to deal with real-world examples well beyond the typical special cases treated in standard textbooks. Another advantage of this approach is that students are brought closer to the way physics is actually discovered and applied, as they are introduced right from the start to a more exploratory way of understanding phenomena and of developing their physical concepts. While not a requirement, it is advantageous for the reader to have some prior knowledge of scientific programming with a scripting-type language. This edition of the book uses Python, and a chapter devoted to the basics of scientific programming with Python is included. A parallel edition using Matlab instead of Python is also available. Last but not...
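    The combined analytical/numerical approach the book advocates can be sketched in Python (this edition's language): integrate a falling body with the Euler-Cromer scheme and check the zero-drag case against the closed-form free-fall solution. The parameters are illustrative and not drawn from the book.

```python
# Euler-Cromer integration of a falling body with quadratic air drag.
# With D = 0 the model reduces to free fall, whose analytic solution
# y(t) = -g t^2 / 2 serves as a check on the numerics; setting D > 0
# then explores a regime with no simple closed-form answer.
g, D = 9.81, 0.0          # gravity (m/s^2) and drag coefficient (1/m)
dt, t_end = 1e-4, 2.0     # time step and total time (s)
v, y = 0.0, 0.0
for _ in range(int(t_end / dt)):
    a = -g + D * v * v    # acceleration from gravity and drag
    v += a * dt           # Euler-Cromer: update velocity first,
    y += v * dt           # then position with the *new* velocity

y_analytic = -0.5 * g * t_end**2
print(round(y, 4), round(y_analytic, 4))
```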

  7. Air particulate pollution studies in Asian countries using nuclear analytical techniques

    International Nuclear Information System (INIS)

    Hien, P.D.

    1998-01-01

    Air particulate pollution is regarded as critical in Asian cities, where the levels of suspended particulate matter far exceed the WHO guideline. Nuclear analytical techniques have been widely used in studies of air particulate pollution to provide aerosol elemental compositions for the purpose of deriving the structure of emission sources. This paper presents some preliminary observations and findings based on publications in the scientific literature. Data on PM-10 levels and socio-economic indicators are used to search for a relationship between air quality and the level of development across Asia. An inverse linear relationship between PM-10 levels and the logarithm of per capita GDP appears to exist, although there is large scatter in the data caused by the very different climatic and geographical conditions of the cities studied. Soil dust is generally a major, or even predominant, aerosol source in Asian cities. Other common sources include vehicular emissions, coal and oil combustion, burning of refuse (in the open) and biomass (including forest fires). The relevance and trends of these sources in the Asian context are discussed. Multivariate receptor modelling techniques applied to source characterization are illustrated through the cases of Lahore and Ho Chi Minh City. Although it has limitations in dealing with mixed and overlapping sources, receptor modelling based on principal component factor analysis has proven to be uncomplicated and sufficiently reliable for characterizing aerosol sources in urban areas. (author)
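    A minimal sketch of the receptor-modelling step mentioned above: principal component factor analysis of a (samples x elements) aerosol concentration matrix. The element groupings and synthetic source profiles are invented for illustration; real studies use concentrations measured with nuclear analytical techniques.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
soil = rng.lognormal(1.0, 0.4, n)     # hypothetical soil-dust source strength
traffic = rng.lognormal(0.5, 0.5, n)  # hypothetical vehicular source strength

# Columns: Al, Si, Fe (crustal tracers) and Pb, Br (traffic tracers),
# each a fixed multiple of its source strength plus measurement noise.
X = np.column_stack([
    2.0 * soil, 3.5 * soil, 1.2 * soil,
    0.8 * traffic, 0.3 * traffic,
]) + rng.normal(0, 0.05, (n, 5))

# PCA via eigendecomposition of the correlation matrix of the
# standardized data; two factors should recover the two sources.
Z = (X - X.mean(0)) / X.std(0)
eigval, eigvec = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
explained = sorted(eigval / eigval.sum(), reverse=True)
print([round(e, 2) for e in explained[:2]])
```

    With two underlying sources, the first two principal components account for nearly all the variance, which is how factor analysis separates soil dust from vehicular emissions in the cited studies.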

  8. Analytical method for analysis of electromagnetic scattering from inhomogeneous spherical structures using duality principles

    Science.gov (United States)

    Kiani, M.; Abdolali, A.; Safari, M.

    2018-03-01

    In this article, an analytical approach is presented for the analysis of electromagnetic (EM) scattering from radially inhomogeneous spherical structures (RISSs) based on the duality principle. Owing to the spherical symmetry, similar angular dependencies in all regions are captured using spherical harmonics. To extract the radial dependency, the system of differential equations for wave propagation along the direction of inhomogeneity is equated with that of the dual planar structure. A general duality between the electromagnetic fields and parameters and the scattering parameters of the two structures is introduced. The validity of the proposed approach is verified through a comprehensive example. The presented approach reduces a complicated problem in spherical coordinates to a simple, well-posed, previously solved problem in planar geometry. The approach is valid for all continuously varying inhomogeneity profiles. One of its major advantages is the capability of studying two general and applicable types of RISSs. As an interesting application, a class of lens antennas based on the physical concept of gradient-refractive-index materials is introduced. The approach is used to analyze the EM scattering from the structure and validates the strong performance of the lens.

  9. Low-velocity Impact Response of a Nanocomposite Beam Using an Analytical Model

    Directory of Open Access Journals (Sweden)

    Mahdi Heydari Meybodi

    Full Text Available Low-velocity impact of a nanocomposite beam made of glass/epoxy reinforced with multi-wall carbon nanotubes (MWCNTs) and clay nanoparticles is investigated in this study. Using the modified rule of mixtures (MROM), the mechanical properties of the nanocomposite, comprising matrix, nanoparticles or MWCNTs, and fiber, are obtained. To analyze the low-velocity impact, Euler-Bernoulli beam theory and Hertz's contact law are employed simultaneously to derive the equations of motion. Using Ritz's variational approximation method, a set of nonlinear equations in the time domain is obtained and solved using a fourth-order Runge-Kutta method. The effects of different parameters, such as the addition of nanoparticles or MWCNTs on maximum contact force and energy absorption, stacking sequence, geometrical dimensions (i.e., length, width and height), and initial velocity of the impactor, on the dynamic behavior of the nanocomposite beam have been studied comprehensively. In addition, the results of the analytical model are compared with finite element modeling (FEM). The results reveal that the effect of nanoparticles on energy absorption is more considerable at higher impact energies.
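    The numerical core of such an analysis, fourth-order Runge-Kutta integration of a Hertzian contact, can be sketched as follows. The beam is collapsed here to a single Hertz spring F = k d^1.5, a deliberate simplification of the full Ritz-discretized Euler-Bernoulli model, and k, m and v0 are illustrative values, not the paper's.

```python
import numpy as np

k, m, v0 = 1.0e9, 0.05, 3.0   # Hertz stiffness (N/m^1.5), impactor mass (kg), speed (m/s)

def f(state):
    """Right-hand side for [indentation d, velocity v]."""
    d, v = state
    a = -(k / m) * max(d, 0.0) ** 1.5  # Hertz force acts only in compression
    return np.array([v, a])

state, dt = np.array([0.0, v0]), 1e-7
f_max = 0.0
for _ in range(200_000):
    # Classic RK4 step.
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    state = state + (dt / 6) * (k1 + 2 * k2 + 2 * k3 + k4)
    f_max = max(f_max, k * max(state[0], 0.0) ** 1.5)
    if state[0] < 0:              # impactor has rebounded and separated
        break

print(round(f_max, 1))            # peak contact force in newtons
```

    For this single-spring model, energy balance gives a closed-form peak force, so the integration can be checked analytically, mirroring the paper's comparison of its analytical model with FEM.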

  10. Effective factors on optimizing banks’ balance sheet using fuzzy analytical hierarchy process

    Directory of Open Access Journals (Sweden)

    Shoja Rezaei

    2013-11-01

    Full Text Available Every bank seeks methods to optimize its assets and liabilities; the central task is managing the assets and liabilities on the balance sheet, and the main question is which factors enable a bank to hold an optimal combination of assets and liabilities at a given level of risk while earning the highest return. This applied case study is dedicated to Refah Bank. The data were collected at headquarters via a questionnaire, and the weights of the factors affecting balance-sheet optimization were determined using the fuzzy analytic hierarchy process. Results showed that revenue has the largest effect on optimization, at 39.5%, followed by the loan-to-deposit ratio at 0.74%; since revenue is a proxy for efficiency, it appears to be the most important factor and goal in the banking industry. Furthermore, banks need to hold some liquidity to meet customer demand and thereby cover one of the most important banking risks; using the model and expert judgment, the importance of this factor for Refah Bank was determined to be 18%.
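    The weight-extraction step at the heart of the (fuzzy) AHP can be sketched with a crisp pairwise comparison matrix. The 3x3 matrix below, comparing revenue, liquidity and the loan-to-deposit ratio on Saaty's 1-9 scale, is hypothetical and not the paper's survey data.

```python
import numpy as np

# Pairwise comparisons: A[i][j] = how much more important criterion i
# is than criterion j (reciprocal matrix by construction).
A = np.array([
    [1.0, 3.0, 5.0],   # revenue vs liquidity, revenue vs loan/deposit
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Geometric-mean (logarithmic least squares) approximation of the
# principal eigenvector, normalized so the weights sum to 1.
gm = A.prod(axis=1) ** (1 / A.shape[0])
weights = gm / gm.sum()

# Consistency ratio: CR < 0.1 is the conventional acceptance threshold.
lam = (A @ weights / weights).mean()
ci = (lam - A.shape[0]) / (A.shape[0] - 1)
cr = ci / 0.58                     # 0.58 = Saaty's random index for n = 3
print([round(w, 3) for w in weights], round(cr, 3))
```

    Fuzzy variants replace the crisp entries with triangular fuzzy numbers before extracting weights, but the normalization and consistency check follow the same pattern.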

  11. Natural language processing using online analytic processing for assessing recommendations in radiology reports.

    Science.gov (United States)

    Dang, Pragya A; Kalra, Mannudeep K; Blake, Michael A; Schultz, Thomas J; Stout, Markus; Lemay, Paul R; Freshman, David J; Halpern, Elkan F; Dreyer, Keith J

    2008-03-01

    The purpose of this study was to describe the use of natural language processing (NLP) and online analytic processing (OLAP) for assessing patterns in recommendations in unstructured radiology reports on the basis of patient and imaging characteristics, such as age, gender, referring physicians, radiology subspecialty, modality, indications, diseases, and patient status (inpatient vs outpatient). A database of 4,279,179 radiology reports from a single tertiary health care center during a 10-year period (1995-2004) was created. The database includes reports of computed tomography, magnetic resonance imaging, fluoroscopy, nuclear medicine, ultrasound, radiography, mammography, angiography, special procedures, and unclassified imaging tests, with patient demographics. A clinical data mining and analysis NLP program (Leximer, Nuance Inc, Burlington, Massachusetts) in conjunction with OLAP was used to classify reports into those with recommendations (I(REC)) and those without recommendations (N(REC)) for imaging and to determine I(REC) rates for different patient age groups, gender, imaging modalities, indications, diseases, subspecialties, and referring physicians. In addition, temporal trends for I(REC) were determined. There was a significant difference in the I(REC) rates among age groups, varying between 4.8% (10-19 years) and 9.5% (>70 years). OLAP revealed considerable differences between recommendation trends for different imaging modalities and other patient and imaging characteristics.
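    A toy stand-in for the NLP classification step: flagging a report as I(REC) when it contains recommendation language. The phrase list is a small hand-made illustration; Leximer's actual lexicon is proprietary and far richer.

```python
import re

# Hypothetical recommendation phrases; a production lexicon would also
# handle negation ("no follow-up needed") and report sectioning.
REC_PATTERN = re.compile(
    r"\b(recommend(ed|s)?|suggest(ed)?\s+follow[- ]up|"
    r"further\s+(imaging|evaluation)\s+is\s+(advised|recommended))\b",
    re.IGNORECASE,
)

def classify(report: str) -> str:
    """Return 'I_REC' if the report recommends further imaging, else 'N_REC'."""
    return "I_REC" if REC_PATTERN.search(report) else "N_REC"

reports = [
    "8 mm pulmonary nodule. Recommend chest CT in 6 months.",
    "No acute cardiopulmonary abnormality.",
]
print([classify(r) for r in reports])
```

    Once each report carries an I_REC/N_REC label plus its demographic and modality attributes, the OLAP layer is essentially a pivot over those dimensions.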

  12. An Analytical Method to Measure Free-Water Tritium in Foods using Azeotropic Distillation.

    Science.gov (United States)

    Soga, Keisuke; Kamei, Toshiyuki; Hachisuka, Akiko; Nishimaki-Mogami, Tomoko

    2016-01-01

    A series of accidents at the Fukushima Dai-ichi Nuclear Power Plant has raised concerns about the discharge of contaminated water containing tritium ((3)H) from the plant into the environment and into foods. In this study, we explored a convenient analytical method for measuring free-water (3)H in foods using liquid scintillation counting combined with azeotropic distillation. The detection limit was 10 Bq/L, corresponding to about 0.01% of 1 mSv/year. The (3)H recoveries were 85-90% in fruits, vegetables, meat and fish, 75-85% in rice and cereal crops, and less than 50% in sweets containing little water. We found that, in the case of sweets, adding water to the sample before azeotropic distillation increased the recovery and precision; the recoveries then reached more than 75% and the RSD was less than 10% in all 13 food categories. Considering its sensitivity, precision and simplicity, this method is practical and useful for (3)H analysis in various foods and should be suitable for food safety assessment. In addition, we examined the level of (3)H in foods on the Japanese market; no (3)H radioactivity was detected in any of the 42 foods analyzed.

  13. Determination of flexibility factors in curved pipes with end restraints using a semi-analytic formulation

    International Nuclear Information System (INIS)

    Fonseca, E.M.M.; Melo, F.J.M.Q. de; Oliveira, C.A.M.

    2002-01-01

    Piping systems are structural sets used in the chemical industry, conventional or nuclear power plants, and fluid transport in general-purpose process equipment. They include curved elements built as parts of toroidal thin-walled structures. The mechanical behaviour of such structural assemblies is of leading importance for the satisfactory performance and safety standards of these installations. This paper presents a semi-analytic formulation based on Fourier trigonometric series for solving the pure bending problem in curved pipes, where a pipe element is treated as part of a toroidal shell. A displacement-based pipe element was developed using Fourier series, and the problem reduces to a system of differential equations solved with mathematical software. To build up the solution, a simple but efficient deformation model based on semi-membrane behaviour was adopted, consistent with the geometry and the thin-shell assumption. The flexibility factors are compared with the ASME code for some elbow dimensions adopted from ISO 1127. The stress field distribution was also calculated.

  14. The use of the analytic hierarchy process to aid decision making in acquired equinovarus deformity.

    Science.gov (United States)

    van Til, Janine A; Renzenbrink, Gerbert J; Dolan, James G; Ijzerman, Maarten J

    2008-03-01

    To increase the transparency of decision making about treatment in patients with equinovarus deformity poststroke, the analytic hierarchy process (AHP) was used as a structured methodology for studying the subjective rationale behind the choice of treatment. An 8-hour meeting was held at a centrally located rehabilitation center in The Netherlands, during which a patient video was shown to all participants (using a personal computer and a large screen) and the patient details were provided on paper. The panel comprised 10 health professionals from different backgrounds. The criteria were the performance of the applicable treatments on outcome, impact, comfort, cosmetics, daily effort, and risks and side effects of treatment, together with the relative importance of these criteria in the choice of treatment. According to the model, soft-tissue surgery (.413) ranked first as the preferred treatment, followed by orthopedic footwear (.181), ankle-foot orthosis (.147), surface electrostimulation (.137), and finally implanted electrostimulation (.123). Outcome was the most influential consideration affecting treatment choice (.509), followed by risks and side effects (.194), comfort (.104), daily effort (.098), cosmetics (.065), and impact of treatment (.030). Soft-tissue surgery was judged best on outcome, daily effort, comfortable shoe wear, and a cosmetically acceptable result, and was thereby preferred as a treatment alternative by the panel in this study. In contrast, an orthosis and orthopedic footwear are usually preferred in daily practice. The AHP was found to be a suitable methodology for eliciting subjective opinions and quantitatively comparing treatments in the absence of scientific evidence.

  15. An analytical solution for two-dimensional vacuum preloading combined with electro-osmosis consolidation using EKG electrodes.

    Directory of Open Access Journals (Sweden)

    Yang Shen

    Full Text Available China is a country with vast territory, but economic development and population growth have reduced its usable land resources in recent years. Reclamation by pumping and filling is therefore carried out in the eastern coastal regions of China to meet the needs of urbanization. Large areas of reclaimed land, however, need rapid drainage consolidation treatment. Building on past research into improving the treatment efficiency of soft clay using vacuum preloading combined with electro-osmosis, a two-dimensional drainage plane model was proposed according to the Terzaghi and Esrig consolidation theory. An analytical solution for this two-dimensional plane model had not previously been derived, and existing analytical solutions can neither provide a thorough theoretical analysis of practical engineering nor offer relevant guidance. Considering the smear effect and a rectangular arrangement pattern, an analytical solution is derived that describes the pore-water behaviour and the consolidation process when EKG (electro-kinetic geosynthetics) materials are used; the functions of EKG materials include drainage, electrical conduction and corrosion resistance. Comparison with test results is carried out to verify the analytical solution. It is found that the measured value is larger than the applied vacuum degree because of the stacking effect of vacuum preloading and electro-osmosis, while the trends of the mean measured and mean analytical values are comparable. The consolidation model can therefore accurately assess the change in pore-water pressure and the consolidation process during vacuum preloading combined with electro-osmosis.

  16. An analytical solution for two-dimensional vacuum preloading combined with electro-osmosis consolidation using EKG electrodes

    Science.gov (United States)

    Qiu, Chenchen; Li, Yande

    2017-01-01

    China is a country with vast territory, but economic development and population growth have reduced its usable land resources in recent years. Reclamation by pumping and filling is therefore carried out in the eastern coastal regions of China to meet the needs of urbanization. Large areas of reclaimed land, however, need rapid drainage consolidation treatment. Building on past research into improving the treatment efficiency of soft clay using vacuum preloading combined with electro-osmosis, a two-dimensional drainage plane model was proposed according to the Terzaghi and Esrig consolidation theory. An analytical solution for this two-dimensional plane model had not previously been derived, and existing analytical solutions can neither provide a thorough theoretical analysis of practical engineering nor offer relevant guidance. Considering the smear effect and a rectangular arrangement pattern, an analytical solution is derived that describes the pore-water behaviour and the consolidation process when EKG (electro-kinetic geosynthetics) materials are used; the functions of EKG materials include drainage, electrical conduction and corrosion resistance. Comparison with test results is carried out to verify the analytical solution. It is found that the measured value is larger than the applied vacuum degree because of the stacking effect of vacuum preloading and electro-osmosis, while the trends of the mean measured and mean analytical values are comparable. The consolidation model can therefore accurately assess the change in pore-water pressure and the consolidation process during vacuum preloading combined with electro-osmosis. PMID:28771496
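    The Terzaghi consolidation theory that both records build on has a classic one-dimensional series solution for the average degree of consolidation, U(Tv) = 1 - Σ (2/M²) exp(-M² Tv) with M = π(2m+1)/2, which serves as a baseline sanity check for any extended model:

```python
import math

def degree_of_consolidation(Tv: float, terms: int = 100) -> float:
    """Terzaghi 1D average degree of consolidation for time factor Tv."""
    U = 1.0
    for m in range(terms):
        M = math.pi * (2 * m + 1) / 2
        U -= (2 / M**2) * math.exp(-M**2 * Tv)
    return U

# Textbook landmarks: Tv = 0.197 gives U = 50%, Tv = 0.848 gives U = 90%.
for Tv in (0.05, 0.197, 0.848):
    print(Tv, round(degree_of_consolidation(Tv), 3))
```

    The combined vacuum/electro-osmosis solution in the paper generalizes this kind of series to two dimensions with electro-osmotic boundary terms, but it must reduce to this behaviour in the purely hydraulic limit.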

  17. Analysis and analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Batuecas Rodriguez, T [Department of Chemistry and Isotopes, Junta de Energia Nuclear, Madrid (Spain)

    1967-01-01

    The technology associated with the use of organic coolants in nuclear reactors depends to a large extent on the determination and control of their physical and chemical properties, and particularly on the viability, speed, sensitivity, precision and accuracy (depending on the intended usage) of the methods employed in detection and analytical determination. This has led to the study and development of numerous techniques, some specially designed for the extreme conditions involved in working with the types of product in question and others adapted from existing techniques. In the specific case of polyphenyl and hydropolyphenyl mixtures, which have been the principal subjects of study to date and offer the greatest promise, the analytical problems are broadly as follows: composition of the initial product or virgin coolant (composition of macro components and amounts of organic and inorganic impurities); the coolant during and after operation (determination of gases and organic compounds produced by pyrolysis and radiolysis, i.e. degradation and polymerization products); control of systems for purifying and regenerating the coolant after use; dissolved pressurization gases; detection of intermediate products of decomposition, which are generally very unstable (free radicals); degree of fouling and film formation, with tests to determine the potential formation of films; corrosion of structural elements and canning materials; and health and safety (toxicity, inflammability and impurities that can be activated). Although some of the above problems are closely interrelated and entail similar techniques, they vary in degree of difficulty. A further difficulty is that of distinguishing clearly between techniques for determining physical and physico-chemical properties, on the one hand, and analytical techniques on the other. Any classification is therefore somewhat arbitrary (as, for example, in the case of dosimetry and techniques for determining mean molecular weights or electrical conductivity).

  18. Enhancing Competitiveness Business Strategy of Organic Vegetables Using Analytical Hierarchy Process (AHP

    Directory of Open Access Journals (Sweden)

    Pristiana Widyastuti

    2017-09-01

    Full Text Available Public awareness of healthy lifestyles leads people to want to understand more about the food they consume, and choosing organic vegetables is one way to pursue a healthy body and a healthy lifestyle. Unfortunately, few organic vegetable farmers in Indonesia succeed in capturing the organic vegetable market rather than the non-organic one, and competition from organic vegetables imported into Indonesia prevents farmers from thriving. This study aims to: (1) analyze the factors affecting the competitiveness of the organic vegetable market; (2) analyze appropriate strategies for increasing the competitiveness of the organic vegetable market; and (3) analyze the priority factors in strategies for improving that competitiveness. Porter's generic strategies model and the analytic hierarchy process (AHP) are used to determine the best strategy. The research found that organic vegetable marketing channels are still dominated by the conventional market, and that the intensive cultivation of organic vegetables entails higher costs. The main strategy derived from the analysis is to focus on market delivery: retailers of organic vegetables, whether modern or traditional, are needed to display these products. The establishment of organic vegetable outlets and of online marketing that does not depend on large retail chains (hypermarkets) are the recommendations of this study.

  19. Who Uses Mobile Phone Health Apps and Does Use Matter? A Secondary Data Analytics Approach.

    Science.gov (United States)

    Carroll, Jennifer K; Moorhead, Anne; Bond, Raymond; LeBlanc, William G; Petrella, Robert J; Fiscella, Kevin

    2017-04-19

    Mobile phone use and the adoption of healthy lifestyle software apps ("health apps") are rapidly proliferating, yet there is limited information on the users of health apps in terms of their sociodemographic and health characteristics, intentions to change, and actual health behaviors. The objectives of our study were (1) to describe the sociodemographic characteristics associated with health app use in a recent US nationally representative sample; (2) to assess the attitudinal and behavioral predictors of the use of health apps for health promotion; and (3) to examine the association between the use of health-related apps and meeting the recommended guidelines for fruit and vegetable intake and physical activity. Data on users of mobile devices and health apps were analyzed from the National Cancer Institute's 2015 Health Information National Trends Survey (HINTS), which was designed to provide nationally representative estimates for health information in the United States and is publicly available on the Internet. We used multivariable logistic regression models to assess sociodemographic predictors of mobile device and health app use and to examine the associations between app use, intentions to change behavior, and actual behavioral change for fruit and vegetable consumption, physical activity, and weight loss. Among the 3677 HINTS respondents, older age (45-64 years: odds ratio [OR] 0.56, 95% CI 0.47-0.68; 65+ years: OR 0.19, 95% CI 0.14-0.24), male gender (OR 0.80, 95% CI 0.66-0.94), and having a degree (OR 2.83, 95% CI 2.18-3.70) or less than a high school education (OR 0.43, 95% CI 0.24-0.72) were all significantly associated with a reduced likelihood of having adopted health apps. Similarly, both age and education were significant variables in predicting whether a person had adopted a mobile device, especially if that person was a college graduate (OR 3.30). Individuals with apps were significantly more likely to report intentions to improve fruit (63
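    The odds ratios reported above can be reproduced mechanically from a 2x2 contingency table. The counts below are invented for illustration and are not HINTS data.

```python
import math

# Hypothetical 2x2 table of health-app adoption by age group:
#                 app user   non-user
# age 18-44          300        200
# age 65+             60        240
a, b, c, d = 300, 200, 60, 240

odds_ratio = (a * d) / (b * c)                  # (a/b) / (c/d)

# Wald 95% confidence interval on the log-odds scale.
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(round(odds_ratio, 2), round(lo, 2), round(hi, 2))
```

    A multivariable logistic regression, as used in the study, produces the same kind of OR-plus-CI output while adjusting each predictor for the others.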

  20. A Development of Group Decision Support System for Strategic Item Classification using Analytic Hierarchy Process

    International Nuclear Information System (INIS)

    Yoon, Sung Ho; Tae, Jae Woong; Yang, Seung Hyo; Shin, Dong Hoon

    2016-01-01

    Korea has carried out export controls on nuclear items that reflect the Nuclear Suppliers Group (NSG) guidelines (Notice on Trade of Strategic Item of the Foreign Trade Act) since joining the NSG in 1995. Nuclear export control starts with classification, which determines whether export items are relevant to nuclear proliferation according to the NSG guidelines. However, owing to the qualitative character of the definitions of nuclear items in the guidelines, classification requires a great deal of time and effort to reach a consensus. The aim of this study is to provide an analysis of an experts' group decision support system (GDSS) based on the analytic hierarchy process (AHP) for the classification of strategic items. The results clearly demonstrated that a GDSS based on the AHP performs well, systematically providing relative weights among the planning variables and objectives. Using the AHP, the subjective judgements of reviewers can be quantified, and an order of priority is derived from the resulting numerical values. The verbal and fuzzy measurements of the AHP enable reviewers in a GDSS to reach a consensus: the AHP yields common weight factors, i.e. the priority of each attribute, that represent the views of the entire group, bringing to decision making the consistency that is important for classification.