WorldWideScience

Sample records for igto implementation analyses

  1. Implementing partnerships in nonreactor facility safety analyses

    International Nuclear Information System (INIS)

    Courtney, J.C.; Perry, W.H.; Phipps, R.D.

    1996-01-01

    Faculty and students from LSU have been participating in nuclear safety analyses and radiation protection projects at ANL-W at INEL since 1973. A mutually beneficial relationship has evolved that has resulted in generation of safety-related studies acceptable to Argonne and DOE, NRC, and state regulatory groups. Most of the safety projects have involved the Hot Fuel Examination Facility or the Fuel Conditioning Facility; both are hot cells that receive spent fuel from EBR-II. A table shows some of the major projects at ANL-W that involved LSU students and faculty

  2. Understanding ERP system implementation in a hospital by analysing stakeholders

    NARCIS (Netherlands)

    Boonstra, A.; Govers, M.

    Implementing enterprise resource planning (ERP) systems requires significant organisational, as well as technical, changes. These will affect stakeholders with varying perspectives and interests in the system. This is particularly the case in health care, as a feature of this sector is that

  3. Fast FPGA Implementation of an Original Impedance Analyser

    Directory of Open Access Journals (Sweden)

    Abdulrahman HAMED

    2011-02-01

    This article describes in detail the design and rapid prototyping of an embedded impedance analyzer. The measurement principle is based on the feedback control of the excitation voltage VD during a fast frequency sweep. This function is carried out by a high precision synthesizer whose output resistance RG is digitally adjustable. Real and imaginary parts of the dipole impedance are determined from RG and the phase of VD. The digital architecture design uses hardware-in-the-loop simulation in which the dipole is modeled using an RLC parallel circuit and a Butterworth-Van Dyke structure. All digital functions are implemented on a Stratix II FPGA board with a 100 MHz clock. The parameters taken into account are the frequency range (0 to 5 MHz), the speed and resolution of the analysis, and the quality factor of the resonant dipole. To reduce the analysis duration, the frequency sweeping rate is adjusted in real time.
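    The abstract describes the measurement principle only in outline. As a rough illustration (not the authors' FPGA signal chain), the snippet below assumes the dipole is fed through the adjustable output resistance RG as a simple voltage divider, so the complex impedance follows from RG, the generator amplitude VG and the measured magnitude and phase of VD; the function and variable names are hypothetical.

```python
import cmath

def dipole_impedance(r_g, v_g, v_d_mag, v_d_phase_rad):
    """Voltage-divider estimate of a dipole impedance Z (illustrative only).

    Assumes V_D = V_G * Z / (Z + R_G), i.e. the generator with output
    resistance R_G drives the dipole directly; then Z = R_G * V_D / (V_G - V_D).
    """
    v_d = cmath.rect(v_d_mag, v_d_phase_rad)  # complex dipole voltage
    return r_g * v_d / (v_g - v_d)

# Example: RG = 1 kOhm, VG = 1 V, measured |VD| = 0.5 V at +30 degrees
z = dipole_impedance(1e3, 1.0, 0.5, cmath.pi / 6)
print(f"Re(Z) = {z.real:.1f} Ohm, Im(Z) = {z.imag:.1f} Ohm")
```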

  4. Analysing the differences between theoretical and implemented supply chain strategies in selected organisations

    OpenAIRE

    Danie J. Nel; Johanna A. Badenhorst-Weiss

    2011-01-01

    Organisations can use supply chain strategies to gain a competitive advantage for the supply chain. A competitive advantage can be achieved by means of low cost or by means of differentiation. However, organisations have to implement the correct supply chain strategy. Returns on investment can be compromised if organisations implement an incorrect supply chain strategy. The objective of the article is to analyse the differences between theoretically implied and implemented supply chain strate...

  5. SOCR Analyses: Implementation and Demonstration of a New Graphical Statistics Educational Toolkit

    Directory of Open Access Journals (Sweden)

    Annie Chu

    2009-04-01

    The web-based, Java-written SOCR (Statistical Online Computational Resource) tools have been utilized in many undergraduate and graduate level statistics courses for seven years now (Dinov 2006; Dinov et al. 2008b). It has been proven that these resources can successfully improve students' learning (Dinov et al. 2008b). First published online in 2005, SOCR Analyses is a relatively new component that concentrates on data modeling for both parametric and non-parametric data analyses with graphical model diagnostics. One of the main purposes of SOCR Analyses is to facilitate statistical learning for high school and undergraduate students. As SOCR Distributions and Experiments have already been implemented, SOCR Analyses and Charts fulfill the rest of a standard statistics curriculum. Currently, there are four core components of SOCR Analyses. Linear models included in SOCR Analyses are simple linear regression, multiple linear regression, and one-way and two-way ANOVA. Tests for sample comparisons include the t-test in the parametric category. Some examples of SOCR Analyses' tests in the non-parametric category are the Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, Kolmogorov-Smirnov test and Fligner-Killeen test. Hypothesis testing models include the contingency table, Friedman's test and Fisher's exact test. The last component of Analyses is a utility for computing sample sizes for the normal distribution. In this article, we present the design framework, computational implementation and the utilization of SOCR Analyses.
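    The analyses listed above are standard procedures; as a rough illustration of the same computations outside the SOCR toolkit (this is not the SOCR Java API), the snippet below runs a simple linear regression, a two-sample t-test and a Kruskal-Wallis test with SciPy on synthetic data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simple linear regression (one of the SOCR "linear models")
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 1, x.size)
reg = stats.linregress(x, y)
print(f"slope = {reg.slope:.2f}, intercept = {reg.intercept:.2f}")

# Parametric two-sample comparison: t-test
a, b = rng.normal(0, 1, 30), rng.normal(0.5, 1, 30)
print("t-test p-value:", stats.ttest_ind(a, b).pvalue)

# Non-parametric comparison across three groups: Kruskal-Wallis
g1, g2, g3 = (rng.normal(m, 1, 20) for m in (0.0, 0.3, 0.6))
print("Kruskal-Wallis p-value:", stats.kruskal(g1, g2, g3).pvalue)
```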

  6. The Implementation of Police Reform in Bosnia and Herzegovina: Analysing UN and EU Efforts

    Directory of Open Access Journals (Sweden)

    Amelia Padurariu

    2014-01-01

    This article analyses the role of the main international actors involved in the implementation of police reform in post-conflict Bosnia and Herzegovina, notably that of the UN and the EU. Despite considerable efforts and resources deployed over 17 years, the implementation of police reform remains an ‘unfinished business’ that demonstrates the slow pace of implementing rule of law reforms in Bosnia’s post-conflict setting, yet, in the long term, remains vital for Bosnia’s stability and post-conflict reconstruction process. Starting with a presentation of the status of the police before and after the conflict, UN reforms (1995–2002) are first discussed in order to set the stage for an analysis of the role of the EU in the implementation of police reform. Here, particular emphasis is placed on the institution-building actions of the EU police mission in Bosnia and Herzegovina, deployed on the ground for almost a decade (2003–June 2012). The article concludes with an overall assessment of UN and EU efforts in post-conflict Bosnia and Herzegovina, including the remaining challenges encountered by the EU on the ground as the current leader of police reform implementation efforts. More generally, the article highlights that for police reform to succeed in the long term, from 2012 onwards, the EU should pay particular attention to the political level, where most of the stumbling blocks for the implementation of police reform lie.

  7. Current regulatory developments concerning the implementation of probabilistic safety analyses for external hazards in Germany

    International Nuclear Information System (INIS)

    Krauss, Matias; Berg, Heinz-Peter

    2014-01-01

    The Federal Ministry for the Environment, Nature Conservation and Nuclear Safety (BMU) initiated in September 2003 a comprehensive program for the revision of the national nuclear safety regulations, which was successfully completed in November 2012. These nuclear regulations take into account the current recommendations of the International Atomic Energy Agency (IAEA) and the Western European Nuclear Regulators Association (WENRA). In this context, the recommendations and guidelines of the Nuclear Safety Standards Commission (KTA) and the technical documents elaborated by the respective expert group on Probabilistic Safety Analysis for Nuclear Power Plants (FAK PSA) are being updated or are in the final process of completion. A main topic of the revision was the issue of external hazards. As part of this process, and in the light of the accident at Fukushima and the findings of the related actions resulting in safety reviews of nuclear power plants at the national level in Germany and at the European level, a revision of all relevant standards and documents has been made, especially the recommendations of KTA and FAK PSA. In that context, not only design issues with respect to events such as earthquakes and floods have been discussed, but also methodological issues regarding the implementation of improved probabilistic safety analyses on this topic. As a result of the revision of the KTA 2201 series 'Design of Nuclear Power Plants against Seismic Events' with its parts 1 to 6, part 1 'Principles' was published as the first standard in November 2011, followed by the revised versions of KTA 2201.2 (soil) and 2201.4 (systems and components) in 2012. The modified standard KTA 2201.3 (structures) is expected to be issued before the end of 2013. For part 5 (seismic instrumentation) and part 6 (post-seismic actions), draft amendments are expected in 2013. The expert group 'Probabilistic Safety Assessments for Nuclear Power Plants' (FAK PSA) is an advisory body of the Federal

  8. Analysing the agricultural cost and non-market benefits of implementing the water framework directive

    NARCIS (Netherlands)

    Bateman, I.J.; Brouwer, R.; Davies, H.; Day, B.H.; Deflandre, A.; Di Falco, S.; Georgiou, S.; Hadley, D.; Hutchins, M.; Jones, A.P.; Kay, D.; Leeks, G.; Lewis, M.; Lovett, A.A.; Neal, C.; Posen, P.; Rigby, D.; Turner, R.K.

    2006-01-01

    Implementation of the Water Framework Directive (WFD) represents a fundamental change in the management of water in Europe with a requirement that member states ensure 'good ecological status' for all water bodies by 2015. Agriculture is expected to bear a major share of WFD implementation costs as

  9. Simple Crosscutting Concerns Are Not So Simple : Analysing Variability in Large-Scale Idioms-Based Implementations

    NARCIS (Netherlands)

    Bruntink, M.; Van Deursen, A.; d’Hondt, M.; Tourwé, T.

    2007-01-01

    This paper describes a method for studying idioms-based implementations of crosscutting concerns, and our experiences with it in the context of a real-world, large-scale embedded software system. In particular, we analyse a seemingly simple concern, tracing, and show that it exhibits significant

  10. Benefit Analyses of Technologies for Automatic Identification to Be Implemented in the Healthcare Sector

    Science.gov (United States)

    Krey, Mike; Schlatter, Ueli

    The tasks and objectives of automatic identification (Auto-ID) are to provide information on goods and products. It has already been established for years in the areas of logistics and trading and can no longer be ignored by the German healthcare sector. Some German hospitals have already discovered the capabilities of Auto-ID. Improvements in quality and safety and reductions in risk, cost and time are aspects and areas where improvements are achievable. Privacy protection, legal restraints, and the personal rights of patients and staff members are just a few aspects which make the health care sector a sensitive field for the implementation of Auto-ID. Auto-ID in this context comprises the different technologies, methods and products for the registration, provision and storage of relevant data. With the help of a quantifiable and science-based evaluation, an answer is sought as to which Auto-ID has the highest capability to be implemented in the healthcare business.

  11. Analyse of The Legal Framework in Colombia for implementation of Bioprospecting Practices

    International Nuclear Information System (INIS)

    Duarte, Oscar; Velho Lea

    2008-01-01

    The practice of bioprospecting is inherently linked with traditional knowledge and practices of local communities in the South as well as with the commercial activities of industries (e.g., pharmaceutics sector, agriculture) in the North. A series of actors operate at this interface, such as Non-Governmental Organizations (NGOs), Research Centers, Universities, Science and Technology sponsor institutions and the State. As these actors have divergent interests and powers of negotiation, an appropriate regulatory framework is necessary to regulate their interaction. This paper analyzes the existing legal framework in a mega-diverse country, like Colombia, for implementation of bioprospecting practices. The research consisted of two key components: (i) A review of the state of art of bioprospecting; (ii) A work in situ in Colombia, which consisted of analysis of information and genetic resources related to bioprospecting, participation in the implementation of a legal frame for bioprospecting practices and interviews with Colombian professionals in the field of biodiversity conservation. Our research determined that: (i) national authorities encounter a multitude of difficulties to implement a legal framework in Colombia, especially the Andean regional normativity; (ii) the execution of research projects related to bioprospecting in Colombia faces numerous challenges

  12. An evaluation system for electronic retrospective analyses in radiation oncology: implemented exemplarily for pancreatic cancer

    Science.gov (United States)

    Kessel, Kerstin A.; Jäger, Andreas; Bohn, Christian; Habermehl, Daniel; Zhang, Lanlan; Engelmann, Uwe; Bougatf, Nina; Bendl, Rolf; Debus, Jürgen; Combs, Stephanie E.

    2013-03-01

    To date, conducting retrospective clinical analyses is rather difficult and time consuming. Especially in radiation oncology, handling voluminous datasets from various information systems and different documentation styles efficiently is crucial for patient care and research. With the example of patients with pancreatic cancer treated with radio-chemotherapy, we performed a therapy evaluation by using analysis tools connected with a documentation system. A total number of 783 patients have been documented into a professional, web-based documentation system. Information about radiation therapy, diagnostic images and dose distributions have been imported. For patients with disease progression after neoadjuvant chemoradiation, we designed and established an analysis workflow. After automatic registration of the radiation plans with the follow-up images, the recurrence volumes are segmented manually. Based on these volumes the DVH (dose-volume histogram) statistic is calculated, followed by the determination of the dose applied to the region of recurrence. All results are stored in the database and included in statistical calculations. The main goal of using an automatic evaluation system is to reduce time and effort conducting clinical analyses, especially with large patient groups. We showed a first approach and use of some existing tools, however manual interaction is still necessary. Further steps need to be taken to enhance automation. Already, it has become apparent that the benefits of digital data management and analysis lie in the central storage of data and reusability of the results. Therefore, we intend to adapt the evaluation system to other types of tumors in radiation oncology.
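    The final step of the workflow, turning the registered dose grid and the segmented recurrence volume into a DVH and dose statistics, is sketched below in a minimal form; the arrays, bin width and dose levels are illustrative and this is not the authors' evaluation system.

```python
import numpy as np

def cumulative_dvh(dose, mask, bin_width=0.5):
    """Cumulative dose-volume histogram over the voxels inside `mask`.

    dose : 3D array of dose values (Gy), registered to the follow-up image
    mask : boolean 3D array marking the segmented recurrence volume
    Returns dose levels (Gy) and the fraction of the volume receiving at least that dose.
    """
    d = dose[mask]
    levels = np.arange(0.0, d.max() + bin_width, bin_width)
    volume_fraction = np.array([(d >= lvl).mean() for lvl in levels])
    return levels, volume_fraction

# Toy example: random dose cube with a spherical "recurrence" region
dose = np.random.default_rng(1).uniform(30, 60, (20, 20, 20))
zz, yy, xx = np.mgrid[:20, :20, :20]
mask = (xx - 10) ** 2 + (yy - 10) ** 2 + (zz - 10) ** 2 < 36

levels, vf = cumulative_dvh(dose, mask)
print(f"mean dose {dose[mask].mean():.1f} Gy, "
      f"D98 {np.percentile(dose[mask], 2):.1f} Gy")  # near-minimum dose
```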

  13. Structural Performance’s Optimally Analysing and Implementing Based on ANSYS Technology

    Science.gov (United States)

    Han, Na; Wang, Xuquan; Yue, Haifang; Sun, Jiandong; Wu, Yongchun

    2017-06-01

    Computer-aided engineering (CAE) is a hotspot both in the academic field and in modern engineering practice. The Analysis System (ANSYS) simulation software has become an outstanding member of the CAE family for its excellent performance; it is committed to innovation in engineering simulation to help users shorten the design process and improve product innovation and performance. Aiming to explore a structural performance optimization analysis model for engineering enterprises, this paper introduced CAE and its development, analyzed the necessity of structural optimization analysis as well as the framework of structural optimization analysis based on ANSYS technology, and used ANSYS to carry out an optimization analysis of the structural performance of a reinforced concrete slab, displaying the displacement vector chart and the stress intensity chart. Finally, this paper compared the ANSYS simulation results with the measured results, showing that ANSYS is an indispensable engineering calculation tool.

  14. Statistical Analyses and Modeling of the Implementation of Agile Manufacturing Tactics in Industrial Firms

    Directory of Open Access Journals (Sweden)

    Mohammad D. AL-Tahat

    2012-01-01

    This paper provides a review and introduction to agile manufacturing. Tactics of agile manufacturing are mapped into different production areas (eight latent constructs: manufacturing equipment and technology, processes technology and know-how, quality and productivity improvement, production planning and control, shop floor management, product design and development, supplier relationship management, and customer relationship management). The implementation level of agile manufacturing tactics is investigated in each area. A structural equation model is proposed. Hypotheses are formulated. Feedback from 456 firms is collected using a five-point Likert-scale questionnaire. Statistical analysis is carried out using IBM SPSS and AMOS. Multicollinearity, content validity, consistency, construct validity, ANOVA analysis, and relationships between agile components are tested. The results of this study prove that the agile manufacturing tactics have a positive effect on the overall agility level. This conclusion can be used by manufacturing firms to manage challenges when trying to be agile.

  15. Design and implementation of a modular program system for the carrying-through of statistical analyses

    International Nuclear Information System (INIS)

    Beck, W.

    1984-01-01

    The complexity of computer programs for the solution of scientific and technical problems gives rise to a number of questions. Typical questions concern the strengths and weaknesses of computer programs, the propagation of uncertainties among the input data, the sensitivity of output data to input data, and the substitution of complex models by simpler ones which provide equivalent results in certain ranges. These questions are of general practical relevance; answers in principle may be found by statistical methods based on the Monte Carlo method. In this report the statistical methods are chosen, described and evaluated. They are implemented in the modular program system STAR, which is a separate component of the program system RSYST. The design of STAR takes into account users with different levels of knowledge of data processing and statistics, the variety of statistical methods and of generating and evaluating procedures, the processing of large data sets in complex structures, the coupling to other components of RSYST and to programs outside RSYST, and the requirement that the system can easily be modified and enlarged. Four examples are given which demonstrate the application of STAR. (orig.) [de
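    The Monte Carlo idea behind STAR, sample the uncertain inputs, rerun the model, and read off uncertainty and sensitivity measures from the resulting sample, can be sketched as follows; the stand-in model and distributions are purely illustrative and this is not STAR or RSYST code.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000

def model(x1, x2, x3):
    # Stand-in for the complex program under study
    return x1 ** 2 + 0.5 * x2 + 0.1 * x3

# Sample the uncertain inputs from assumed distributions
x1 = rng.normal(1.0, 0.1, n)
x2 = rng.uniform(0.8, 1.2, n)
x3 = rng.normal(5.0, 1.0, n)
y = model(x1, x2, x3)

# Propagated output uncertainty
print(f"mean = {y.mean():.3f}, std = {y.std(ddof=1):.3f}")

# Simple sensitivity measure: correlation of each input with the output
for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]:
    print(f"corr({name}, y) = {np.corrcoef(x, y)[0, 1]:+.2f}")
```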

  16. Implementation of a laboratory information management system for environmental regulatory analyses

    Energy Technology Data Exchange (ETDEWEB)

    Spencer, W.A.; Aiken, H.B.; Spatz, T.L.; Miles, W.F.; Griffin, J.C.

    1993-09-07

    The Savannah River Technology Center created a second instance of its ORACLE-based PEN LIMS to support site Environmental Restoration projects. The first instance of the database had been optimized for R&D support and did not implement the rigorous sample tracking, verification, and holding times needed to support regulatory commitments. Much of the R&D instance was transferable, such as the work control functions for backlog reports, work assignment sheets, and hazard communication support. A major enhancement of the regulatory LIMS was the addition of features to support a "standardized" electronic data format for environmental data reporting. The electronic format, called "AN92", was developed by the site environmental monitoring organization and applies to both onsite and offsite environmental analytical contracts. This format incorporates EPA CLP data validation codes as well as detailed holding time and analytical result reporting requirements. The authors support this format by using special SQL queries to the database. The data is then automatically transferred to the environmental databases for trending and geological mapping.

  17. Evaluating the Accuracy of Results for Teacher Implemented Trial-Based Functional Analyses.

    Science.gov (United States)

    Rispoli, Mandy; Ninci, Jennifer; Burke, Mack D; Zaini, Samar; Hatton, Heather; Sanchez, Lisa

    2015-09-01

    Trial-based functional analysis (TBFA) allows for the systematic and experimental assessment of challenging behavior in applied settings. The purposes of this study were to evaluate a professional development package focused on training three Head Start teachers to conduct TBFAs with fidelity during ongoing classroom routines. To assess the accuracy of the TBFA results, the effects of a function-based intervention derived from the TBFA were compared with the effects of a non-function-based intervention. Data were collected on child challenging behavior and appropriate communication. An A-B-A-C-D design was utilized in which A represented baseline, and B and C consisted of either function-based or non-function-based interventions counterbalanced across participants, and D represented teacher implementation of the most effective intervention. Results showed that the function-based intervention produced greater decreases in challenging behavior and greater increases in appropriate communication than the non-function-based intervention for all three children. © The Author(s) 2015.

  18. Dynamic analyses, FPGA implementation and engineering applications of multi-butterfly chaotic attractors generated from generalised Sprott C system

    Science.gov (United States)

    Lai, Qiang; Zhao, Xiao-Wen; Rajagopal, Karthikeyan; Xu, Guanghui; Akgul, Akif; Guleryuz, Emre

    2018-01-01

    This paper considers the generation of multi-butterfly chaotic attractors from a generalised Sprott C system with multiple non-hyperbolic equilibria. The system is constructed by introducing an additional variable whose derivative has a switching function to the Sprott C system. It is numerically found that the system creates two-, three-, four-, five-butterfly attractors and any other multi-butterfly attractors. First, the dynamic analyses of multi-butterfly chaotic attractors are presented. Secondly, the field programmable gate array implementation, electronic circuit realisation and random number generator are done with the multi-butterfly chaotic attractors.
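    The abstract does not reproduce the modified equations. For orientation, the sketch below integrates the Sprott C system in its usual form (x' = yz, y' = x - y, z' = 1 - x^2); the generalised system of the paper adds a further state variable whose derivative contains a switching function, which is not modelled here.

```python
import numpy as np
from scipy.integrate import solve_ivp

def sprott_c(t, s):
    """Sprott C system in its commonly quoted form; the paper's generalised
    version adds a fourth state with a switching function (omitted here)."""
    x, y, z = s
    return [y * z, x - y, 1.0 - x ** 2]

sol = solve_ivp(sprott_c, (0.0, 200.0), [0.1, 0.0, 0.0], max_step=0.01)
x, y, z = sol.y
print("x range:", round(x.min(), 2), "to", round(x.max(), 2))  # bounded, aperiodic
```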

  19. Secondary Data Analyses of Conclusions Drawn by the Program Implementers of a Positive Youth Development Program in Hong Kong

    Directory of Open Access Journals (Sweden)

    Andrew M. H. Siu

    2010-01-01

    The Tier 2 Program of the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) is designed for adolescents with significant psychosocial needs, and its various programs are designed and implemented by social workers (program implementers) for specific student groups in different schools. Using subjective outcome evaluation data collected from the program participants (Form C) at 207 schools, the program implementers were asked to aggregate the data and write down five conclusions (n = 1,035) in their evaluation reports. The conclusions stated in the evaluation reports were further analyzed via secondary data analyses in this study. Results showed that the participants regarded the Tier 2 Program as a success and found it effective in enhancing self-understanding, interpersonal skills, and self-management. They liked the experiential learning approach and activities that are novel, interesting, diversified, adventure-based, and outdoor in nature. They also liked instructors who were friendly, supportive, well-prepared, and able to bring challenges and give positive recognition. Most of the difficulties encountered in running the programs were related to time constraints, clashes with other activities, and motivation of participants. Consistent with previous evaluation findings, the present study suggests that the Tier 2 Program was well received by the participants and that it was beneficial to the development of the program participants.

  20. Combining Geoelectrical Measurements and CO2 Analyses to Monitor the Enhanced Bioremediation of Hydrocarbon-Contaminated Soils: A Field Implementation

    Directory of Open Access Journals (Sweden)

    Cécile Noel

    2016-01-01

    Hydrocarbon-contaminated aquifers can be successfully remediated through enhanced biodegradation. However, in situ monitoring of the treatment by piezometers is expensive and invasive and might be insufficient, as the information provided is restricted to vertical profiles at discrete locations. An alternative method was tested in order to improve the robustness of the monitoring. Geophysical methods, electrical resistivity (ER) and induced polarization (IP), were combined with gas analyses, CO2 concentration and its carbon isotopic ratio, to develop a less invasive methodology for monitoring enhanced biodegradation of hydrocarbons. The field implementation of this monitoring methodology, which lasted from February 2014 until June 2015, was carried out at a BTEX-polluted site under aerobic biotreatment. Geophysical monitoring shows a more conductive and chargeable area which corresponds to the contaminated zone. In this area, high CO2 emissions have been measured, with an isotopic signature demonstrating that the main source of CO2 on this site is the biodegradation of hydrocarbon fuels. Besides, the evolution of geochemical and geophysical data over a year seems to show the seasonal variation of bacterial activity. Combining geophysics with gas analyses is thus promising to provide a new methodology for in situ monitoring.

  1. Design and implementation of an automatic acquisition card with direct memory incrementing intended for a multichannel analyser

    International Nuclear Information System (INIS)

    Al-Ani, Tarik Hesen

    1984-01-01

    This study presents a contribution to the implementation of a multichannel analyser, based on recent technology in order to give elaborated results to the user. This instrument will be designed using modular cards compatible with an Intel, Multi-bus System. The main purpose of this thesis consists in the study and design of the logical card establishing automatically an histogram in the memory of a micro-computer (Direct Memory Increment or DMI). This card allows the connection of up to four analog to digital converters and does the incrementing of the data in memory at the address delivered by an analog to digital Wilkinson type converter (400 MHz) designed at CEA. It allows: - 4 independent inputs working simultaneously with an average input rate of 87500 events/second for 16000 channels of 2 32 bits and with a time resolution of 1 μs; - 3 acquisition modes: histogram, multi-scaling and list; - calculation of the real and live times independently for the 4 inputs. In addition, this card provides the interfacing capability for a line printer, a sample driver and a 'mouse'. (author) [fr
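    The card's core operation, incrementing a memory word at the channel address delivered by the ADC, is the hardware form of histogramming; a software sketch of the same bookkeeping for one input, including a rough live-time estimate, is given below (it mirrors the function, not the hardware design).

```python
import numpy as np

N_CHANNELS = 16000
histogram = np.zeros(N_CHANNELS, dtype=np.uint32)  # one counter per channel

def process_events(adc_addresses, dead_time_per_event=1e-6, real_time=1.0):
    """Increment the histogram at each ADC-delivered address (direct memory
    increment, in software form) and return an approximate live time."""
    for addr in adc_addresses:
        histogram[addr] += 1
    return real_time - dead_time_per_event * len(adc_addresses)

# Example: 87 500 events in one second, matching the quoted average input rate
events = np.random.default_rng(0).integers(0, N_CHANNELS, 87_500)
live_time = process_events(events)
print(f"live time ~ {live_time:.4f} s, total counts: {int(histogram.sum())}")
```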

  2. Implementation of physical coordination training and cognitive behavioural training interventions at cleaning workplaces - secondary analyses of a randomised controlled trial

    DEFF Research Database (Denmark)

    Jørgensen, Marie B; Faber, Anne; Jespersen, Tobias

    2012-01-01

    This study evaluates the implementation of physical coordination training (PCT) and cognitive behavioural training (CBTr) interventions in a randomised controlled trial at nine cleaners' workplaces. Female cleaners (n = 294) were randomised into a PCT, a CBTr or a reference (REF) group. Both 12... ... intervention effects, more research on implementation is needed. Trial registration: ISRCTN96241850. Practitioner summary: Both physical coordination training and cognitive behavioural training are potentially effective workplace interventions among low-educated job groups with high physical work demands...

  3. Analysing Implementation of the European Standards and Guidelines for Quality Assurance at Institutional Level : Outcomes of the IBAR Project

    NARCIS (Netherlands)

    Westerheijden, Donald F.; Kohoutek, Jan

    2013-01-01

    The IBAR project studied barriers higher education institutions experienced to implementing the ESG part 1. Our paper reports on the major findings of this project. After sketching our conceptual approach, we conclude that the ESG Part 1 seem to be functioning as a codification of many policies and

  4. Implementation of an iron ore green pellet on-line size analyser at the QCMC pelletizing plant

    International Nuclear Information System (INIS)

    Bouajila, A.; Boivin, J.-A.; Ouellet, G.; Beaudin, S.

    1999-01-01

    This paper describes work into the design, implementation and performance evaluation of a 3D-image analysis system at the QCMC pelletizing plant. First, the measurement system is reviewed. Second, the ability of the system to achieve reliable, on-line results on a moving conveyor belt is presented and discussed. The problem of segregation caused by disk classification is particularly addressed, as it hinders full size distribution estimation from the top layer. Finally, pelletizing disk controllability is investigated. (author)

  5. Implementation of analyses based on social media data for marketing purposes in academic and scientific organizations in practice – opportunities and limitations

    Directory of Open Access Journals (Sweden)

    Magdalena Grabarczyk-Tokaj

    2013-12-01

    The article is focused on the practical use of analyses based on data collected in social media for institutions' communication and marketing purposes. The subject is discussed from the perspective of Digital Darwinism, a situation in which the development of technologies and new means of communication is significantly faster than the growth in knowledge and digital skills among organizations eager to implement those solutions. To diminish the negative consequences of Digital Darwinism, institutions can broaden their knowledge with analyses of data from cyberspace to optimize operations, and make use of ongoing dialogue and cooperation with prosumers to face dynamic changes in trends, technologies and society. Information acquired from social media user-generated content can be employed as guidelines in planning, running and evaluating communication and marketing activities. The article presents examples of tools and solutions that can be implemented in practice as support for actions taken by institutions.

  6. Implementing voice over Internet protocol in mobile ad hoc network – analysing its features regarding efficiency, reliability and security

    Directory of Open Access Journals (Sweden)

    Naveed Ahmed Sheikh

    2014-05-01

    Providing secure and efficient real-time voice communication in a mobile ad hoc network (MANET) environment is a challenging problem. Voice over Internet protocol (VoIP) has originally been developed over the past two decades for infrastructure-based networks. There are strict timing constraints for acceptable quality VoIP services, in addition to registration and discovery issues in VoIP end-points. In MANETs, the ad hoc nature of networks and the multi-hop wireless environment with significant packet loss and delays present formidable challenges to the implementation. Providing a secure real-time VoIP service on MANET is the main design objective of this paper. The authors have successfully developed a prototype system that establishes reliable and efficient VoIP communication and provides an extremely flexible method for voice communication in MANETs. The authors’ cooperative mesh-based MANET implementation can be used for rapidly deployable VoIP communication with survivable and efficient dynamic networking using open source software.

  7. Implementation into a CFD code of neutron kinetics and fuel pin models for nuclear reactor transient analyses

    International Nuclear Information System (INIS)

    Chen Zhao; Chen, Xue-Nong; Rineiski, Andrei; Zhao Pengcheng; Chen Hongli

    2014-01-01

    Safety analysis is an important tool for justifying the safety of nuclear reactors. The traditional method for nuclear reactor safety analysis is performed by means of system codes, which use a one-dimensional lumped-parameter method to model real reactor systems. However, many multi-dimensional thermal-hydraulic phenomena cannot be predicted using traditional one-dimensional system codes. This problem is extremely important for pool-type nuclear systems. Computational fluid dynamics (CFD) codes are powerful numerical simulation tools to solve multi-dimensional thermal-hydraulics problems, and they are widely used in industrial applications for single-phase flows. In order to use general CFD codes to solve nuclear reactor transient problems, some additional models beyond the general ones are required. A neutron kinetics model for power calculation and a fuel pin model for fuel pin temperature calculation are two important examples of these additional models. The motivation of this work is to develop an advanced numerical simulation method for nuclear reactor safety analysis by implementing a neutron kinetics model and a fuel pin model into general CFD codes. In this paper, the Point Kinetics Model (PKM) and Fuel Pin Model (FPM) are implemented into the general CFD code FLUENT. The improved FLUENT is called FLUENT/PK. The mathematical models and implementation method of FLUENT/PK are described, and two demonstration application cases, e.g. the unprotected transient overpower (UTOP) accident of a Liquid Metal cooled Fast Reactor (LMFR) and the unprotected beam overpower (UBOP) accident of an Accelerator Driven System (ADS), are presented. (author)
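    The Point Kinetics Model referred to above is the standard system of ODEs for core power with delayed neutron precursors; a minimal one-group sketch with illustrative parameters (not the values used in FLUENT/PK) is shown below.

```python
from scipy.integrate import solve_ivp

# Illustrative one-group point kinetics parameters (not from the paper)
beta, lam, Lambda = 0.0065, 0.08, 1e-5  # delayed fraction, precursor decay const (1/s), generation time (s)
rho = 0.001                              # constant reactivity insertion

def point_kinetics(t, y):
    n, c = y  # neutron density (~power) and precursor concentration
    dn = (rho - beta) / Lambda * n + lam * c
    dc = beta / Lambda * n - lam * c
    return [dn, dc]

n0 = 1.0
y0 = [n0, beta / (Lambda * lam) * n0]   # precursors in equilibrium with n0
sol = solve_ivp(point_kinetics, (0.0, 1.0), y0, method="Radau")
print(f"relative power after 1 s: {sol.y[0, -1] / n0:.2f}")
```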

  8. Updated model for radionuclide transport in the near-surface till at Forsmark - Implementation of decay chains and sensitivity analyses

    International Nuclear Information System (INIS)

    Pique, Angels; Pekala, Marek; Molinero, Jorge; Duro, Lara; Trinchero, Paolo; Vries, Luis Manuel de

    2013-02-01

    The Forsmark area has been proposed for potential siting of a deep underground (geological) repository for radioactive waste in Sweden. Safety assessment of the repository requires radionuclide transport from the disposal depth to recipients at the surface to be studied quantitatively. The near-surface quaternary deposits at Forsmark are considered a pathway for potential discharge of radioactivity from the underground facility to the biosphere, thus radionuclide transport in this system has been extensively investigated over the last years. The most recent work of Pique and co-workers (reported in SKB report R-10-30) demonstrated that in case of release of radioactivity the near-surface sedimentary system at Forsmark would act as an important geochemical barrier, retarding the transport of reactive radionuclides through a combination of retention processes. In this report the conceptual model of radionuclide transport in the quaternary till at Forsmark has been updated, by considering recent revisions regarding the near-surface lithology. In addition, the impact of important conceptual assumptions made in the model has been evaluated through a series of deterministic and probabilistic (Monte Carlo) sensitivity calculations. The sensitivity study focused on the following effects: 1. Radioactive decay of 135Cs, 59Ni, 230Th and 226Ra and effects on their transport. 2. Variability in key geochemical parameters, such as the composition of the deep groundwater, availability of sorbing materials in the till, and mineral equilibria. 3. Variability in hydraulic parameters, such as the definition of hydraulic boundaries, and values of hydraulic conductivity, dispersivity and the deep groundwater inflow rate. The overarching conclusion from this study is that the current implementation of the model is robust (the model is largely insensitive to variations in the parameters within the studied ranges) and conservative (the Base Case calculations have a tendency to

  9. Updated model for radionuclide transport in the near-surface till at Forsmark - Implementation of decay chains and sensitivity analyses

    Energy Technology Data Exchange (ETDEWEB)

    Pique, Angels; Pekala, Marek; Molinero, Jorge; Duro, Lara; Trinchero, Paolo; Vries, Luis Manuel de [Amphos 21 Consulting S.L., Barcelona (Spain)

    2013-02-15

    The Forsmark area has been proposed for potential siting of a deep underground (geological) repository for radioactive waste in Sweden. Safety assessment of the repository requires radionuclide transport from the disposal depth to recipients at the surface to be studied quantitatively. The near-surface quaternary deposits at Forsmark are considered a pathway for potential discharge of radioactivity from the underground facility to the biosphere, thus radionuclide transport in this system has been extensively investigated over the last years. The most recent work of Pique and co-workers (reported in SKB report R-10-30) demonstrated that in case of release of radioactivity the near-surface sedimentary system at Forsmark would act as an important geochemical barrier, retarding the transport of reactive radionuclides through a combination of retention processes. In this report the conceptual model of radionuclide transport in the quaternary till at Forsmark has been updated, by considering recent revisions regarding the near-surface lithology. In addition, the impact of important conceptual assumptions made in the model has been evaluated through a series of deterministic and probabilistic (Monte Carlo) sensitivity calculations. The sensitivity study focused on the following effects: 1. Radioactive decay of 135Cs, 59Ni, 230Th and 226Ra and effects on their transport. 2. Variability in key geochemical parameters, such as the composition of the deep groundwater, availability of sorbing materials in the till, and mineral equilibria. 3. Variability in hydraulic parameters, such as the definition of hydraulic boundaries, and values of hydraulic conductivity, dispersivity and the deep groundwater inflow rate. The overarching conclusion from this study is that the current implementation of the model is robust (the model is largely insensitive to variations in the parameters within the studied ranges) and conservative (the Base Case calculations have a
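    The decay-chain aspect examined in both versions of the report above (for example 230Th decaying into 226Ra during transport) follows the classical two-member Bateman equations; a generic sketch with approximate half-lives is given below, purely as background and not as part of the SKB model implementation.

```python
import numpy as np

# Two-member chain parent -> daughter, e.g. Th-230 -> Ra-226.
# Approximate half-lives in years: Th-230 ~ 7.5e4, Ra-226 ~ 1.6e3.
t_half_parent, t_half_daughter = 7.5e4, 1.6e3
l1 = np.log(2) / t_half_parent
l2 = np.log(2) / t_half_daughter

def bateman_two_member(n1_0, t):
    """Analytical solution for parent/daughter inventories, daughter initially absent."""
    n1 = n1_0 * np.exp(-l1 * t)
    n2 = n1_0 * l1 / (l2 - l1) * (np.exp(-l1 * t) - np.exp(-l2 * t))
    return n1, n2

for t in (0.0, 1e3, 5e3, 1e4):  # years
    n1, n2 = bateman_two_member(1.0, t)
    print(f"t = {t:7.0f} a  parent = {n1:.4f}  daughter = {n2:.4f}")
```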

  10. Guidelines to implement the license renewal technical requirements of 10CFR54 for integrated plant assessments and time-limited aging analyses. Final report

    International Nuclear Information System (INIS)

    Lehnert, G.; Philpot, L.

    1995-11-01

    This report documents the initial results of the Nuclear Energy Institute License Renewal Implementation Guideline Task Force over the period August 1994 to July 1995 to develop guidance for complying with the technical requirements of 10CFR54. The report also provided a starting point for the development of NEI 95-10, ''Industry Guideline for Implementing the Requirements of 10CFR54 - The License Renewal Rule''. Information in this document can be used by utilities to prepare the technical material needed in an application for license renewal (LR) of a nuclear power unit. This guideline provides methods for identifying systems, structures, and components (SSCs) and their intended functions within the scope of license renewal. It identifies structures and components (SCs) requiring aging management review and methods for performing the aging management review. The guideline also provides a process for identifying and evaluating time-limited aging analyses.

  11. Evaluation of the Tier 1 Program of Project P.A.T.H.S.: Secondary Data Analyses of Conclusions Drawn by the Program Implementers

    Directory of Open Access Journals (Sweden)

    Daniel T. L. Shek

    2008-01-01

    The Tier 1 Program of the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) is a curricula-based positive youth development program. In the experimental implementation phase, 52 schools participated in the program. Based on subjective outcome evaluation data collected from the program participants (Form A) and program implementers (Form B) in each school, the program implementers were invited to write down five conclusions based on an integration of the evaluation findings (N = 52). The conclusions stated in the 52 evaluation reports were further analyzed via secondary data analyses in this paper. Results showed that most of the conclusions concerning perceptions of the Tier 1 Program, instructors, and effectiveness of the programs were positive in nature. There were also conclusions reflecting the respondents’ appreciation of the program. Finally, responses on the difficulties encountered and suggestions for improvements were observed. In conjunction with the previous evaluation findings, the present study suggests that the Tier 1 Program was well received by the stakeholders and that the program was beneficial to the development of the program participants.

  12. Meta-regression analyses to explain statistical heterogeneity in a systematic review of strategies for guideline implementation in primary health care.

    Directory of Open Access Journals (Sweden)

    Susanne Unverzagt

    This study is an in-depth analysis to explain statistical heterogeneity in a systematic review of implementation strategies to improve guideline adherence of primary care physicians in the treatment of patients with cardiovascular diseases. The systematic review included randomized controlled trials from a systematic search in MEDLINE, EMBASE, CENTRAL, conference proceedings and registers of ongoing studies. Implementation strategies were shown to be effective, with substantial heterogeneity of treatment effects across all investigated strategies. The primary aim of this study was to explain the different effects of eligible trials and to identify methodological and clinical effect modifiers. Random effects meta-regression models were used to simultaneously assess the influence of multimodal implementation strategies and effect modifiers on physician adherence. Effect modifiers included the staff responsible for implementation, level of prevention and definition of the primary outcome, unit of randomization, duration of follow-up and risk of bias. Six clinical and methodological factors were investigated as potential effect modifiers of the efficacy of different implementation strategies on guideline adherence in primary care practices on the basis of information from 75 eligible trials. Five effect modifiers were able to explain a substantial amount of statistical heterogeneity. Physician adherence was improved by 62% (95% confidence interval (95% CI) 29 to 104%) or 29% (95% CI 5 to 60%) in trials where other non-medical professionals or nurses were included in the implementation process. Improvement of physician adherence was more successful in primary and secondary prevention of cardiovascular diseases, by around 30% (30%; 95% CI -2 to 71% and 31%; 95% CI 9 to 57%, respectively) compared to tertiary prevention. This study aimed to identify effect modifiers of implementation strategies on physician adherence. Especially the cooperation of different health
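    In outline, a random-effects meta-regression of this kind weights each trial by the inverse of its within-study variance plus a between-study variance and regresses the effect estimates on the candidate modifiers. The sketch below shows the idea on made-up numbers with a crude iterative variance estimate; it is not the analysis code of the review.

```python
import numpy as np

# Made-up effect estimates (log scale), within-study variances and one binary
# moderator (e.g. nurses involved in the implementation = 1); illustrative only.
y = np.array([0.10, 0.35, 0.20, 0.55, 0.15, 0.40])
v = np.array([0.02, 0.03, 0.02, 0.05, 0.04, 0.03])
X = np.column_stack([np.ones_like(y), [0, 1, 0, 1, 0, 1]])

def meta_regression(y, v, X, n_iter=100):
    """Random-effects meta-regression: iterate a Paule-Mandel-style estimate of
    the between-study variance tau2, then weighted least squares."""
    tau2 = 0.0
    for _ in range(n_iter):
        w = 1.0 / (v + tau2)
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
        q = np.sum(w * (y - X @ beta) ** 2)          # generalised Q statistic
        tau2 = max(0.0, tau2 + (q - (len(y) - X.shape[1])) / np.sum(w))
    return beta, tau2

beta, tau2 = meta_regression(y, v, X)
print("intercept and moderator coefficient:", np.round(beta, 3), " tau^2:", round(tau2, 4))
```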

  13. Crystal analyser-based X-ray phase contrast imaging in the dark field: implementation and evaluation using excised tissue specimens

    International Nuclear Information System (INIS)

    Ando, Masami; Sunaguchi, Naoki; Wu, Yanlin; Do, Synho; Sung, Yongjin; Gupta, Rajiv; Louissaint, Abner; Yuasa, Tetsuya; Ichihara, Shu

    2014-01-01

    We demonstrate the soft tissue discrimination capability of X-ray dark-field imaging (XDFI) using a variety of human tissue specimens. The experimental setup for XDFI comprises an X-ray source, an asymmetrically cut Bragg-type monochromator-collimator (MC), a Laue-case angle analyser (LAA) and a CCD camera. The specimen is placed between the MC and the LAA. For the light source, we used the beamline BL14C on a 2.5-GeV storage ring in the KEK Photon Factory, Tsukuba, Japan. In the eye specimen, phase contrast images from XDFI were able to discriminate soft-tissue structures, such as the iris, separated by aqueous humour on both sides, which have nearly equal absorption. Superiority of XDFI in imaging soft tissue was further demonstrated with a diseased iliac artery containing atherosclerotic plaque and breast samples with benign and malignant tumours. XDFI on breast tumours discriminated between the normal and diseased terminal duct lobular unit and between invasive and in-situ cancer. X-ray phase, as detected by XDFI, has superior contrast over absorption for soft tissue processes such as atherosclerotic plaque and breast cancer. (orig.)

  14. Crystal analyser-based X-ray phase contrast imaging in the dark field: implementation and evaluation using excised tissue specimens

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Masami [RIST, Tokyo University of Science, Noda, Chiba (Japan); Sunaguchi, Naoki [Gunma University, Graduate School of Engineering, Kiryu, Gunma (Japan); Wu, Yanlin [The Graduate University for Advanced Studies, Department of Materials Structure Science, School of High Energy Accelerator Science, Tsukuba, Ibaraki (Japan); Do, Synho; Sung, Yongjin; Gupta, Rajiv [Massachusetts General Hospital and Harvard Medical School, Department of Radiology, Boston, MA (United States); Louissaint, Abner [Massachusetts General Hospital and Harvard Medical School, Department of Pathology, Boston, MA (United States); Yuasa, Tetsuya [Yamagata University, Faculty of Engineering, Yonezawa, Yamagata (Japan); Ichihara, Shu [Nagoya Medical Center, Department of Pathology, Nagoya, Aichi (Japan)

    2014-02-15

    We demonstrate the soft tissue discrimination capability of X-ray dark-field imaging (XDFI) using a variety of human tissue specimens. The experimental setup for XDFI comprises an X-ray source, an asymmetrically cut Bragg-type monochromator-collimator (MC), a Laue-case angle analyser (LAA) and a CCD camera. The specimen is placed between the MC and the LAA. For the light source, we used the beamline BL14C on a 2.5-GeV storage ring in the KEK Photon Factory, Tsukuba, Japan. In the eye specimen, phase contrast images from XDFI were able to discriminate soft-tissue structures, such as the iris, separated by aqueous humour on both sides, which have nearly equal absorption. Superiority of XDFI in imaging soft tissue was further demonstrated with a diseased iliac artery containing atherosclerotic plaque and breast samples with benign and malignant tumours. XDFI on breast tumours discriminated between the normal and diseased terminal duct lobular unit and between invasive and in-situ cancer. X-ray phase, as detected by XDFI, has superior contrast over absorption for soft tissue processes such as atherosclerotic plaque and breast cancer. (orig.)

  15. Patient-specific stress analyses in the ascending thoracic aorta using a finite-element implementation of the constrained mixture theory.

    Science.gov (United States)

    Mousavi, S Jamaleddin; Avril, Stéphane

    2017-10-01

    It is now a rather common approach to perform patient-specific stress analyses of arterial walls using finite-element models reconstructed from gated medical images. However, this requires to compute for every Gauss point the deformation gradient between the current configuration and a stress-free reference configuration. It is technically difficult to define such a reference configuration, and there is actually no guarantee that a stress-free configuration is physically attainable due to the presence of internal stresses in unloaded soft tissues. An alternative framework was proposed by Bellini et al. (Ann Biomed Eng 42(3):488-502, 2014). It consists of computing the deformation gradients between the current configuration and a prestressed reference configuration. We present here the first finite-element results based on this concept using the Abaqus software. The reference configuration is set arbitrarily to the in vivo average geometry of the artery, which is obtained from gated medical images and is assumed to be mechanobiologically homeostatic. For every Gauss point, the stress is split additively into the contributions of each individual load-bearing constituent of the tissue, namely elastin, collagen, smooth muscle cells. Each constituent is assigned an independent prestretch in the reference configuration, named the deposition stretch. The outstanding advantage of the present approach is that it simultaneously computes the in situ stresses existing in the reference configuration and predicts the residual stresses that occur after removing the different loadings applied onto the artery (pressure and axial load). As a proof of concept, we applied it on an ideal thick-wall cylinder and showed that the obtained results were consistent with corresponding experimental and analytical results of the well-known literature. In addition, we developed a patient-specific model of a human ascending thoracic aneurysmal aorta and demonstrated the utility in predicting the
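    The additive split of the stress over the load-bearing constituents, each with its own deposition stretch, can be stated compactly; the expression below is a generic constrained-mixture form in common notation, not the exact constitutive law coded by the authors.

```latex
% Cauchy stress as a sum of constituent contributions (constrained mixture):
% constituent i (elastin, collagen, smooth muscle) deforms elastically with
% F_i = F G_i, where G_i is its deposition (pre-)stretch tensor in the
% reference configuration and phi_i its mass fraction; p enforces incompressibility.
\sigma \;=\; -p\,\mathbf{I} \;+\; \sum_{i}\phi_i\,
\frac{1}{\det(\mathbf{F}\mathbf{G}_i)}\,
(\mathbf{F}\mathbf{G}_i)\,
\frac{\partial W_i}{\partial \mathbf{E}_i}\,
(\mathbf{F}\mathbf{G}_i)^{\mathsf{T}},
\qquad
\mathbf{E}_i=\tfrac{1}{2}\!\left[(\mathbf{F}\mathbf{G}_i)^{\mathsf{T}}(\mathbf{F}\mathbf{G}_i)-\mathbf{I}\right].
```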

  16. Can hospital audit teams identify case management problems, analyse their causes, identify and implement improvements? A cross-sectional process evaluation of obstetric near-miss case reviews in Benin

    Directory of Open Access Journals (Sweden)

    Borchert Matthias

    2012-10-01

    Background Obstetric near-miss case reviews are being promoted as a quality assurance intervention suitable for hospitals in low income countries. We introduced such reviews in five district, regional and national hospitals in Benin, West Africa. In a cross-sectional study we analysed the extent to which the hospital audit teams were able to identify case management problems (CMPs), analyse their causes, agree on solutions and put these solutions into practice. Methods We analysed case summaries, women’s interview transcripts and audit minutes produced by the audit teams for 67 meetings concerning one woman with near-miss complications each. We compared the proportion of CMPs identified by an external assessment team to the number found by the audit teams. For the latter, we described the CMP causes identified, solutions proposed and implemented by the audit teams. Results Audit meetings were conducted regularly and were well attended. Audit teams identified half of the 714 CMPs; they were more likely to find managerial ones (71%) than the ones relating to treatment (30%). Most identified CMPs were valid. Almost all causes of CMPs were plausible, but often too superficial to be of great value for directing remedial action. Audit teams suggested solutions, most of them promising ones, for 38% of the CMPs they had identified, but recorded their implementation only for a minority (8.5%). Conclusions The importance of following-up and documenting the implementation of solutions should be stressed in future audit interventions. Tools facilitating the follow-up should be made available. Near-miss case reviews hold promise, but their effectiveness to improve the quality of care sustainably and on a large scale still needs to be established.

  17. Can hospital audit teams identify case management problems, analyse their causes, identify and implement improvements? A cross-sectional process evaluation of obstetric near-miss case reviews in Benin

    Science.gov (United States)

    2012-01-01

    Background Obstetric near-miss case reviews are being promoted as a quality assurance intervention suitable for hospitals in low income countries. We introduced such reviews in five district, regional and national hospitals in Benin, West Africa. In a cross-sectional study we analysed the extent to which the hospital audit teams were able to identify case management problems (CMPs), analyse their causes, agree on solutions and put these solutions into practice. Methods We analysed case summaries, women’s interview transcripts and audit minutes produced by the audit teams for 67 meetings concerning one woman with near-miss complications each. We compared the proportion of CMPs identified by an external assessment team to the number found by the audit teams. For the latter, we described the CMP causes identified, solutions proposed and implemented by the audit teams. Results Audit meetings were conducted regularly and were well attended. Audit teams identified half of the 714 CMPs; they were more likely to find managerial ones (71%) than the ones relating to treatment (30%). Most identified CMPs were valid. Almost all causes of CMPs were plausible, but often too superficial to be of great value for directing remedial action. Audit teams suggested solutions, most of them promising ones, for 38% of the CMPs they had identified, but recorded their implementation only for a minority (8.5%). Conclusions The importance of following-up and documenting the implementation of solutions should be stressed in future audit interventions. Tools facilitating the follow-up should be made available. Near-miss case reviews hold promise, but their effectiveness to improve the quality of care sustainably and on a large scale still needs to be established. PMID:23057707

  18. Dynamic analyses, FPGA implementation and engineering ...

    Indian Academy of Sciences (India)

    QIANG LAI

    2017-12-14

    Dec 14, 2017 ... the model of the generalised Sprott C system and analyses its equilibria. ..... the Matlab simulations. ... RNG design processes are given in Algorithm 1 as the ... RNG applications (simulation, modelling, arts, data hiding ...

  19. Analyse that

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Maurice

    2011-06-15

    The oil industry is starting to apply new technologies such as horizontal drilling and multistage fracking to light oil production. Most producers are copying what is done by their counterparts. Experts say that another approach should be taken, because you can get quicker results with a technical analysis using an analytical model than by drilling a lot of wells. In general, producers are also eager to put too many fracs into the ground to inflate initial production rates, but this does not increase cumulative recovery, so they are spending more money to end up with the same result. The oil industry still has to work to find a way to optimize production, reservoir management and costs.

  20. Laser Beam Focus Analyser

    DEFF Research Database (Denmark)

    Nielsen, Peter Carøe; Hansen, Hans Nørgaard; Olsen, Flemming Ove

    2007-01-01

    The quantitative and qualitative description of laser beam characteristics is important for process implementation and optimisation. In particular, a need for quantitative characterisation of beam diameter was identified when using fibre lasers for micro manufacturing. Here the beam diameter limits the obtainable features in direct laser machining as well as heat affected zones in welding processes. This paper describes the development of a measuring unit capable of analysing beam shape and diameter of lasers to be used in manufacturing processes. The analyser is based on the principle of a rotating mechanical wire being swept through the laser beam at varying Z-heights. The reflected signal is analysed and the resulting beam profile determined. The development comprised the design of a flexible fixture capable of providing both rotation and Z-axis movement, control software including data capture...

  1. Characterization of Pollution Transport into Texas Using OMI and TES Satellite and In Situ data, and HYSPLIT Back Trajectory Analyses: implications for TCEQ State Implementation Plans and High School/Undergraduate STEM Education

    Science.gov (United States)

    Boxe, C.; Bella, D.; Khaimova, J.; Culpepper, J.; Ahmed, N.; Belkalai, A.; Ealy, J.; Arroyo, I.; Lahoumh, M.; Jenkins, O.; Emmanuel, S.; Andrews, J.; Fu, D.; Wu, L.; Choi, Y.; Morris, G.; Osterman, G. B.; Johnson, L. P.; Austin, S. A.

    2014-12-01

    Using an online trajectory analysis tool, NASA satellite data, ArcGIS, and EPA in situ data, we assess whether high pollution events in Texas are primarily sourced locally or remotely. We focus on satellite data that exemplify high O3 and NO2 over Texas's lower troposphere. Four-day back trajectory analyses of all dates show that upper-, mid-, and lower-tropospheric air over Texas, containing high O3, is transported from the Gulf of Mexico, Southeast USA, Midwest USA, Northeast USA, the Atlantic Ocean, Pacific Ocean, Mexico, etc. Only one day showed air at 1 km sourced within Texas. Satellite data show O3 enhancements in the boundary layer and O3 and NO2 enhancements via tropospheric column profiles. These enhancements complement the four-day trajectory analysis. This study provides a viable basis for more quantifiable and accurate information for developing effective air quality State Implementation Plans. STEM Impact: (i) D. Bella was an NSF-LSAMP undergraduate research mentee with me at Medgar Evers College-CUNY; she received a B.S. in Environmental Science (and a Chemistry minor) and is now a Ph.D. graduate student at the University at Albany's School of Public Health. (ii) J. Khaimova is an undergraduate Geology and Planetary Science B.S. major at Brooklyn College-CUNY. I supported Jessica's summer internship in summer 2013 as a CUNY Summer Research Fellow, and she is currently an NSF-REU research mentee in Pennsylvania State University's Meteorology Department. (iii) J. Culpepper received his B.S. in Environmental Science from MEC-CUNY and will be a Ph.D. student, Fall 2014, in the University of Iowa's Civil and Environmental Engineering Department. (iv) S. Gentle was a high school researcher with me within ACS's Project SEED Program for high school students. S. Gentle will start her undergraduate career Fall 2014 at Pennsylvania State University and seeks to attain a B.S. in Chemistry. (v) All parties, including high school and undergraduate researchers, seek to attend

  2. Implementation of quality by design principles in the development of microsponges as drug delivery carriers: Identification and optimization of critical factors using multivariate statistical analyses and design of experiments studies.

    Science.gov (United States)

    Simonoska Crcarevska, Maja; Dimitrovska, Aneta; Sibinovska, Nadica; Mladenovska, Kristina; Slavevska Raicki, Renata; Glavas Dodov, Marija

    2015-07-15

    Microsponges drug delivery system (MDDC) was prepared by double emulsion-solvent-diffusion technique using rotor-stator homogenization. Quality by design (QbD) concept was implemented for the development of MDDC with potential to be incorporated into semisolid dosage form (gel). Quality target product profile (QTPP) and critical quality attributes (CQA) were defined and identified, accordingly. Critical material attributes (CMA) and Critical process parameters (CPP) were identified using quality risk management (QRM) tool, failure mode, effects and criticality analysis (FMECA). CMA and CPP were identified based on results obtained from principal component analysis (PCA-X&Y) and partial least squares (PLS) statistical analysis along with literature data, product and process knowledge and understanding. FMECA identified amount of ethylcellulose, chitosan, acetone, dichloromethane, span 80, tween 80 and water ratio in primary/multiple emulsions as CMA and rotation speed and stirrer type used for organic solvent removal as CPP. The relationship between identified CPP and particle size as CQA was described in the design space using design of experiments - one-factor response surface method. Obtained results from statistically designed experiments enabled establishment of mathematical models and equations that were used for detailed characterization of influence of identified CPP upon MDDC particle size and particle size distribution and their subsequent optimization. Copyright © 2015 Elsevier B.V. All rights reserved.
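    As a rough illustration of the one-factor response-surface step described above, the sketch below fits a quadratic model of particle size against homogenisation rotation speed. The data points, factor range and resulting optimum are hypothetical and do not come from the study.

```python
import numpy as np

# Hypothetical one-factor data: rotation speed (rpm) vs. mean particle size (um);
# the numbers are invented purely for illustration.
speed = np.array([4000, 6000, 8000, 10000, 12000], dtype=float)
size  = np.array([48.0, 31.0, 22.0, 18.5, 17.0])

# Quadratic response-surface model: size = b0 + b1*speed + b2*speed**2
b2, b1, b0 = np.polyfit(speed, size, deg=2)
model = np.poly1d([b2, b1, b0])

# Predict particle size over the design space and locate a candidate optimum.
grid = np.linspace(speed.min(), speed.max(), 200)
pred = model(grid)
print("predicted minimum size %.1f um at ~%d rpm" % (pred.min(), grid[pred.argmin()]))
```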

  3. A posteriori error analysis for hydro-mechanical couplings and implementation in Code-Aster; Analyse d'erreur a posteriori pour les couplages hydro-mecaniques et mise en oeuvre dans Code-Aster

    Energy Technology Data Exchange (ETDEWEB)

    Meunier, S

    2007-11-15

    We analyse approximations by finite elements in space and finite differences in time of coupled Hydro-Mechanical (HM) problems related to the quasi-static linear poro-elasticity theory. The physical bases of this theory are briefly restated and an abstract setting is proposed to perform the mathematical study of the stationary and un-stationary versions of the HM problem. For the stationary version, the well-posedness of the continuous and discrete problems is established and the a priori error analysis is performed. Then, we propose the a posteriori error analysis by using two different techniques suited to estimate the displacement error and the pressure error, respectively, both in the H^1_x-norm. The classical properties of reliability and optimality are proved for the associated error estimators. Some numerical experiments using Code-Aster illustrate the theoretical results. For the un-stationary version, we first establish a stability result for the continuous problem. Then, we present an optimal a priori error analysis using elliptic projection techniques. Finally, the a posteriori error analysis is performed by using two different approaches: a direct approach and an elliptic reconstruction approach. The first is suited to estimate the pressure error in the L^2_t(H^1_x)-norm and the second is suited to estimate the displacement error in the L^∞_t(H^1_x)-norm and the pressure error in the L^∞_t(H^1_x)-norm. Numerical experiments using Code-Aster complete the theoretical results. (author)
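    For orientation, the quasi-static linear poro-elasticity (Biot) system usually written for such couplings is sketched below in a standard form; the thesis may use an equivalent but differently scaled formulation.

```latex
% Standard quasi-static linear poro-elasticity (Biot) system (assumed form):
\begin{aligned}
  -\,\nabla\!\cdot\sigma(u) + \alpha\,\nabla p &= f, &
  \sigma(u) &= 2\mu\,\varepsilon(u) + \lambda\,(\nabla\!\cdot u)\,I,\\
  \partial_t\bigl(c_0\,p + \alpha\,\nabla\!\cdot u\bigr)
  \;-\; \nabla\!\cdot\bigl(\kappa\,\nabla p\bigr) &= g.
\end{aligned}
```

    Here u is the displacement, p the pore pressure, α the Biot coefficient, c_0 the storage coefficient and κ the permeability; the estimators discussed above control the errors in u and p in the norms quoted.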

  4. An MDE Approach for Modular Program Analyses

    NARCIS (Netherlands)

    Yildiz, Bugra Mehmet; Bockisch, Christoph; Aksit, Mehmet; Rensink, Arend

    Program analyses are an important tool to check if a system fulfills its specification. A typical implementation strategy for program analyses is to use an imperative, general-purpose language like Java, and access the program to be analyzed through libraries that offer an API for reading, writing

  5. Analysing Protocol Stacks for Services

    DEFF Research Database (Denmark)

    Gao, Han; Nielson, Flemming; Nielson, Hanne Riis

    2011-01-01

    We show an approach, CaPiTo, to model service-oriented applications using process algebras such that, on the one hand, we can achieve a certain level of abstraction without being overwhelmed by the underlying implementation details and, on the other hand, we respect the concrete industrial standards used for implementing the service-oriented applications. By doing so, we will be able to not only reason about applications at different levels of abstraction, but also to build a bridge between the views of researchers on formal methods and developers in industry. We apply our approach to the financial case study taken from Chapter 0-3. Finally, we develop a static analysis to analyse the security properties as they emerge at the level of concrete industrial protocols.

  6. Pilot Implementations

    DEFF Research Database (Denmark)

    Manikas, Maria Ie

    This PhD dissertation engages in the study of pilot (system) implementation. In the field of information systems, pilot implementations are commissioned as a way to learn from real use of a pilot system with real data, by real users during an information systems development (ISD) project and before ... by conducting a literature review. The concept of pilot implementation, although commonly used in practice, is rather disregarded in research. In the literature, pilot implementations are mainly treated as secondary to the learning outcomes and are presented as merely a means to acquire knowledge about a given ... objective. The prevalent understanding is that pilot implementations are an ISD technique that extends prototyping from the lab and into test during real use. Another perception is that pilot implementations are a project multiple of co-existing enactments of the pilot implementation. From this perspective ...

  7. Implementing optimal thinning strategies

    Science.gov (United States)

    Kurt H. Riitters; J. Douglas Brodie

    1984-01-01

    Optimal thinning regimes for achieving several management objectives were derived from two stand-growth simulators by dynamic programming. Residual mean tree volumes were then plotted against stand density management diagrams. The results supported the use of density management diagrams for comparing, checking, and implementing the results of optimization analyses....

  8. Pilot implementation

    DEFF Research Database (Denmark)

    Hertzum, Morten; Bansler, Jørgen P.; Havn, Erling C.

    2012-01-01

    A recurrent problem in information-systems development (ISD) is that many design shortcomings are not detected during development, but only after the system has been delivered and implemented in its intended environment. Pilot implementations appear to promise a way to extend prototyping from the laboratory to the field, thereby allowing users to experience a system design under realistic conditions and developers to get feedback from realistic use while the design is still malleable. We characterize pilot implementation, contrast it with prototyping, propose a five-element model of pilot implementation and provide three empirical illustrations of our model. We conclude that pilot implementation has much merit as an ISD technique when system performance is contingent on context. But we also warn developers that, despite their seductive conceptual simplicity, pilot implementations can be difficult...

  9. Contesting Citizenship: Comparative Analyses

    DEFF Research Database (Denmark)

    Siim, Birte; Squires, Judith

    2007-01-01

    importance of particularized experiences and multiple inequality agendas). These developments shape the way citizenship is both practiced and analysed. Mapping neat citizenship models onto distinct nation-states and evaluating these in relation to formal equality is no longer an adequate approach. Comparative citizenship analyses need to be considered in relation to multiple inequalities and their intersections and to multiple governance and trans-national organising. This, in turn, suggests that comparative citizenship analysis needs to consider new spaces in which struggles for equal citizenship occur...

  10. Treaty implementation

    International Nuclear Information System (INIS)

    Dunn, L.A.

    1990-01-01

    This paper touches on three aspects of the relationship between intelligence and treaty implementation, a two-way association. First the author discusses the role of intelligence as a basis for compliance monitoring and treaty verification. Second the authors discusses payoffs of intelligence gathering and the intelligence process of treaty implementation, in particular on-site inspection. Third, the author goes in another direction and discusses some of the tensions between the intelligence gathering and treaty-implementation processes, especially with regard to extensive use of on-site inspection, such as we are likely to see in monitoring compliance of future arms control treaties

  11. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically...

  12. Risico-analyse brandstofpontons

    NARCIS (Netherlands)

    Uijt de Haag P; Post J; LSO

    2001-01-01

    To determine the risks of fuel pontoons in a marina, a generic risk analysis was carried out. A reference system was defined, consisting of a concrete fuel pontoon with a relatively large capacity and throughput. It is assumed that the pontoon is located in a

  13. Fast multichannel analyser

    Energy Technology Data Exchange (ETDEWEB)

    Berry, A; Przybylski, M M; Sumner, I [Science Research Council, Daresbury (UK). Daresbury Lab.

    1982-10-01

    A fast multichannel analyser (MCA) capable of sampling at a rate of 10⁷ s⁻¹ has been developed. The instrument is based on an 8 bit parallel encoding analogue to digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64 k count capacity. The prototype unit is in CAMAC format.

  14. A fast multichannel analyser

    International Nuclear Information System (INIS)

    Berry, A.; Przybylski, M.M.; Sumner, I.

    1982-01-01

    A fast multichannel analyser (MCA) capable of sampling at a rate of 10⁷ s⁻¹ has been developed. The instrument is based on an 8 bit parallel encoding analogue to digital converter (ADC) reading into a fast histogramming random access memory (RAM) system, giving 256 channels of 64 k count capacity. The prototype unit is in CAMAC format. (orig.)

  15. Applied mediation analyses

    DEFF Research Database (Denmark)

    Lange, Theis; Hansen, Kim Wadt; Sørensen, Rikke

    2017-01-01

    In recent years, mediation analysis has emerged as a powerful tool to disentangle causal pathways from an exposure/treatment to clinically relevant outcomes. Mediation analysis has been applied in scientific fields as diverse as labour market relations and randomized clinical trials of heart disease treatments. In parallel to these applications, the underlying mathematical theory and computer tools have been refined. This combined review and tutorial will introduce the reader to modern mediation analysis including: the mathematical framework; required assumptions; and software implementation...
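    As a minimal illustration of the idea (not the estimators of this tutorial), the sketch below uses the product-of-coefficients decomposition in a purely linear, simulated setting, where it coincides with the natural indirect effect under the usual no-unmeasured-confounding assumptions; all coefficients and data are invented.

```python
import numpy as np

# Simulated data: exposure X, mediator M, outcome Y (all continuous, linear).
rng = np.random.default_rng(1)
n = 5000
X = rng.normal(size=n)
M = 0.5 * X + rng.normal(size=n)                 # true a = 0.5
Y = 0.3 * X + 0.7 * M + rng.normal(size=n)       # true direct = 0.3, b = 0.7

def ols(y, *cols):
    """Least-squares coefficients for y ~ 1 + cols."""
    Z = np.column_stack([np.ones_like(y)] + list(cols))
    return np.linalg.lstsq(Z, y, rcond=None)[0]

a = ols(M, X)[1]                 # mediator model:  M ~ X
_, direct, b = ols(Y, X, M)      # outcome model:   Y ~ X + M
indirect = a * b                 # product-of-coefficients estimate
print(f"direct ~ {direct:.2f}, indirect ~ {indirect:.2f}, total ~ {direct + indirect:.2f}")
```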

  16. Analysis of photovoltaic systems. Leadership/cooperation in Task II of the IEA Implementing Agreements Photovoltaic Power Systems, database operation; Analyse des Betriebsverhaltens von Photovoltaiksystemen. Leitung/Mitarbeit im Task II des IEA Implementing Agreements Photovoltaic Power Systems, Betrieb der Datenbank. Schlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Schreitmueller, K.; Niemann, M.; Decker, B.; Jahn, U.; Meyer, H.

    2000-02-01

    In order to state on the operational performance of PV systems and to develop guidelines for sizing and design optimisation, this project has been initiated in 1995 with the general objective to develop a database on PV power systems to provide PV experts and other target groups with suitable information on the operation of PV systems and subsystems. At present the database contains more than 260 systems of different types (grid connected, stand-alone, hybrid) adapted to various applications (power supply, domestic uses, rural electrification, professional applications). Detailed system characteristics of selected PV plants as well as monitored data are stored in the database. The data are made available to the user through internal graphical displays and reports or by exporting the data into a standard spread sheet programme. This tool can also be used to check the operational behaviour of existing PV plants and to get a report on its performance expressed in standard quantities allowing any kind of crossed comparison between systems. The implemented PV systems are located world wide and have been operated under different climatic conditions. A collection of such an amount of various operational data can be considered as a unique tool for PV system performance analysis. The results are very different depending on the type of systems. The analysis has been carried out using quantities such as reference and final yields, system and capture losses and performance ratio linked to the system availability. In the case of stand-alone systems, different factors such as the matching factor (performance ratio x solar fraction) and the usage factor (energy supplied by the PV array/potential PV energy) have been introduced to better quantify the system behaviour in a technical viewpoint and if necessary to define a ranking procedure. This report summarises the most important features and capabilities of the database and related toolbox. The most relevant results drawn from
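    For reference, the normalised quantities mentioned above are conventionally defined as follows (IEC 61724-style definitions are assumed; the two stand-alone factors restate the abstract's own definitions):

```latex
% Assumed standard definitions of the normalised PV performance indices:
Y_r = \frac{H_i}{G_{\mathrm{STC}}}, \qquad
Y_f = \frac{E_{\mathrm{out}}}{P_0}, \qquad
L_c = Y_r - Y_a, \qquad
L_s = Y_a - Y_f, \qquad
PR = \frac{Y_f}{Y_r}
```

    with H_i the in-plane irradiation, G_STC the reference irradiance, E_out the energy delivered, P_0 the rated array power and Y_a the array yield; the matching factor is PR × solar fraction and the usage factor is the energy supplied by the PV array divided by the potential PV energy, as stated in the abstract.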

  17. Possible future HERA analyses

    International Nuclear Information System (INIS)

    Geiser, Achim

    2015-12-01

    A variety of possible future analyses of HERA data in the context of the HERA data preservation programme is collected, motivated, and commented on. The focus is placed on possible future analyses of the existing ep collider data and their physics scope. Comparisons to the original scope of the HERA programme are made, and cross references to topics also covered by other participants of the workshop are given. This includes topics on QCD, proton structure, diffraction, jets, hadronic final states, heavy flavours, electroweak physics, and the application of related theory and phenomenology topics like NNLO QCD calculations, low-x related models, nonperturbative QCD aspects, and electroweak radiative corrections. Synergies with other collider programmes are also addressed. In summary, the range of physics topics which can still be uniquely covered using the existing data is very broad and of considerable physics interest, often matching the interest of results from colliders currently in operation. Due to well-established data and MC sets, calibrations, and analysis procedures the manpower and expertise needed for a particular analysis is often very much smaller than that needed for an ongoing experiment. Since centrally funded manpower to carry out such analyses is not available any longer, this contribution not only targets experienced self-funded experimentalists, but also theorists and master-level students who might wish to carry out such an analysis.

  18. Biomass feedstock analyses

    Energy Technology Data Exchange (ETDEWEB)

    Wilen, C.; Moilanen, A.; Kurkela, E. [VTT Energy, Espoo (Finland). Energy Production Technologies

    1996-12-31

    The overall objectives of the project `Feasibility of electricity production from biomass by pressurized gasification systems` within the EC Research Programme JOULE II were to evaluate the potential of advanced power production systems based on biomass gasification and to study the technical and economic feasibility of these new processes with different type of biomass feed stocks. This report was prepared as part of this R and D project. The objectives of this task were to perform fuel analyses of potential woody and herbaceous biomasses with specific regard to the gasification properties of the selected feed stocks. The analyses of 15 Scandinavian and European biomass feed stock included density, proximate and ultimate analyses, trace compounds, ash composition and fusion behaviour in oxidizing and reducing atmospheres. The wood-derived fuels, such as whole-tree chips, forest residues, bark and to some extent willow, can be expected to have good gasification properties. Difficulties caused by ash fusion and sintering in straw combustion and gasification are generally known. The ash and alkali metal contents of the European biomasses harvested in Italy resembled those of the Nordic straws, and it is expected that they behave to a great extent as straw in gasification. Any direct relation between the ash fusion behavior (determined according to the standard method) and, for instance, the alkali metal content was not found in the laboratory determinations. A more profound characterisation of the fuels would require gasification experiments in a thermobalance and a PDU (Process development Unit) rig. (orig.) (10 refs.)

  19. Elusive Implementation

    DEFF Research Database (Denmark)

    Heering Holt, Ditte; Rod, Morten Hulvej; Waldorff, Susanne Boch

    2018-01-01

    in health. However, despite growing support for intersectoral policymaking, implementation remains a challenge. Critics argue that public health has remained naïve about the policy process and a better understanding is needed. Based on ethnographic data, this paper conducts an in-depth analysis of a local......: On the basis of an explorative study among ten Danish municipalities, we conducted an ethnographic study of the development of a municipal-wide implementation strategy for the intersectoral health policy of a medium-sized municipality. The main data sources consist of ethnographic field notes from participant...

  20. AMS analyses at ANSTO

    Energy Technology Data Exchange (ETDEWEB)

    Lawson, E.M. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia). Physics Division

    1998-03-01

    The major use of ANTARES is Accelerator Mass Spectrometry (AMS) with ¹⁴C being the most commonly analysed radioisotope - presently about 35 % of the available beam time on ANTARES is used for ¹⁴C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)

  1. AMS analyses at ANSTO

    International Nuclear Information System (INIS)

    Lawson, E.M.

    1998-01-01

    The major use of ANTARES is Accelerator Mass Spectrometry (AMS) with ¹⁴C being the most commonly analysed radioisotope - presently about 35 % of the available beam time on ANTARES is used for ¹⁴C measurements. The accelerator measurements are supported by, and dependent on, a strong sample preparation section. The ANTARES AMS facility supports a wide range of investigations into fields such as global climate change, ice cores, oceanography, dendrochronology, anthropology, and classical and Australian archaeology. Described here are some examples of the ways in which AMS has been applied to support research into the archaeology, prehistory and culture of this continent's indigenous Aboriginal peoples. (author)

  2. Une analyse des représentations des enseignants réfléchissant sur une expérience d’implantation d’un portfolio électronique / An Analysis of the Perceptions of Teachers Reflecting on an Experience of e-Portfolio Implementation

    Directory of Open Access Journals (Sweden)

    Ann-Louise Davidson

    2012-08-01

    Full Text Available Cet article rend compte d’une analyse des représentations avec des enseignants, qui a eu lieu à la fin d’une expérience d’implantation d’un portfolio électronique favorisant l’autorégulation des apprentissages des élèves et la professionnalisation des enseignants. Le texte s’appuie sur le contexte éducatif québécois actuel. Ensuite, il présente une revue de littérature entourant les compétences technologiques, les portfolios électroniques et l’autorégulation des apprentissages. La méthodologie présente le design de la recherche, les instruments utilisés pour les entretiens et pour animer le groupe de discussion. Les résultats montrent que pour une enseignante qui avait déjà adopté la pédagogie sous-jacente au renouveau pédagogique Québécois, le portfolio électronique était une expérience fort utile. Toutefois, pour les deux autres enseignants, l’implantation du portfolio électronique était beaucoup trop exigeante. Finalement, nous discutons des implications des résultats, autant au point de vue de l’expérience d’implantation des portfolios électroniques que du point de vue de la contribution méthodologique. This article presents an analysis of the perceptions of teachers, which took place in 2010 at the end of an experiment implementing an e-portfolio that facilitates student self-regulation of learning. The article is based on the current Quebec educational context. It presents a review of literature on technology skills, e-portfolios and the self-regulation of learning. The methodology describes the research design, the tools used in the individual interviews and in the group discussion. The results indicate that the e-portfolio was a valuable experience for a teacher who had already adopted the pedagogy underlying the education reform in Quebec. However the implementation of the e-portfolio was difficult for the other two teachers due to challenges in terms of technology and managing

  3. Analyses of MHD instabilities

    International Nuclear Information System (INIS)

    Takeda, Tatsuoki

    1985-01-01

    In this article analyses of the MHD stabilities which govern the global behavior of a fusion plasma are described from the viewpoint of the numerical computation. First, we describe the high accuracy calculation of the MHD equilibrium and then the analysis of the linear MHD instability. The former is the basis of the stability analysis and the latter is closely related to the limiting beta value which is a very important theoretical issue of the tokamak research. To attain a stable tokamak plasma with good confinement property it is necessary to control or suppress disruptive instabilities. We, next, describe the nonlinear MHD instabilities which relate with the disruption phenomena. Lastly, we describe vectorization of the MHD codes. The above MHD codes for fusion plasma analyses are relatively simple though very time-consuming and parts of the codes which need a lot of CPU time concentrate on a small portion of the codes, moreover, the codes are usually used by the developers of the codes themselves, which make it comparatively easy to attain a high performance ratio on the vector processor. (author)

  4. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called ''conservative'' assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the ''reasonable assurance'' approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  5. Implementation Politics

    DEFF Research Database (Denmark)

    Hegland, Troels Jacob; Raakjær, Jesper

    2008-01-01

    level are supplemented or even replaced by national priorities. The chapter concludes that in order to capture the domestic politics associated with CFP implementation in Denmark, it is important to understand the policy process as a synergistic interaction between dominant interests, policy alliances...

  6. A simple beam analyser

    International Nuclear Information System (INIS)

    Lemarchand, G.

    1977-01-01

    (ee'p) experiments make it possible to measure the missing energy distribution as well as the momentum distribution of the extracted proton in the nucleus versus the missing energy. Such experiments are presently conducted on SACLAY's A.L.S. 300 Linac. Electrons and protons are respectively analysed by two spectrometers and detected in their focal planes. Counting rates are usually low and include time coincidences and accidentals. Signal-to-noise ratio is dependent on the physics of the experiment and the resolution of the coincidence; therefore, it is mandatory to get a beam current distribution as flat as possible. Using new technologies has made it possible to monitor in real time the behavior of the beam pulse and to determine when the duty cycle can be considered good with respect to a numerical basis

  7. EEG analyses with SOBI.

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R.; Tang, Akaysha (University of New Mexico, Albuquerque, NM)

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.
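    To indicate what a second-order separation of this kind involves, the sketch below implements AMUSE, a single-lag simplification of the SOBI idea (SOBI itself jointly diagonalises several lagged covariance matrices). The toy signals are invented and this is not the project's actual processing pipeline.

```python
import numpy as np

def amuse(X, lag=1):
    """Single-lag second-order blind separation (AMUSE), a simplified relative
    of SOBI.  X: array of shape (n_channels, n_samples)."""
    X = X - X.mean(axis=1, keepdims=True)          # remove channel means
    n, T = X.shape
    C0 = X @ X.T / T                               # zero-lag covariance
    d, E = np.linalg.eigh(C0)
    W = np.diag(1.0 / np.sqrt(d)) @ E.T            # whitening matrix
    Z = W @ X                                      # whitened data
    Ct = Z[:, :-lag] @ Z[:, lag:].T / (T - lag)    # lagged covariance
    Ct = (Ct + Ct.T) / 2                           # symmetrise
    _, V = np.linalg.eigh(Ct)                      # diagonalising rotation
    return V.T @ W, V.T @ Z                        # unmixing matrix, sources

# Toy demonstration: two mixed autocorrelated sources.
rng = np.random.default_rng(0)
t = np.arange(2000)
S = np.vstack([np.sin(0.05 * t), np.sign(np.sin(0.013 * t))])
A = rng.normal(size=(2, 2))                        # unknown mixing matrix
unmix, S_hat = amuse(A @ S)
```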

  8. Pathway-based analyses.

    Science.gov (United States)

    Kent, Jack W

    2016-02-03

    New technologies for acquisition of genomic data, while offering unprecedented opportunities for genetic discovery, also impose severe burdens of interpretation and penalties for multiple testing. The Pathway-based Analyses Group of the Genetic Analysis Workshop 19 (GAW19) sought reduction of multiple-testing burden through various approaches to aggregation of high-dimensional data in pathways informed by prior biological knowledge. Experimental methods tested included the use of "synthetic pathways" (random sets of genes) to estimate power and false-positive error rate of methods applied to simulated data; data reduction via independent components analysis, single-nucleotide polymorphism (SNP)-SNP interaction, and use of gene sets to estimate genetic similarity; and general assessment of the efficacy of prior biological knowledge to reduce the dimensionality of complex genomic data. The work of this group explored several promising approaches to managing high-dimensional data, with the caveat that these methods are necessarily constrained by the quality of external bioinformatic annotation.

  9. Analysing Access Control Specifications

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2009-01-01

    When prosecuting crimes, the main question to answer is often who had a motive and the possibility to commit the crime. When investigating cyber crimes, the question of possibility is often hard to answer, as in a networked system almost any location can be accessed from almost anywhere. The most common tool to answer this question, analysis of log files, faces the problem that the amount of logged data may be overwhelming. This problem gets even worse in the case of insider attacks, where the attacker’s actions usually will be logged as permissible, standard actions—if they are logged at all. Recent events have revealed intimate knowledge of surveillance and control systems on the side of the attacker, making it often impossible to deduce the identity of an inside attacker from logged data. In this work we present an approach that analyses the access control configuration to identify the set ...
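    One simple reading of such an analysis is reachability over the access-control graph: given who may move or log in where, compute every location an insider starting from a given point can reach. The sketch below shows this with an invented configuration; it is not the model of the cited paper.

```python
from collections import deque

# Hypothetical access-control configuration: location -> locations reachable
# in one step (doors, logins, network hops).  All names are invented.
access = {
    "lobby": ["hallway"],
    "hallway": ["office", "server_room"],
    "office": ["file_share"],
    "server_room": ["file_share", "backup_vault"],
    "file_share": [],
    "backup_vault": [],
}

def reachable(start, config):
    """Return every location an actor starting at `start` can reach."""
    seen, queue = {start}, deque([start])
    while queue:
        here = queue.popleft()
        for nxt in config.get(here, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(reachable("lobby", access))
```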

  10. Network class superposition analyses.

    Directory of Open Access Journals (Sweden)

    Carl A B Pearson

    Full Text Available Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10³⁰ for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from Boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology Boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
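    As one plausible reading of a T-based entropy (the paper's exact definition may differ), the sketch below computes the Shannon entropy of each row of a stochastic matrix T, i.e. the uncertainty about the successor state given the current state; the matrix is a toy example.

```python
import numpy as np

def row_entropies(T):
    """Shannon entropy (bits) of each row of a stochastic matrix T,
    i.e. the uncertainty about the next state given the current one."""
    T = np.asarray(T, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(T > 0, np.log2(T), 0.0)   # 0 * log 0 treated as 0
    return -(T * logs).sum(axis=1)

# Toy ensemble matrix over three states; each row sums to 1.
T = np.array([[0.5, 0.5, 0.0],
              [0.0, 1.0, 0.0],    # deterministic transition: zero entropy
              [0.2, 0.3, 0.5]])
print(row_entropies(T))
```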

  11. Integrating and scheduling an open set of static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Mezini, Mira; Kloppenburg, Sven

    2006-01-01

    to keep the set of analyses open. We propose an approach to integrating and scheduling an open set of static analyses which decouples the individual analyses and coordinates the analysis executions such that the overall time and space consumption is minimized. The approach has been implemented for the Eclipse IDE and has been used to integrate a wide range of analyses such as finding bug patterns, detecting violations of design guidelines, or type system extensions for Java.

  12. Implementation Strategy

    Science.gov (United States)

    1983-01-01

    Meeting the identified needs of Earth science requires approaching EOS as an information system and not simply as one or more satellites with instruments. Six elements of strategy are outlined as follows: implementation of the individual discipline missions as currently planned; use of sustained observational capabilities offered by operational satellites without waiting for the launch of new missions; put first priority on the data system; deploy an Advanced Data Collection and Location System; put a substantial new observing capability in a low Earth orbit in such a way as to provide for sustained measurements; and group instruments to exploit their capabilities for synergism, maximize the scientific utility of the mission, and minimize the costs of implementation where possible.

  13. Implementing Pseudonymity

    Directory of Open Access Journals (Sweden)

    Miranda Mowbray

    2006-03-01

    Full Text Available I will give an overview of some technologies that enable pseudonymity - allowing individuals to reveal or prove information about themselves to others without revealing their full identity. I will describe some functionalities relating to pseudonymity that can be implemented, and some that cannot. My intention is to present enough of the mathematics that underlies technology for pseudonymity to show that it is indeed possible to implement some functionalities that at first glance may appear impossible. In particular, I will show that several of the intended functions of the UK national ID could be provided in a pseudonymous fashion, allowing greater privacy. I will also outline some technology developed at HP Labs which ensures that users’ personal data is released only to software that has been checked to conform to their preferred privacy policies.
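    A very small building block in this spirit, sketched below, derives a stable pseudonym from an identity with a keyed hash so that different domains see unlinkable identifiers while the key holder can still recompute them. It is only an illustrative primitive under stated assumptions, not the HP Labs technology or the specific schemes discussed in the article.

```python
import hmac, hashlib

def pseudonym(identity: str, domain: str, secret: bytes) -> str:
    """Derive a stable, per-domain pseudonym for `identity`.  Whoever holds
    `secret` can recompute (and thus verify) the pseudonym, but the pseudonym
    itself does not reveal the identity, and pseudonyms from different
    domains cannot be linked without the key."""
    msg = f"{domain}:{identity}".encode()
    return hmac.new(secret, msg, hashlib.sha256).hexdigest()

secret = b"issuer-held key, never published"              # hypothetical issuer key
print(pseudonym("alice@example.org", "health-service", secret))
print(pseudonym("alice@example.org", "tax-office", secret))   # different pseudonym
```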

  14. Fusion Implementation

    International Nuclear Information System (INIS)

    Schmidt, J.A.

    2002-01-01

    If a fusion DEMO reactor can be brought into operation during the first half of this century, fusion power production can have a significant impact on carbon dioxide production during the latter half of the century. An assessment of fusion implementation scenarios shows that the resource demands and waste production associated with these scenarios are manageable factors. If fusion is implemented during the latter half of this century it will be one element of a portfolio of (hopefully) carbon dioxide limiting sources of electrical power. It is time to assess the regional implications of fusion power implementation. An important attribute of fusion power is the wide range of possible regions of the country, or countries in the world, where power plants can be located. Unlike most renewable energy options, fusion energy will function within a local distribution system and not require costly, and difficult, long distance transmission systems. For example, the East Coast of the United States is a prime candidate for fusion power deployment by virtue of its distance from renewable energy sources. As fossil fuels become less and less available as an energy option, the transmission of energy across bodies of water will become very expensive. On a global scale, fusion power will be particularly attractive for regions separated from sources of renewable energy by oceans

  15. Seismic fragility analyses

    International Nuclear Information System (INIS)

    Kostov, Marin

    2000-01-01

    In the last two decades there has been an increasing number of probabilistic seismic risk assessments performed. The basic ideas of the procedure for performing a Probabilistic Safety Analysis (PSA) of critical structures (NUREG/CR-2300, 1983) could be used also for normal industrial and residential buildings, dams or other structures. The general formulation of the risk assessment procedure applied in this investigation is presented in Franzini, et al., 1984. The probability of failure of a structure for an expected lifetime (for example 50 years) can be obtained from the annual frequency of failure, β_E, determined by the relation: β_E = ∫ [dβ(x)/dx] P(f|x) dx. β(x) is the annual frequency of exceedance of load level x (for example, the variable x may be peak ground acceleration), P(f|x) is the conditional probability of structure failure at a given seismic load level x. The problem leads to the assessment of the seismic hazard β(x) and the fragility P(f|x). The seismic hazard curves are obtained by the probabilistic seismic hazard analysis. The fragility curves are obtained after the response of the structure is defined as probabilistic and its capacity and the associated uncertainties are assessed. Finally the fragility curves are combined with the seismic loading to estimate the frequency of failure for each critical scenario. The frequency of failure due to a seismic event is represented by the scenario with the highest frequency. The tools usually applied for probabilistic safety analyses of critical structures could relatively easily be adopted to ordinary structures. The key problems are the seismic hazard definitions and the fragility analyses. The fragility could be derived either based on scaling procedures or on the base of generation. Both approaches have been presented in the paper. After the seismic risk (in terms of failure probability) is assessed there are several approaches for risk reduction. Generally the methods could be classified in two groups. The
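    To illustrate the convolution of hazard and fragility, the sketch below evaluates β_E numerically with an invented power-law hazard curve and a lognormal fragility curve; both curves and all parameter values are assumptions chosen only for demonstration.

```python
import math

def lognormal_fragility(x, median, beta):
    """P(f | x): conditional probability of failure at load level x,
    modelled here as a lognormal fragility curve (a common choice)."""
    z = math.log(x / median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def annual_failure_frequency(hazard, fragility, x_grid):
    """Numerically evaluate beta_E = integral |d beta(x)/dx| * P(f|x) dx."""
    total = 0.0
    for lo, hi in zip(x_grid[:-1], x_grid[1:]):
        d_beta = hazard(lo) - hazard(hi)           # frequency of x in [lo, hi)
        total += d_beta * fragility(0.5 * (lo + hi))
    return total

# Illustrative (made-up) hazard curve: annual exceedance frequency of PGA x [g].
hazard = lambda x: 1e-2 * (0.1 / x) ** 2.5
frag = lambda x: lognormal_fragility(x, median=0.6, beta=0.4)
x_grid = [0.05 * 1.05 ** i for i in range(120)]    # from 0.05 g upwards
print(annual_failure_frequency(hazard, frag, x_grid))
```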

  16. Website-analyse

    DEFF Research Database (Denmark)

    Thorlacius, Lisbeth

    2009-01-01

    The website is increasingly the preferred medium for information searching, company presentation, e-commerce, entertainment, teaching and social contact. In step with this growing diversity of communication activities on the web, more focus has been placed on optimising the design and planning of the functional and content-related aspects of websites. There is a large body of theory and method books specialising in the technical issues of interaction and navigation, as well as the linguistic content of websites. The Danish HCI (Human Computer Interaction ... or dead ends when he/she visits the site. However, studies in the design and analysis of the visual and aesthetic aspects of the planning and use of websites have only to a limited extent been the subject of reflective treatment. That is the background for this chapter, which opens with a review of aesthetics ...

  17. A channel profile analyser

    International Nuclear Information System (INIS)

    Gobbur, S.G.

    1983-01-01

    It is well understood that due to the wide band noise present in a nuclear analog-to-digital converter, events at the boundaries of adjacent channels are shared. It is a difficult and laborious process to exactly find out the shape of the channels at the boundaries. A simple scheme has been developed for the direct display of the channel shape of any type of ADC on a cathode ray oscilloscope display. This has been accomplished by sequentially incrementing the reference voltage of a precision pulse generator by a fraction of a channel and storing ADC data in alternate memory locations of a multichannel pulse height analyser. Alternate channels are needed due to the sharing at the boundaries of channels. In the flat region of the profile alternate memory locations are channels with zero counts and channels with the full scale counts. At the boundaries all memory locations will have counts. The shape of this distribution is a direct display of the channel boundaries. (orig.)
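    A toy simulation of the described scheme is sketched below: a pulser amplitude is stepped in fractions of a channel, Gaussian ADC noise shares events across adjacent channels, and the per-step fractions trace out the channel profile. The noise level, step size and channel numbers are assumed values, not those of the instrument.

```python
import numpy as np

# Step a precision pulser in fractions of a channel, digitise many pulses per
# step, and record which channel each falls into.  Channel edges are smeared
# by wide-band ADC noise (assumed Gaussian here).
rng = np.random.default_rng(2)
noise_rms = 0.15                               # ADC noise in channel units (assumed)
steps = np.arange(99.0, 102.0, 0.05)           # pulser amplitude, channel units

profile = {ch: [] for ch in (99, 100, 101)}
for amp in steps:
    codes = np.floor(amp + rng.normal(0.0, noise_rms, size=2000)).astype(int)
    counts = np.bincount(codes, minlength=103)
    for ch in profile:
        profile[ch].append(counts[ch] / 2000.0)
# profile[100] now traces the shape of channel 100, including the shared
# (noise-broadened) regions at its boundaries with channels 99 and 101.
```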

  18. Analysis and Experimental Implementation of a Heuristic Strategy for Onboard Energy Management of a Hybrid Solar Vehicle Analyse et expérimentation d’une stratégie heuristique pour la gestion d’énergie à bord d’un véhicule hybride solaire

    Directory of Open Access Journals (Sweden)

    Coraggio G.

    2013-05-01

    Full Text Available This paper focuses on the simulation analysis and the experimental implementation of a Rule-Based (RB control strategy for on-board energy management of a Hybrid Solar Vehicle (HSV, consisting in a series hybrid electric vehicle assisted by photovoltaic panels. The RB strategy consists of two tasks: one external, which determines the final battery State of Charge (SOC to be reached at the end of the driving schedule to allow full exploitation of solar energy during parking phase; the other internal, whose aim is to define the optimal Electric Generator (ICE-EG power trajectory and SOC oscillation around the final value. This control strategy has been implemented in a real time NI® cRIO control unit, thus allowing to perform experimental tests for energy management validation on a real HSV prototype developed at the University of Salerno. Ce document présente l’analyse et la mise en oeuvre d’expérimentation de règles bases RB (Rule Base de stratégie de contrôle pour la gestion d’énergie à bord d’un véhicule hybride solaire HSV (Hybrid Solar Vehicle qui est constitué d’un véhicule hybride électrique fabriqué en série et alimenté par des panneaux photovoltaïques. La stratégie RB se compose de deux tâches : l’une externe, qui détermine l’état final de charge de la batterie (SOC, State of Charge qui doit être atteint à la fin du cycle de conduite pour permettre la pleine exploitation de l’énergie solaire pendant la phase de stationnement, l’autre interne, dont le but est de définir le générateur électrique optimal (ICEEG, Internal Combustion Engine – Electric Generator, la trajectoire de la puissance et l’oscillation du SOC autour de la valeur finale. Cette stratégie de contrôle a été mise en oeuvre en temps réel dans une unité de contrôle NI®cRIO (National Instruments compact RIO, permettant ainsi d’effectuer des essais expérimentaux pour la validation de la gestion d’énergie sur un

  19. Chapter No.4. Safety analyses

    International Nuclear Information System (INIS)

    2002-01-01

    for NPP V-1 Bohunice and on review of the impact of the modelling of selected components to the results of calculation safety analysis (a sensitivity study for NPP Mochovce). In 2001 UJD joined a new European project Alternative Approaches to the Safety Performance Indicators. The project is aimed at the information collecting and determining of approaches and recommendations for implementation of the risk oriented indicators, identification of the impact of the safety culture level and organisational culture on safety and applying of indicators to the needs of regulators and operators. In frame of the PHARE project UJD participated in the task focused on severe accident mitigation for nuclear power plants with VVER-440/V213 units. The main results of the analyses of nuclear power plants responses to severe accidents were summarised and the state of their analytical base performed in the past was evaluated within the project. Possible severe accident mitigation and preventative measures were proposed and their applicability for the nuclear power plants with VVER-440/V213 was investigated. The obtained results will be used in assessment activities and accident management of UJD. UJD has been involved also in EVITA project which makes a part of the 5 th EC Framework Programme. The project aims at validation of the European computer code ASTEC dedicated for severe accidents modelling. In 2001 the ASTEC computer code was tested on different platforms. The results of the testing are summarised in the technical report of EC issued in September 2001. Further activities within this project were focused on performing of selected accident scenarios analyses and comparison of the obtained results with the analyses realised with the help of other computer codes. The work on the project will continue in 2002. In 2001 a groundwork on establishing the Centre for Nuclear Safety in Central and Eastern Europe (CENS), the seat of which is going to be in Bratislava, has continued. The

  20. NOAA's National Snow Analyses

    Science.gov (United States)

    Carroll, T. R.; Cline, D. W.; Olheiser, C. M.; Rost, A. A.; Nilsson, A. O.; Fall, G. M.; Li, L.; Bovitz, C. T.

    2005-12-01

    NOAA's National Operational Hydrologic Remote Sensing Center (NOHRSC) routinely ingests all of the electronically available, real-time, ground-based, snow data; airborne snow water equivalent data; satellite areal extent of snow cover information; and numerical weather prediction (NWP) model forcings for the coterminous U.S. The NWP model forcings are physically downscaled from their native 13 km² spatial resolution to a 1 km² resolution for the CONUS. The downscaled NWP forcings drive an energy-and-mass-balance snow accumulation and ablation model at a 1 km² spatial resolution and at a 1 hour temporal resolution for the country. The ground-based, airborne, and satellite snow observations are assimilated into the snow model's simulated state variables using a Newtonian nudging technique. The principal advantages of the assimilation technique are: (1) approximate balance is maintained in the snow model, (2) physical processes are easily accommodated in the model, and (3) asynoptic data are incorporated at the appropriate times. The snow model is reinitialized with the assimilated snow observations to generate a variety of snow products that combine to form NOAA's NOHRSC National Snow Analyses (NSA). The NOHRSC NSA incorporate all of the information necessary and available to produce a "best estimate" of real-time snow cover conditions at 1 km² spatial resolution and 1 hour temporal resolution for the country. The NOHRSC NSA consist of a variety of daily, operational products that characterize real-time snowpack conditions including: snow water equivalent, snow depth, surface and internal snowpack temperatures, surface and blowing snow sublimation, and snowmelt for the CONUS. The products are generated and distributed in a variety of formats including: interactive maps, time-series, alphanumeric products (e.g., mean areal snow water equivalent on a hydrologic basin-by-basin basis), text and map discussions, map animations, and quantitative gridded products
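    In generic form, Newtonian nudging relaxes the model state toward observations by adding a term proportional to the innovation; the weighting function and relaxation coefficient below are schematic assumptions, not NOHRSC's specific implementation.

```latex
% Generic Newtonian-nudging (relaxation) form (assumed, schematic):
\frac{\partial X}{\partial t} \;=\; F(X, t) \;+\; G\,W(x, t)\,\bigl(X_{\mathrm{obs}} - X\bigr)
```

    where F is the snow-model physics, X_obs the ground-based, airborne or satellite observation, W a space-time weighting function and G the relaxation coefficient.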

  1. Advanced toroidal facility vacuum vessel stress analyses

    International Nuclear Information System (INIS)

    Hammonds, C.J.; Mayhall, J.A.

    1987-01-01

    The complex geometry of the Advanced Toroidal Facility (ATF) vacuum vessel required special analysis techniques in investigating the structural behavior of the design. The response of a large-scale finite element model was found for transportation and operational loading. Several computer codes and systems, including the National Magnetic Fusion Energy Computer Center Cray machines, were implemented in accomplishing these analyses. The work combined complex methods that taxed the limits of both the codes and the computer systems involved. Using MSC/NASTRAN cyclic-symmetry solutions permitted using only 1/12 of the vessel geometry to mathematically analyze the entire vessel. This allowed the greater detail and accuracy demanded by the complex geometry of the vessel. Critical buckling-pressure analyses were performed with the same model. The development, results, and problems encountered in performing these analyses are described. 5 refs., 3 figs

  2. Micromechanical Analyses of Sturzstroms

    Science.gov (United States)

    Imre, Bernd; Laue, Jan; Springman, Sarah M.

    2010-05-01

    have been made observable and reproducible within a physical and a distinct element numerical modelling environment (DEM). A deterministic fractal analytical comminution model (Sammis et al., 1987; Steacy and Sammis, 1991) serves as the link between field evidence gained from the deposits of natural sturzstroms, the physical model within the ETH Geotechnical Drum Centrifuge (Springman et al., 2001) and the numerical model PFC-3D (Cundall and Strack, 1979; Itasca, 2005). This approach allowed studying the effects of dynamic fragmentation within sturzstroms at true (macro) scale within the distinct element model, by allowing for a micro-mechanical, distinct particle based, and cyclic description of fragmentation at the same time, without losing significant computational efficiency. These experiments indicate rock mass and boundary conditions which allow an alternating fragmenting and dilating dispersive regime to evolve and to be sustained long enough to replicate the spreading and run out of sturzstroms. The fragmenting spreading model supported here is able to explain the run out of a dry granular flow, beyond the travel distance predicted by a Coulomb frictional sliding model, without resorting to explanations by mechanics that can only be valid for certain specific boundary conditions. The implications derived suggest that a sturzstrom, because of its strong relation to internal fractal fragmentation and other inertial effects, constitutes a landslide category of its own. Its mechanics differ significantly from all other gravity driven mass flows. This proposition does not exclude the possible appearance of frictionites, Toma hills or suspension flows etc., but it considers them as secondary features. The application of a fractal comminution model to describe natural and experimental sturzstrom deposits turned out to be a useful tool for sturzstrom research. Implemented within the DEM, it allows simulating the key features of sturzstrom successfully and
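    The fractal comminution models cited above describe the fragment population by a power-law number-size relation of the standard form below; the exponent value quoted is an often-cited approximation for constrained-comminution models of the Sammis type, not a result of this study.

```latex
% Standard fractal number-size relation (exponent value indicative only):
N(>r) \;\propto\; r^{-D}, \qquad D \approx 2.6 \ \text{(constrained comminution)}
```

    where N(>r) is the number of fragments larger than size r and D is the fractal dimension of fragmentation.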

  3. Mobile Portal Implementation Strategy

    DEFF Research Database (Denmark)

    Gao, Ping; Damsgaard, Jan

    2005-01-01

    Mobile portals play an important role in the mobile commerce market. Current literature focuses on static analysis of the value chain of mobile portals. This article provides a dynamic perspective on mobile portal strategy. Drawing upon network economics, we describe mobile portal implementation as a four-phase process. In each phase, a portal provider has various challenges to overcome and adopts diverse strategies, and correspondingly the regulator has different foci. The conceptual framework proposed in this article offers a basis for further analyses on the market dynamics of mobile commerce, and can be generalized to studying other networked technologies...

  4. A Java Bytecode Metamodel for Composable Program Analyses

    NARCIS (Netherlands)

    Yildiz, Bugra Mehmet; Bockisch, Christoph; Rensink, Arend; Aksit, Mehmet; Seidl, Martina; Zschaler, Steffen

    Program analyses are an important tool to check if a system fulfills its specification. A typical implementation strategy for program analyses is to use an imperative, general-purpose language like Java; and access the program to be analyzed through libraries for manipulating intermediate code, such

  5. Implementing PAT with Standards

    Science.gov (United States)

    Chandramohan, Laakshmana Sabari; Doolla, Suryanarayana; Khaparde, S. A.

    2016-02-01

    Perform Achieve Trade (PAT) is a market-based incentive mechanism to promote energy efficiency. The purpose of this work is to address the challenges inherent to inconsistent representation of business processes, and interoperability issues in PAT-like cap-and-trade mechanisms, especially when scaled. Studies by various agencies have highlighted that as the mechanism evolves, including more industrial sectors and industries in its ambit, implementation will become more challenging. This paper analyses the major needs of PAT (namely tracking, monitoring, auditing & verifying energy-saving reports, and providing technical support & guidance to stakeholders), and how the aforesaid challenges affect them. Though current technologies can handle these challenges to an extent, standardization activities for implementation have been scanty for PAT and this work attempts to evolve them. The inconsistent modification of business processes, rules, and procedures across stakeholders, and interoperability among heterogeneous systems are addressed. This paper proposes the adoption of two specific standards into PAT, namely Business Process Model and Notation for maintaining consistency in business process modelling, and Common Information Model (IEC 61970, 61968, 62325 combined) for information exchange. Detailed architecture and organization of these adoptions are reported. The work can be used by PAT implementing agencies, stakeholders, and standardization bodies.

  6. Cost/benefit analyses of environmental impact

    International Nuclear Information System (INIS)

    Goldman, M.I.

    1974-01-01

    Various aspects of cost-benefit analyses are considered. Some topics discussed are: regulations of the National Environmental Policy Act (NEPA); statement of AEC policy and procedures for implementation of NEPA; Calvert Cliffs decision; AEC Regulatory Guide; application of risk-benefit analysis to nuclear power; application of the as low as practicable (ALAP) rule to radiation discharges; thermal discharge restrictions proposed by EPA under the 1972 Amendment to the Water Pollution Control Act; estimates of somatic and genetic insult per unit population exposure; occupational exposure; EPA Point Source Guidelines for Discharges from Steam Electric Power Plants; and costs of closed-cycle cooling using cooling towers. (U.S.)

  7. Implementing Genome-Driven Oncology

    Science.gov (United States)

    Hyman, David M.; Taylor, Barry S.; Baselga, José

    2017-01-01

    Early successes in identifying and targeting individual oncogenic drivers, together with the increasing feasibility of sequencing tumor genomes, have brought forth the promise of genome-driven oncology care. As we expand the breadth and depth of genomic analyses, the biological and clinical complexity of its implementation will be unparalleled. Challenges include target credentialing and validation, implementing drug combinations, clinical trial designs, targeting tumor heterogeneity, and deploying technologies beyond DNA sequencing, among others. We review how contemporary approaches are tackling these challenges and will ultimately serve as an engine for biological discovery and increase our insight into cancer and its treatment. PMID:28187282

  8. InGaN/AlGaInN-based ultraviolet light-emitting diodes with indium gallium tin oxide electrodes

    International Nuclear Information System (INIS)

    Kim, Sukwon; Kim, Tae Geun

    2015-01-01

    In this study, In- and Sn-doped GaO (IGTO) is proposed as an alternative transparent conductive electrode to indium tin oxide (ITO) to improve the performance of InGaN/AlGaInN-based near ultraviolet light-emitting diodes (NUV LEDs). IGTO films were prepared by co-sputtering the ITO and Ga₂O₃ targets under various target power ratios. Among those, IGTO films post-annealed at 700 °C under a hydrogen environment gave rise to a transmittance of 94% at 385 nm and a contact resistance of 9.4 × 10⁻³ Ω-cm² with a sheet resistance of 124 Ω/sq. Compared to ITO-based NUV LEDs, the IGTO-based NUV LED showed a 9% improvement in the light output power, probably due to IGTO's higher transmittance, although the forward voltage was still higher by 0.23 V. - Highlights: • Indium gallium tin oxide (IGTO) for near-ultraviolet light-emitting diode is proposed. • IGTO is fabricated by co-sputtering the ITO and Ga₂O₃ targets and hydrogen annealing. • IGTO shows a 94% transmittance at 385 nm and a 9.4 × 10⁻³ Ω-cm² contact resistance. • Near-ultraviolet light-emitting diode with IGTO shows improved optical performance.

  9. InGaN/AlGaInN-based ultraviolet light-emitting diodes with indium gallium tin oxide electrodes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sukwon; Kim, Tae Geun, E-mail: tgkim1@korea.ac.kr

    2015-09-30

    In this study, In- and Sn-doped GaO (IGTO) is proposed as an alternative transparent conductive electrode to indium tin oxide (ITO) to improve the performance of InGaN/AlGaInN-based near ultraviolet light-emitting diodes (NUV LEDs). IGTO films were prepared by co-sputtering the ITO and Ga₂O₃ targets under various target power ratios. Among those, IGTO films post-annealed at 700 °C under a hydrogen environment gave rise to a transmittance of 94% at 385 nm and a contact resistance of 9.4 × 10⁻³ Ω·cm² with a sheet resistance of 124 Ω/sq. Compared to ITO-based NUV LEDs, the IGTO-based NUV LED showed a 9% improvement in light output power, probably due to IGTO's higher transmittance, although the forward voltage was still higher by 0.23 V. - Highlights: • Indium gallium tin oxide (IGTO) for near-ultraviolet light-emitting diodes is proposed. • IGTO is fabricated by co-sputtering the ITO and Ga₂O₃ targets and hydrogen annealing. • IGTO shows a 94% transmittance at 385 nm and a 9.4 × 10⁻³ Ω·cm² contact resistance. • A near-ultraviolet light-emitting diode with IGTO shows improved optical performance.

  10. BWR core melt progression phenomena: Experimental analyses

    International Nuclear Information System (INIS)

    Ott, L.J.

    1992-01-01

    In the BWR Core Melt Progression Phenomena Program, experimental results concerning severe fuel damage and core melt progression in BWR core geometry are used to evaluate existing models of the governing phenomena. These include control blade eutectic liquefaction and the subsequent relocation and attack on the channel box structure; oxidation heating and hydrogen generation; Zircaloy melting and relocation; and the continuing oxidation of zirconium with metallic blockage formation. Integral data have been obtained from the BWR DF-4 experiment in the ACRR and from BWR tests in the German CORA ex-reactor fuel-damage test facility. Additional integral data will be obtained from a new CORA BWR test, the full-length FLHT-6 BWR test in the NRU test reactor, and the new program of ex-reactor experiments at Sandia National Laboratories (SNL) on metallic melt relocation and blockage formation. An essential part of this activity is the interpretation and use of the results of the BWR tests. The Oak Ridge National Laboratory (ORNL) has developed experiment-specific models for analysis of the BWR experiments; to date, these models have permitted far more precise analyses of the conditions in these experiments than have previously been available. These analyses have provided a basis for more accurate interpretation of the phenomena that the experiments are intended to investigate. The results of posttest analyses of BWR experiments are discussed and significant findings from these analyses are explained. The ORNL control blade/canister models with materials interaction, relocation and blockage models are currently being implemented in SCDAP/RELAP5 as an optional structural component.

  11. Sample preparation in foodomic analyses.

    Science.gov (United States)

    Martinović, Tamara; Šrajer Gajdošik, Martina; Josić, Djuro

    2018-04-16

    Representative sampling and adequate sample preparation are key factors for the successful performance of further steps in foodomic analyses, as well as for correct data interpretation. Incorrect sampling and improper sample preparation can be sources of severe bias in foodomic analyses, and it is well known that errors introduced at these stages cannot be corrected later. These facts, frequently neglected in the past, are now taken into consideration, and the progress in sampling and sample preparation in foodomics is reviewed here. We report the use of highly sophisticated instruments for both high-performance and high-throughput analyses, as well as miniaturization and the use of laboratory robotics in metabolomics, proteomics, peptidomics and genomics.

  12. Critical factors for EIA implementation

    DEFF Research Database (Denmark)

    Zhang, Jasmine; Kørnøv, Lone; Christensen, Per

    2013-01-01

    After decades of development, the gap between expectations of Environment Impact Assessments (EIA) and their practical performance remains significant. Research has been done to identify the critical factors for an effective implementation of EIA. However, this research, to a large extent, has...... not been cumulated and analysed comprehensively according to the stages of the EIA process. This paper contributes to the critical review of the literature on EIA implementation and effectiveness by cumulating mainly empirical findings in an implementation theoretical perspective. It focuses on the links...... between different critical factors and how they relate to different stages in the EIA and thus influence the decision making process. After reviewing 33 refereed journal articles published between 1999 and 2011, we identified 203 notions of critical factors. Of these, 102 related to different stages...

  13. Descriptive Analyses of Mechanical Systems

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup; Hansen, Claus Thorp

    2003-01-01

    Foreword: Product analysis and technology analysis can be carried out with a broad socio-technical aim in order to understand cultural, sociological, design-related, business-related and many other aspects. A sub-area of this is the systemic analysis and description of products and systems. The present compend...

  14. Analysing and Comparing Encodability Criteria

    Directory of Open Access Journals (Sweden)

    Kirstin Peters

    2015-08-01

    Full Text Available Encodings, or proofs of their absence, are the main way to compare process calculi. To analyse the quality of encodings and to rule out trivial or meaningless encodings, they are augmented with quality criteria. There exist many different criteria, and different variants of these criteria, for reasoning in different settings. This leads to incomparable results. Moreover, it is not always clear whether the criteria used to obtain a result in a particular setting do indeed fit that setting. We show how to formally reason about and compare encodability criteria by mapping them onto requirements on a relation between source and target terms that is induced by the encoding function. In particular we analyse the common criteria full abstraction, operational correspondence, divergence reflection, success sensitiveness, and respect of barbs; e.g. we analyse the exact nature of the simulation relation (coupled simulation versus bisimulation) that is induced by different variants of operational correspondence. This way we reduce the problem of analysing or comparing encodability criteria to the better understood problem of comparing relations on processes.

  15. Analysing Children's Drawings: Applied Imagination

    Science.gov (United States)

    Bland, Derek

    2012-01-01

    This article centres on a research project in which freehand drawings provided a richly creative and colourful data source of children's imagined, ideal learning environments. Issues concerning the analysis of the visual data are discussed, in particular, how imaginative content was analysed and how the analytical process was dependent on an…

  16. Impact analyses after pipe rupture

    International Nuclear Information System (INIS)

    Chun, R.C.; Chuang, T.Y.

    1983-01-01

    Two of the French pipe whip experiments are reproduced with the computer code WIPS. The WIPS results are in good agreement with the experimental data and the French computer code TEDEL. This justifies the use of its pipe element in conjunction with its U-bar element in a simplified method of impact analyses

  17. Millifluidic droplet analyser for microbiology

    NARCIS (Netherlands)

    Baraban, L.; Bertholle, F.; Salverda, M.L.M.; Bremond, N.; Panizza, P.; Baudry, J.; Visser, de J.A.G.M.; Bibette, J.

    2011-01-01

    We present a novel millifluidic droplet analyser (MDA) for precisely monitoring the dynamics of microbial populations over multiple generations in numerous (≈10³) aqueous emulsion droplets (100 nL). As a first application, we measure the growth rate of a bacterial strain and determine the minimal

  18. Analyser of sweeping electron beam

    International Nuclear Information System (INIS)

    Strasser, A.

    1993-01-01

    The electron beam analyser has an array of conductors that can be positioned in the field of the sweeping beam, an electronic signal treatment system for the analysis of the signals generated in the conductors by the incident electrons and a display for the different characteristics of the electron beam

  19. Global post-Kyoto scenario analyses at PSI

    Energy Technology Data Exchange (ETDEWEB)

    Kypreos, S [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1999-08-01

    Scenario analyses are described here using the Global MARKAL-Macro Trade (GMMT) model to study the economic implications of the Kyoto Protocol to the UN Convention on Climate change. Some conclusions are derived in terms of efficient implementations of the post-Kyoto extensions of the Protocol. (author) 2 figs., 5 refs.

  20. Global post-Kyoto scenario analyses at PSI

    International Nuclear Information System (INIS)

    Kypreos, S.

    1999-01-01

    Scenario analyses are described here using the Global MARKAL-Macro Trade (GMMT) model to study the economic implications of the Kyoto Protocol to the UN Convention on Climate change. Some conclusions are derived in terms of efficient implementations of the post-Kyoto extensions of the Protocol. (author) 2 figs., 5 refs

  1. Implementation and de-implementation: two sides of the same coin?

    Science.gov (United States)

    van Bodegom-Vos, Leti; Davidoff, Frank; Marang-van de Mheen, Perla J

    2017-06-01

    Avoiding low value care has received increasing attention in many countries, as with the Choosing Wisely campaign and other initiatives to abandon care that wastes resources or delivers no benefit to patients. While an extensive literature characterises approaches to implementing evidence-based care, we have limited understanding of the process of de-implementation, such as abandoning existing low value practices. To learn more about the differences between implementation and de-implementation, we explored the literature and analysed data from two published studies (one implementation and one de-implementation) by the same orthopaedic surgeons. We defined 'leaders' as those orthopaedic surgeons who implemented, or de-implemented, the target processes of care and 'laggards' as those who did not. Our findings suggest that leaders in implementation share some characteristics with leaders in de-implementation when comparing them with laggards, such as being more open to new evidence, younger and having spent less time in clinical practice. However, leaders in de-implementation and implementation differed in some other characteristics and were not the same persons. Thus, leading in implementation or de-implementation may depend to some degree on the type of intervention rather than entirely reflecting personal characteristics. De-implementation seemed to be hampered by motivational factors such as department priorities, and by economic and political factors such as cost-benefit considerations in care delivery, whereas organisational factors were associated only with implementation. The only barrier or facilitator common to both implementation and de-implementation consisted of outcome expectancy (ie, the perceived net benefit to patients). Future studies need to test the hypotheses generated from this study and improve our understanding of differences between the processes of implementation and de-implementation in the people who are most likely to lead (or resist) these efforts. Published by the

  2. Implementing a Capital Plan.

    Science.gov (United States)

    Daigneau, William A.

    2003-01-01

    Addresses four questions regarding implementation of a long-term capital plan to manage a college's facilities portfolio: When should the projects be implemented? How should the capital improvements be implemented? What will it actually cost in terms of project costs as well as operating costs? Who will implement the plan? (EV)

  3. Analyser Framework to Verify Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2009-01-01

    Full Text Available Today, it is important for software companies to build software systems in a short time interval, to reduce costs and to maintain a good market position. Therefore, well organized and systematic development approaches are required. Reusing software components which are well tested can be a good way to develop software applications in an effective manner. The reuse of software components is less expensive and less time consuming than development from scratch. But it is dangerous to think that software components can be combined without any problems. Individual software components are, of course, well tested, but when they are composed together problems can occur. Most problems arise from interaction and communication. To avoid such errors, a framework has to be developed for analysing software components. That framework determines the compatibility of corresponding software components. The promising approach discussed here presents a novel technique for analysing software components by applying an Abstract Syntax Language Tree (ASLT). A supportive environment will be designed that checks the compatibility of black-box software components. This article is concerned with the question of how coupled software components can be verified by using an analyser framework, and describes the use of the ASLT. Black-box software components and the Abstract Syntax Language Tree are the basis for developing the proposed framework and are discussed here to provide the background knowledge. The practical implementation of this framework is discussed, and results are shown using a test environment.
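
    As a loose illustration of the compatibility-checking idea (not the ASLT framework described in the record above), the sketch below compares two hypothetical component interface stubs by parsing them into syntax trees with Python's standard ast module; the stubs and the helper names are invented.

```python
import ast

def signature_tree(stub: str) -> ast.FunctionDef:
    """Parse a textual interface stub into a syntax tree (hypothetical helper)."""
    return ast.parse(stub).body[0]

def compatible(provided: str, required: str) -> bool:
    """Crude compatibility check: same operation name and same number of arguments."""
    p, r = signature_tree(provided), signature_tree(required)
    return p.name == r.name and len(p.args.args) == len(r.args.args)

# Two black-box components advertising the same operation.
print(compatible("def store(key, value): ...", "def store(key, value): ..."))  # True
print(compatible("def store(key, value): ...", "def store(key): ..."))         # False
```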

  4. Workload analyse of assembling process

    Science.gov (United States)

    Ghenghea, L. D.

    2015-11-01

    The workload is the most important indicator for managers responsible for industrial technological processes, no matter whether these are automated, mechanized or simply manual; in each case, machines or workers will be the focus of workload measurements. The paper deals with workload analyses made on a largely manual assembly technology for a roller bearing assembly process, executed in a big company with integrated bearing manufacturing processes. In these analyses the delay sampling technique has been used to identify and divide all bearing assemblers' activities, to obtain information about the share of the 480-minute working day that workers devote to each activity. The study shows some ways to increase process productivity without supplementary investments and also indicates that process automation could be the solution to gain maximum productivity.
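
    As a rough illustration of how delay-sampling tallies translate into shares of a 480-minute working day, a minimal sketch with invented observation counts (not data from the study above):

```python
# Hypothetical delay-sampling tallies: how often each activity was observed.
observations = {"pick parts": 120, "assemble rollers": 210,
                "inspect bearing": 90, "idle / delay": 60}

total = sum(observations.values())
workday_minutes = 480

for activity, count in observations.items():
    share = count / total
    print(f"{activity:18s} {share:6.1%}  ≈ {share * workday_minutes:5.1f} min/day")
```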

  5. Mitogenomic analyses from ancient DNA

    DEFF Research Database (Denmark)

    Paijmans, Johanna L. A.; Gilbert, Tom; Hofreiter, Michael

    2013-01-01

    The analysis of ancient DNA is playing an increasingly important role in conservation genetic, phylogenetic and population genetic analyses, as it allows incorporating extinct species into DNA sequence trees and adds time depth to population genetics studies. For many years, these types of DNA...... analyses (whether using modern or ancient DNA) were largely restricted to the analysis of short fragments of the mitochondrial genome. However, due to many technological advances during the past decade, a growing number of studies have explored the power of complete mitochondrial genome sequences...... yielded major progress with regard to both the phylogenetic positions of extinct species, as well as resolving population genetics questions in both extinct and extant species....

  6. Recriticality analyses for CAPRA cores

    International Nuclear Information System (INIS)

    Maschek, W.; Thiem, D.

    1995-01-01

    The first scoping calculations performed show that the energetics levels from recriticalities in CAPRA cores are in the same range as in conventional cores. However, considerable uncertainties exist and further analyses are necessary. Additional investigations are performed for the separation scenarios of fuel/steel/inert and matrix material, as a large influence of these processes on possible ramp rates and kinetics parameters was detected in the calculations. (orig./HP)

  7. Recriticality analyses for CAPRA cores

    Energy Technology Data Exchange (ETDEWEB)

    Maschek, W.; Thiem, D.

    1995-08-01

    The first scoping calculations performed show that the energetics levels from recriticalities in CAPRA cores are in the same range as in conventional cores. However, considerable uncertainties exist and further analyses are necessary. Additional investigations are performed for the separation scenarios of fuel/steel/inert and matrix material, as a large influence of these processes on possible ramp rates and kinetics parameters was detected in the calculations. (orig./HP)

  8. Technical center for transportation analyses

    International Nuclear Information System (INIS)

    Foley, J.T.

    1978-01-01

    A description is presented of an information search/retrieval/research activity of Sandia Laboratories which provides technical environmental information which may be used in transportation risk analyses, environmental impact statements, development of design and test criteria for packaging of energy materials, and transportation mode research studies. General activities described are: (1) history of center development; (2) environmental information storage/retrieval system; (3) information searches; (4) data needs identification; and (5) field data acquisition system and applications

  9. Methodology of cost benefit analyses

    International Nuclear Information System (INIS)

    Patrik, M.; Babic, P.

    2000-10-01

    The report addresses financial aspects of proposed investments and other steps which are intended to contribute to nuclear safety. The aim is to provide introductory insight into the procedures and potential of cost-benefit analyses as a routine guide when making decisions on costly provisions as one of the tools to assess whether a particular provision is reasonable. The topic is applied to the nuclear power sector. (P.A.)

  10. Implementing Target Value Design.

    Science.gov (United States)

    Alves, Thais da C L; Lichtig, Will; Rybkowski, Zofia K

    2017-04-01

    An alternative to the traditional way of designing projects is the process of target value design (TVD), which takes different departure points to start the design process. The TVD process starts with the client defining an allowable cost that needs to be met by the design and construction teams. An expected cost in the TVD process is defined through multiple interactions between multiple stakeholders who define wishes and others who define ways of achieving these wishes. Finally, a target cost is defined based on the expected profit the design and construction teams are expecting to make. TVD follows a series of continuous improvement efforts aimed at reaching the desired goals for the project and its associated target value cost. The process takes advantage of rapid cycles of suggestions, analyses, and implementation that starts with the definition of value for the client. In the traditional design process, the goal is to identify user preferences and find solutions that meet the needs of the client's expressed preferences. In the lean design process, the goal is to educate users about their values and advocate for a better facility over the long run; this way owners can help contractors and designers to identify better solutions. This article aims to inform the healthcare community about tools and techniques commonly used during the TVD process and how they can be used to educate and support project participants in developing better solutions to meet their needs now as well as in the future.

  11. Analysing the Wrongness of Killing

    DEFF Research Database (Denmark)

    Di Nucci, Ezio

    2014-01-01

    This article provides an in-depth analysis of the wrongness of killing by comparing different versions of three influential views: the traditional view that killing is always wrong; the liberal view that killing is wrong if and only if the victim does not want to be killed; and Don Marquis' future of value account of the wrongness of killing. In particular, I illustrate the advantages that a basic version of the liberal view and a basic version of the future of value account have over competing alternatives. Still, ultimately none of the views analysed here are satisfactory; but the different...

  12. Methodological challenges in carbohydrate analyses

    Directory of Open Access Journals (Sweden)

    Mary Beth Hall

    2007-07-01

    Full Text Available Carbohydrates can provide up to 80% of the dry matter in animal diets, yet their specific evaluation for research and diet formulation is only now becoming a focus in the animal sciences. Partitioning of dietary carbohydrates for nutritional purposes should reflect differences in digestion and fermentation characteristics and effects on animal performance. Key challenges to designating nutritionally important carbohydrate fractions include classifying the carbohydrates in terms of nutritional characteristics, and selecting analytical methods that describe the desired fraction. The relative lack of information on digestion characteristics of various carbohydrates and their interactions with other fractions in diets means that fractions will not soon be perfectly established. Developing a system of carbohydrate analysis that could be used across animal species could enhance the utility of analyses and the amount of data we can obtain on dietary effects of carbohydrates. Based on quantities present in diets and apparent effects on animal performance, some nutritionally important classes of carbohydrates that may be valuable to measure include sugars, starch, fructans, insoluble fiber, and soluble fiber. Essential to the selection of methods for these fractions is agreement on precisely what carbohydrates should be included in each. Each of these fractions has analyses that could potentially be used to measure them, but most of the available methods have weaknesses that must be evaluated to see if they are fatal and the assay is unusable, or if the assay may still be made workable. Factors we must consider as we seek to analyze carbohydrates to describe diets: Does the assay accurately measure the desired fraction? Is the assay for research, regulatory, or field use (this affects considerations of acceptable costs and throughput)? What are acceptable accuracy and variability of measures? Is the assay robust (this enhances accuracy of values)? For some carbohydrates, we

  13. Theorising and Analysing Academic Labour

    Directory of Open Access Journals (Sweden)

    Thomas Allmer

    2018-01-01

    Full Text Available The aim of this article is to contextualise universities historically within capitalism and to analyse academic labour and the deployment of digital media theoretically and critically. It argues that the post-war expansion of the university can be considered as medium and outcome of informational capitalism and as a dialectical development of social achievement and advanced commodification. The article strives to identify the class position of academic workers, introduces the distinction between academic work and labour, discusses the connection between academic, information and cultural work, and suggests a broad definition of university labour. It presents a theoretical model of working conditions that helps to systematically analyse the academic labour process and to provide an overview of working conditions at universities. The paper furthermore argues for the need to consider the development of education technologies as a dialectics of continuity and discontinuity, discusses the changing nature of the forces and relations of production, and the impact on the working conditions of academics in the digital university. Based on Erik Olin Wright’s inclusive approach of social transformation, the article concludes with the need to bring together anarchist, social democratic and revolutionary strategies for establishing a socialist university in a commons-based information society.

  14. CFD analyses in regulatory practice

    International Nuclear Information System (INIS)

    Bloemeling, F.; Pandazis, P.; Schaffrath, A.

    2012-01-01

    Numerical software is used in nuclear regulatory procedures for many problems in the fields of neutron physics, structural mechanics, thermal hydraulics etc. Among other things, the software is employed in dimensioning and designing systems and components and in simulating transients and accidents. In nuclear technology, analyses of this kind must meet strict requirements. Computational Fluid Dynamics (CFD) codes were developed for computing multidimensional flow processes of the type occurring in reactor cooling systems or in containments. Extensive experience has been accumulated by now in selected single-phase flow phenomena. At the present time, there is a need for development and validation with respect to the simulation of multi-phase and multi-component flows. As insufficient input by the user can lead to faulty results, the validity of the results and an assessment of uncertainties are guaranteed only through consistent application of so-called Best Practice Guidelines. The authors present the possibilities now available to CFD analyses in nuclear regulatory practice. This includes a discussion of the fundamental requirements to be met by numerical software, especially the demands upon computational analysis made by nuclear rules and regulations. In conclusion, 2 examples are presented of applications of CFD analysis to nuclear problems: Determining deboration in the condenser reflux mode of operation, and protection of the reactor pressure vessel (RPV) against brittle failure. (orig.)

  15. TECHNOLOGICAL IMPLEMENTATION PLAN

    DEFF Research Database (Denmark)

    Bellini, Anna

    2004-01-01

    This document has the purpose of describing the technological implementation plan in the IDEAL project.

  16. Technology Implementation Plan

    DEFF Research Database (Denmark)

    Jensen, Karsten Ingerslev; Schultz, Jørgen Munthe

    The Technology Implementation Plan (TIP) describes the main project results and the intended future use. The TIP is confidential.

  17. Implementing Student Information Systems

    Science.gov (United States)

    Sullivan, Laurie; Porter, Rebecca

    2006-01-01

    Implementing an enterprise resource planning system is a complex undertaking. Careful planning, management, communication, and staffing can make the difference between a successful and unsuccessful implementation. (Contains 3 tables.)

  18. Ecodesign Implementation and LCA

    DEFF Research Database (Denmark)

    McAloone, Tim C.; Pigosso, Daniela Cristina Antelmi

    2018-01-01

    implementation into manufacturing companies. Existing methods and tools for ecodesign implementation will be described, focusing on a multifaceted approach to environmental improvement through product development. Additionally, the use of LCA in an ecodesign implementation context will be further described...... in terms of the challenges and opportunities, together with the discussion of a selection of simplified LCA tools. Finally, a seven-step approach for ecodesign implementation which has been applied by several companies will be described....

  19. Severe accident recriticality analyses (SARA)

    DEFF Research Database (Denmark)

    Frid, W.; Højerup, C.F.; Lindholm, I.

    2001-01-01

    with all three codes. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both super-prompt power bursts and quasi steady-state power......Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies......, which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal g⁻¹, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg s⁻¹. In most cases, however, the predicted energy deposition was smaller, below...

  20. Hydrogen Analyses in the EPR

    International Nuclear Information System (INIS)

    Worapittayaporn, S.; Eyink, J.; Movahed, M.

    2008-01-01

    In severe accidents with core melting large amounts of hydrogen may be released into the containment. The EPR provides a combustible gas control system to prevent hydrogen combustion modes with the potential to challenge the containment integrity due to excessive pressure and temperature loads. This paper outlines the approach for the verification of the effectiveness and efficiency of this system. Specifically, the justification is a multi-step approach. It involves the deployment of integral codes, lumped parameter containment codes and CFD codes and the use of the sigma criterion, which provides the link to the broad experimental data base for flame acceleration (FA) and deflagration to detonation transition (DDT). The procedure is illustrated with an example. The performed analyses show that hydrogen combustion at any time does not lead to pressure or temperature loads that threaten the containment integrity of the EPR. (authors)

  1. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project
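
    As a generic illustration of uncertainty propagation and a crude sensitivity ranking (not the HEDR methodology or the HEDRIC codes), a minimal Monte Carlo sketch with an invented toy dose model and assumed parameter distributions:

```python
import random
import statistics

def toy_dose_model(release, dispersion, intake):
    """Invented model: dose grows with release and intake, is diluted by dispersion."""
    return release * intake / dispersion

random.seed(1)
samples = [(random.lognormvariate(0.0, 0.5),   # release (arbitrary units)
            random.uniform(1.0, 3.0),          # dispersion factor
            random.lognormvariate(-1.0, 0.3))  # intake fraction
           for _ in range(10_000)]
doses = [toy_dose_model(*s) for s in samples]

print("median dose:", round(statistics.median(doses), 3))
print("90th percentile:", round(sorted(doses)[int(0.9 * len(doses))], 3))

# Crude sensitivity ranking: Pearson correlation of each input with the output
# (statistics.correlation requires Python 3.10+).
for i, name in enumerate(["release", "dispersion", "intake"]):
    r = statistics.correlation([s[i] for s in samples], doses)
    print(f"correlation of dose with {name}: {r:+.2f}")
```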

  2. The hemispherical deflector analyser revisited

    Energy Technology Data Exchange (ETDEWEB)

    Benis, E.P. [Institute of Electronic Structure and Laser, P.O. Box 1385, 71110 Heraklion, Crete (Greece)], E-mail: benis@iesl.forth.gr; Zouros, T.J.M. [Institute of Electronic Structure and Laser, P.O. Box 1385, 71110 Heraklion, Crete (Greece); Department of Physics, University of Crete, P.O. Box 2208, 71003 Heraklion, Crete (Greece)

    2008-04-15

    Using the basic spectrometer trajectory equation for motion in an ideal 1/r potential derived in Eq. (101) of part I [T.J.M. Zouros, E.P. Benis, J. Electron Spectrosc. Relat. Phenom. 125 (2002) 221], the operational characteristics of a hemispherical deflector analyser (HDA) such as dispersion, energy resolution, energy calibration, input lens magnification and energy acceptance window are investigated from first principles. These characteristics are studied as a function of the entry point R₀ and the nominal value of the potential V(R₀) at entry. Electron-optics simulations and actual laboratory measurements are compared to our theoretical results for an ideal biased paracentric HDA using a four-element zoom lens and a two-dimensional position sensitive detector (2D-PSD). These results should be of particular interest to users of modern HDAs utilizing a PSD.

  3. The hemispherical deflector analyser revisited

    International Nuclear Information System (INIS)

    Benis, E.P.; Zouros, T.J.M.

    2008-01-01

    Using the basic spectrometer trajectory equation for motion in an ideal 1/r potential derived in Eq. (101) of part I [T.J.M. Zouros, E.P. Benis, J. Electron Spectrosc. Relat. Phenom. 125 (2002) 221], the operational characteristics of a hemispherical deflector analyser (HDA) such as dispersion, energy resolution, energy calibration, input lens magnification and energy acceptance window are investigated from first principles. These characteristics are studied as a function of the entry point R₀ and the nominal value of the potential V(R₀) at entry. Electron-optics simulations and actual laboratory measurements are compared to our theoretical results for an ideal biased paracentric HDA using a four-element zoom lens and a two-dimensional position sensitive detector (2D-PSD). These results should be of particular interest to users of modern HDAs utilizing a PSD.

  4. Analysing performance through value creation

    Directory of Open Access Journals (Sweden)

    Adrian TRIFAN

    2015-12-01

    Full Text Available This paper draws a parallel between measuring financial performance in two variants: the first one using data offered by accounting, which lays emphasis on maximizing profit, and the second one which aims to create value. The traditional approach to performance is based on some indicators from accounting data: ROI, ROE, EPS. Traditional management, based on analysing accounting data, has shown its limits, and a new approach is needed, based on creating value. The evaluation of value-based performance tries to avoid the errors due to accounting data by using other specific indicators: EVA, MVA, TSR, CVA. The main objective is shifted from maximizing income to maximizing the value created for shareholders. The theoretical part is accompanied by a practical analysis regarding the creation of value and an analysis of the main indicators which evaluate this concept.
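
    As a concrete illustration of one of the value-based indicators listed above, a minimal sketch of Economic Value Added (EVA = NOPAT − WACC × invested capital); the figures are invented and are not taken from the paper:

```python
def economic_value_added(nopat: float, wacc: float, invested_capital: float) -> float:
    """EVA: operating profit after tax minus the charge for the capital employed."""
    return nopat - wacc * invested_capital

# Hypothetical company figures (monetary units).
nopat = 1_200_000           # net operating profit after tax
wacc = 0.09                 # weighted average cost of capital
invested_capital = 10_000_000

eva = economic_value_added(nopat, wacc, invested_capital)
print(f"EVA = {eva:,.0f}")  # positive -> value created for shareholders
```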

  5. Implementing Replacement Cost Accounting

    Science.gov (United States)

    1976-12-01

    Implementing Replacement Cost Accounting. Master's thesis by John Ross Clickener, Naval Postgraduate School, Monterey, California. Available from the NPS Archive (Calhoun): http://hdl.handle.net/10945/17810

  6. [The maintenance of automatic analysers and associated documentation].

    Science.gov (United States)

    Adjidé, V; Fournier, P; Vassault, A

    2010-12-01

    The maintenance of automatic analysers and the associated documentation, which form part of the requirements of the ISO 15189 standard as well as of French regulation, have to be defined in the laboratory's policy. The management of periodic maintenance and of the associated documentation shall be implemented and fulfilled. The organisation of corrective maintenance has to be managed so as to avoid interruption of the work of the laboratory. The different recommendations concern the identification of materials, including automatic analysers, the environmental conditions to take into account, the documentation provided by the manufacturer, and the documents prepared by the laboratory, including maintenance procedures.

  7. Implementering & Performative Potentialer

    DEFF Research Database (Denmark)

    Damkjer, Annemarie

    organizational boundaries. Furthermore, the analysis reflects how specific modes of ordering in the local process of implementation perform and displace the technology of performance management. The study provides an alternative view of the performative potentials in implementation processes and specifically...... challenges the traditional models of implementation. It is suggested that we view implementation practices as performative in relation to the co-configuration of technology and organizational practices, and that both the co-configurative perspective and the materiality of implementation practices are included......This thesis investigates how technology is constituted as an object of implementation. Using the theoretical lens of actor-network theory, the thesis investigates how the technology of performance management becomes a matter of implementation in the Danish Defence and how the technology...

  8. Proteins analysed as virtual knots

    Science.gov (United States)

    Alexander, Keith; Taylor, Alexander J.; Dennis, Mark R.

    2017-02-01

    Long, flexible physical filaments are naturally tangled and knotted, from macroscopic string down to long-chain molecules. The existence of knotting in a filament naturally affects its configuration and properties, and may be very stable or disappear rapidly under manipulation and interaction. Knotting has been previously identified in protein backbone chains, for which these mechanical constraints are of fundamental importance to their molecular functionality, despite their being open curves in which the knots are not mathematically well defined; knotting can only be identified by closing the termini of the chain somehow. We introduce a new method for resolving knotting in open curves using virtual knots, which are a wider class of topological objects that do not require a classical closure and so naturally capture the topological ambiguity inherent in open curves. We describe the results of analysing proteins in the Protein Data Bank by this new scheme, recovering and extending previous knotting results, and identifying topological interest in some new cases. The statistics of virtual knots in protein chains are compared with those of open random walks and Hamiltonian subchains on cubic lattices, identifying a regime of open curves in which the virtual knotting description is likely to be important.

  9. Digital image analyser for autoradiography

    International Nuclear Information System (INIS)

    Muth, R.A.; Plotnick, J.

    1985-01-01

    The most critical parameter in quantitative autoradiography for the assay of tissue concentrations of tracers is the ability to obtain precise and accurate measurements of the optical density of the images. Existing high-precision systems for image analysis, rotating drum densitometers, are expensive, suffer from mechanical problems and are slow. More moderately priced and reliable video-camera-based systems are available, but their outputs generally do not have the uniformity and stability necessary for high-resolution quantitative autoradiography. The authors have designed and constructed an image analyser optimized for quantitative single- and multiple-tracer autoradiography, which the authors refer to as a memory-mapped charge-coupled device scanner (MM-CCD). The input is from a linear array of CCDs which is used to optically scan the autoradiograph. Images are digitized into 512 x 512 picture elements with 256 gray levels and the data are stored in buffer video memory in less than two seconds. Images can then be transferred to RAM memory by direct memory-mapping for further processing. Arterial blood curve data and optical-density-calibrated standards data can be entered, and the optical density images can be converted automatically to tracer concentration or functional images. In double-tracer studies, images produced from both exposures can be stored and processed in RAM to yield ''pure'' individual tracer concentration or functional images. Any processed image can be transmitted back to the buffer memory to be viewed on a monitor and processed for region-of-interest analysis.
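
    As a schematic illustration of the conversion step described above (optical density to tracer concentration via calibrated standards), a minimal sketch assuming a linear calibration; all values are invented and the instrument's actual calibration procedure may differ:

```python
# Optical densities of co-exposed calibration standards and their known activities
# (all values invented for illustration).
standard_od  = [0.10, 0.35, 0.60, 0.85]   # measured optical density
standard_act = [5.0, 20.0, 35.0, 50.0]    # known tracer activity, e.g. nCi/g

# Least-squares line OD -> activity, assuming an approximately linear response.
n = len(standard_od)
mean_od = sum(standard_od) / n
mean_act = sum(standard_act) / n
slope = sum((x - mean_od) * (y - mean_act) for x, y in zip(standard_od, standard_act)) \
        / sum((x - mean_od) ** 2 for x in standard_od)
intercept = mean_act - slope * mean_od

def od_to_activity(od: float) -> float:
    return slope * od + intercept

scan_line = [0.12, 0.40, 0.72]            # ODs from one line of the digitized image
print([round(od_to_activity(od), 1) for od in scan_line])   # -> [6.2, 23.0, 42.2]
```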

  10. Multichannel amplitude analyser for nuclear spectrometry

    International Nuclear Information System (INIS)

    Jankovic, S.; Milovanovic, B.

    2003-01-01

    A multichannel amplitude analyser with 4096 channels was designed. It is based on a fast 12-bit analog-to-digital converter. The intended purpose of the instrument is the recording of nuclear spectra by means of scintillation detectors. The computer link is established through an opto-isolated serial connection cable, thus reducing the instrument's sensitivity to disturbances originating from the digital circuitry. The data displayed on the screen are refreshed every 2.5 seconds. Impulse peak detection is implemented through differentiation of the amplified input signal, while synchronization with the data coming from the converter output is established by taking advantage of the internal 'pipeline' structure of the converter itself. The mode of operation of the built-in microcontroller ensures that there are no missed impulses, and a simple logic network prevents the initiation of the amplitude reading sequence for the next impulse in case it appears shortly after its predecessor. The solution proposed here demonstrated good performance at a comparatively low manufacturing cost, and is thus suitable for educational purposes (author)
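
    As a software analogue of the pulse-height sorting performed by the instrument described above, a minimal sketch that bins simulated peak amplitudes into 4096 channels; the full-scale voltage and the pulse distribution are assumptions, and the real analyser does this in hardware and firmware:

```python
import random

CHANNELS = 4096                     # 12-bit ADC -> 4096 amplitude channels
FULL_SCALE_V = 2.0                  # assumed full-scale pulse amplitude
spectrum = [0] * CHANNELS

random.seed(0)
# Simulated peak amplitudes (volts) clustered around a single photopeak.
pulses = [random.gauss(0.662, 0.02) for _ in range(5000)]

for amplitude in pulses:
    channel = int(amplitude / FULL_SCALE_V * CHANNELS)   # quantize to an ADC code
    if 0 <= channel < CHANNELS:
        spectrum[channel] += 1

peak = max(range(CHANNELS), key=spectrum.__getitem__)
print("peak at channel", peak, "with", spectrum[peak], "counts")
```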

  11. Interim Basis for PCB Sampling and Analyses

    International Nuclear Information System (INIS)

    BANNING, D.L.

    2001-01-01

    This document was developed as an interim basis for sampling and analysis of polychlorinated biphenyls (PCBs) and will be used until a formal data quality objective (DQO) document is prepared and approved. On August 31, 2000, the Framework Agreement for Management of Polychlorinated Biphenyls (PCBs) in Hanford Tank Waste was signed by the U.S. Department of Energy (DOE), the Environmental Protection Agency (EPA), and the Washington State Department of Ecology (Ecology) (Ecology et al. 2000). This agreement outlines the management of double shell tank (DST) waste as Toxic Substances Control Act (TSCA) PCB remediation waste based on a risk-based disposal approval option per Title 40 of the Code of Federal Regulations 761.61 (c). The agreement calls for ''Quantification of PCBs in DSTs, single shell tanks (SSTs), and incoming waste to ensure that the vitrification plant and other ancillary facilities PCB waste acceptance limits and the requirements of the anticipated risk-based disposal approval are met.'' Waste samples will be analyzed for PCBs to satisfy this requirement. This document describes the DQO process undertaken to assure appropriate data will be collected to support management of PCBs and is presented in a DQO format. The DQO process was implemented in accordance with the U.S. Environmental Protection Agency EPA QA/G-4, Guidance for the Data Quality Objectives Process (EPA 1994) and the Data Quality Objectives for Sampling and Analyses, HNF-IP-0842/Rev.1 A, Vol. IV, Section 4.16 (Banning 1999)

  12. Reproducibility of neuroimaging analyses across operating systems.

    Science.gov (United States)

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
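
    For reference, the Dice coefficient used above to compare subcortical classifications is 2·|A∩B| / (|A| + |B|); a minimal sketch with made-up voxel labels (not data from the study):

```python
def dice(a, b, label):
    """Dice overlap of the voxels carrying `label` in two segmentations."""
    va = {i for i, x in enumerate(a) if x == label}
    vb = {i for i, x in enumerate(b) if x == label}
    if not va and not vb:
        return 1.0
    return 2 * len(va & vb) / (len(va) + len(vb))

# Toy classifications of the same volume produced on two operating systems.
seg_os1 = [0, 1, 1, 1, 0, 1, 0, 0]
seg_os2 = [0, 1, 1, 0, 0, 1, 1, 0]
print(round(dice(seg_os1, seg_os2, 1), 2))   # 0.75; values below ~0.9 would flag drift
```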

  13. Severe Accident Recriticality Analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Hoejerup, F. [Risoe National Lab. (Denmark); Lindholm, I.; Miettinen, J.; Puska, E.K. [VTT Energy, Helsinki (Finland); Nilsson, Lars [Studsvik Eco and Safety AB, Nykoeping (Sweden); Sjoevall, H. [Teoliisuuden Voima Oy (Finland)

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B₄C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window unborated water from ECCS systems will start to reflood the partly control rod free core. Recriticality might take place for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both superprompt power bursts and quasi steady-state power generation - for the studied range of parameters, i. e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core the power densities are high which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding

  14. Severe accident recriticality analyses (SARA)

    Energy Technology Data Exchange (ETDEWEB)

    Frid, W. E-mail: wiktor.frid@ski.se; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Nilsson, L.; Puska, E.K.; Sjoevall, H

    2001-11-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies, the following issues have been investigated in the SARA project: (1) the energy deposition in the fuel during super-prompt power burst; (2) the quasi steady-state reactor power following the initial power burst; and (3) containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality--both super-prompt power bursts and quasi steady-state power generation--for the range of parameters studied, i.e. with core uncovering and heat-up to maximum core temperatures of approximately 1800 K, and water flow rates of 45-2000 kg s⁻¹ injected into the downcomer. Since recriticality takes place in a small fraction of the core, the power densities are high, which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal g⁻¹, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg s⁻¹. In most cases, however, the predicted energy deposition was smaller, below the regulatory limits for fuel failure, but close to or above recently observed thresholds for fragmentation and dispersion of high burn-up fuel. The highest calculated

  15. Severe accident recriticality analyses (SARA)

    International Nuclear Information System (INIS)

    Frid, W.; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Nilsson, L.; Puska, E.K.; Sjoevall, H.

    2001-01-01

    Recriticality in a BWR during reflooding of an overheated partly degraded core, i.e. with relocated control rods, has been studied for a total loss of electric power accident scenario. In order to assess the impact of recriticality on reactor safety, including accident management strategies, the following issues have been investigated in the SARA project: (1) the energy deposition in the fuel during super-prompt power burst; (2) the quasi steady-state reactor power following the initial power burst; and (3) containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality--both super-prompt power bursts and quasi steady-state power generation--for the range of parameters studied, i.e. with core uncovering and heat-up to maximum core temperatures of approximately 1800 K, and water flow rates of 45-2000 kg s⁻¹ injected into the downcomer. Since recriticality takes place in a small fraction of the core, the power densities are high, which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal g⁻¹, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding rate of 2000 kg s⁻¹. In most cases, however, the predicted energy deposition was smaller, below the regulatory limits for fuel failure, but close to or above recently observed thresholds for fragmentation and dispersion of high burn-up fuel. The highest calculated quasi steady

  16. Severe Accident Recriticality Analyses (SARA)

    International Nuclear Information System (INIS)

    Frid, W.; Hoejerup, F.; Lindholm, I.; Miettinen, J.; Puska, E.K.; Nilsson, Lars; Sjoevall, H.

    1999-11-01

    Recriticality in a BWR has been studied for a total loss of electric power accident scenario. In a BWR, the B₄C control rods would melt and relocate from the core before the fuel during core uncovery and heat-up. If electric power returns during this time-window unborated water from ECCS systems will start to reflood the partly control rod free core. Recriticality might take place for which the only mitigating mechanisms are the Doppler effect and void formation. In order to assess the impact of recriticality on reactor safety, including accident management measures, the following issues have been investigated in the SARA project: 1. the energy deposition in the fuel during super-prompt power burst, 2. the quasi steady-state reactor power following the initial power burst and 3. containment response to elevated quasi steady-state reactor power. The approach was to use three computer codes and to further develop and adapt them for the task. The codes were SIMULATE-3K, APROS and RECRIT. Recriticality analyses were carried out for a number of selected reflooding transients for the Oskarshamn 3 plant in Sweden with SIMULATE-3K and for the Olkiluoto 1 plant in Finland with all three codes. The core state initial and boundary conditions prior to recriticality have been studied with the severe accident codes SCDAP/RELAP5, MELCOR and MAAP4. The results of the analyses show that all three codes predict recriticality - both superprompt power bursts and quasi steady-state power generation - for the studied range of parameters, i. e. with core uncovery and heat-up to maximum core temperatures around 1800 K and water flow rates of 45 kg/s to 2000 kg/s injected into the downcomer. Since the recriticality takes place in a small fraction of the core the power densities are high which results in large energy deposition in the fuel during power burst in some accident scenarios. The highest value, 418 cal/g, was obtained with SIMULATE-3K for an Oskarshamn 3 case with reflooding

  17. Institutional aspects of NAMA development and implementation

    DEFF Research Database (Denmark)

    Hinostroza, Miriam L.; Sharma, Sudhir; Karavai, Maryna

    This publication analyses how developing countries may arrange their institutional and organizational structures or enhance the existing ones in order to deal with these new developments under the international climate change mitigation regime. Focus is on how to ensure the implementation of NAMAs...

  18. MGtoolkit: A python package for implementing metagraphs

    Science.gov (United States)

    Ranathunga, D.; Nguyen, H.; Roughan, M.

    In this paper we present MGtoolkit: an open-source Python package for implementing metagraphs - a first of its kind. Metagraphs are commonly used to specify and analyse business and computer-network policies alike. MGtoolkit can help verify such policies and promotes learning and experimentation with metagraphs. The package currently provides purely textual output for visualising metagraphs and their analysis results.
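
    MGtoolkit's actual API is not reproduced here; purely as an illustration of the metagraph idea the abstract refers to (edges connect sets of elements rather than individual vertices), a minimal, self-contained sketch with hypothetical policy atoms:

```python
class Metagraph:
    """Tiny metagraph: each edge maps a set of source elements to a set of targets."""

    def __init__(self):
        self.edges = []                            # list of (frozenset, frozenset)

    def add_edge(self, invertex, outvertex):
        self.edges.append((frozenset(invertex), frozenset(outvertex)))

    def reachable(self, source):
        """All elements derivable from `source` by repeatedly applying edges."""
        known = set(source)
        changed = True
        while changed:
            changed = False
            for inv, outv in self.edges:
                if inv <= known and not outv <= known:
                    known |= outv
                    changed = True
        return known

# Hypothetical network-policy atoms: user attributes imply permitted actions.
mg = Metagraph()
mg.add_edge({"staff", "on_vpn"}, {"read_wiki"})
mg.add_edge({"read_wiki"}, {"read_public_docs"})
print(sorted(mg.reachable({"staff", "on_vpn"})))
```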

  19. User participation in implementation

    DEFF Research Database (Denmark)

    Fleron, Benedicte; Rasmussen, Rasmus; Simonsen, Jesper

    2012-01-01

    Systems development has been claimed to benefit from user participation, yet user participation in implementation activities may be more common and is a growing focus of participatory-design work. We investigate the effect of the extensive user participation in the implementation of a clinical...... experienced more uncertainty and frustration than management and non-participating staff, especially concerning how to run an implementation process and how to understand and utilize the configuration possibilities of the system. This suggests that user participation in implementation introduces a need...

  20. Pawnee Nation Energy Option Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Matlock, M.; Kersey, K.; Riding In, C.

    2009-07-21

    Pawnee Nation of Oklahoma Energy Option Analyses In 2003, the Pawnee Nation leadership identified the need for the tribe to comprehensively address its energy issues. During a strategic energy planning workshop, a general framework was laid out and the Pawnee Nation Energy Task Force was created to work toward further development of the tribe’s energy vision. The overarching goals of the “first steps” project were to identify the most appropriate focus for its strategic energy initiatives going forward, and to provide information necessary to take the next steps in pursuit of the “best fit” energy options. Description of Activities Performed The research team reviewed existing data pertaining to the availability of biomass (focusing on woody biomass, agricultural biomass/bio-energy crops, and methane capture), solar, wind and hydropower resources on the Pawnee-owned lands. Using these data, combined with assumptions about costs and revenue streams, the research team performed preliminary feasibility assessments for each resource category. The research team also reviewed available funding resources and made recommendations to Pawnee Nation highlighting those resources with the greatest potential for financially-viable development, both in the near-term and over a longer time horizon. Findings and Recommendations Due to a lack of financial incentives for renewable energy, particularly at the state level, combined with mediocre renewable energy resources, renewable energy development opportunities are limited for Pawnee Nation. However, near-term potential exists for development of solar hot water at the gym, and an exterior wood-fired boiler system at the tribe’s main administrative building. Pawnee Nation should also explore options for developing LFGTE resources in collaboration with the City of Pawnee. Significant potential may also exist for development of bio-energy resources within the next decade. Pawnee Nation representatives should closely monitor

  1. Socioeconomic issues and analyses for radioactive waste disposal facilities

    International Nuclear Information System (INIS)

    Ulland, L.

    1988-01-01

    Radioactive Waste facility siting and development can raise major social and economic issues in the host area. Initial site screening and analyses have been conducted for both potential high-level and low-level radioactive waste facilities; more detailed characterization and analyses are being planned. Results of these assessments are key to developing community plans that identify and implement measures to mitigate adverse socioeconomic impacts. Preliminary impact analyses conducted at high-level sites in Texas and Nevada, and site screening activities for low-level facilities in Illinois and California have identified a number of common socioeconomic issues and characteristics as well as issues and characteristics that differ between the sites and the type of facilities. Based on these comparisons, implications for selection of an appropriate methodology for impact assessment and elements of impact mitigation are identified

  2. Implementation between Tradition and Management: Structuration and Styles of Implementation

    NARCIS (Netherlands)

    Terpstra, Jan; Havinga, Tetty

    2001-01-01

    This article presents a diachronic perspective for implementation research. It analyzes implementation practices in relation to their changing institutional context. Therefore, a comparison is made between different styles of implementation. The relationship between implementation practices and

  3. Environmental protection Implementation Plan

    International Nuclear Information System (INIS)

    Holland, R. C.

    1999-01-01

    This ''Environmental Protection Implementation Plan'' is intended to ensure that the environmental program objectives of Department of Energy Order 5400.1 are achieved at SNL/California. This document states SNL/California's commitment to conduct its operations in an environmentally safe and responsible manner. The ''Environmental Protection Implementation Plan'' helps management and staff comply with applicable environmental responsibilities

  4. Energy. Policy and Implementation

    International Nuclear Information System (INIS)

    Stroop, A.

    2006-01-01

    Why does the government have an energy policy? What form does it take? Who is involved in implementing that policy? These and similar questions are answered in the latest Energy Report. The Dutch Ministry of Economic Affairs (EZ) argues that the objectives are feasible as long as the energy policies are matched by suitable implementation measures

  5. Determining the predictors of innovation implementation in healthcare: a quantitative analysis of implementation effectiveness.

    Science.gov (United States)

    Jacobs, Sara R; Weiner, Bryan J; Reeve, Bryce B; Hofmann, David A; Christian, Michael; Weinberger, Morris

    2015-01-22

    The failure rates for implementing complex innovations in healthcare organizations are high. Estimates range from 30% to 90% depending on the scope of the organizational change involved, the definition of failure, and the criteria to judge it. The innovation implementation framework offers a promising approach to examine the organizational factors that determine effective implementation. To date, the utility of this framework in a healthcare setting has been limited to qualitative studies and/or group level analyses. Therefore, the goal of this study was to quantitatively examine this framework among individual participants in the National Cancer Institute's Community Clinical Oncology Program using structural equation modeling. We examined the innovation implementation framework using structural equation modeling (SEM) among 481 physician participants in the National Cancer Institute's Community Clinical Oncology Program (CCOP). The data sources included the CCOP Annual Progress Reports, surveys of CCOP physician participants and administrators, and the American Medical Association Physician Masterfile. Overall the final model fit well. Our results demonstrated that not only did perceptions of implementation climate have a statistically significant direct effect on implementation effectiveness, but physicians' perceptions of implementation climate also mediated the relationship between organizational implementation policies and practices (IPP) and enrollment (p innovation implementation framework between IPP, implementation climate, and implementation effectiveness among individual physicians. This finding is important, as although the model has been discussed within healthcare organizations before, the studies have been predominately qualitative in nature and/or at the organizational level. In addition, our findings have practical applications. Managers looking to increase implementation effectiveness of an innovation should focus on creating an environment that
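
    The mediation structure described (implementation policies and practices acting on effectiveness through implementation climate) could be sketched with a generic SEM package such as semopy; the variable names and data file below are hypothetical placeholders, not the CCOP data:

        # Hedged sketch of a simple mediation SEM; column names and file are hypothetical.
        import pandas as pd
        import semopy

        desc = """
        climate ~ ipp
        effectiveness ~ climate + ipp
        """

        data = pd.read_csv("ccop_like_example.csv")  # hypothetical data with ipp, climate, effectiveness columns
        model = semopy.Model(desc)
        model.fit(data)
        print(model.inspect())  # path estimates; the indirect effect is ipp->climate times climate->effectiveness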

  6. Future Perspectives of the Implementation of EU Urban Agenda

    Directory of Open Access Journals (Sweden)

    Olejnik Aleksandra

    2017-06-01

    Full Text Available This article is an overview of opinions and recommendations adopted in the European Union vis-à-vis urban policy. The author analyses the Pact of Amsterdam and future perspectives of the implementation of EU Urban Agenda.

  7. What Do You Recommend? Implementation and Analyses of Collaborative Information Filtering of Web Resources for Education.

    Science.gov (United States)

    Recker, Mimi M.; Walker, Andrew; Lawless, Kimberly

    2003-01-01

    Examines results from one pilot study and two empirical studies of a collaborative filtering system applied in higher education settings. Explains the use of collaborative filtering in electronic commerce and suggests it can be adapted to education to help find useful Web resources and to bring people together with similar interests and beliefs.…
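
    A minimal sketch of the underlying idea, assuming an invented user-by-resource ratings matrix, is a similarity-weighted prediction of unseen ratings:

        # Minimal user-based collaborative filtering sketch; the ratings are invented.
        import numpy as np

        # Rows = users, columns = web resources; 0 means "not yet rated".
        ratings = np.array([
            [5, 4, 0, 1],
            [4, 5, 1, 0],
            [1, 0, 5, 4],
        ], dtype=float)

        def cosine(u, v):
            denom = np.linalg.norm(u) * np.linalg.norm(v)
            return u @ v / denom if denom else 0.0

        def predict(user, item):
            """Predict a rating as a similarity-weighted average of other users' ratings."""
            sims, vals = [], []
            for other in range(ratings.shape[0]):
                if other != user and ratings[other, item] > 0:
                    sims.append(cosine(ratings[user], ratings[other]))
                    vals.append(ratings[other, item])
            return np.average(vals, weights=sims) if sims else 0.0

        print(predict(user=0, item=2))  # recommend resource 2 to user 0 if the prediction is high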

  8. An implementation of multiple multipole method in the analyse of elliptical objects to enhance backscattering light

    Science.gov (United States)

    Jalali, T.

    2015-07-01

    In this paper, we present the modelling of dielectric elliptical shapes with respect to a highly confined power distribution in the resulting nanojet, parameterized according to the beam waist and its beam divergence. The method uses spherical Bessel functions as basis functions, adapted to the standard multiple multipole method. It can handle elliptically shaped particles of varying size and refractive index, which have been studied under plane-wave illumination with the two- and three-dimensional multiple multipole method. Because of its fast and good convergence, the results obtained from the simulation are highly accurate and reliable. The simulation time is less than a minute in both two and three dimensions. Therefore, the proposed method is found to be computationally efficient, fast and accurate.
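
    The record gives no code, but evaluating the spherical Bessel basis functions it mentions is straightforward with SciPy; the wavenumber and radial range in this sketch are arbitrary:

        # Sketch: evaluate spherical Bessel functions j_n(kr) of the kind used as basis functions.
        import numpy as np
        from scipy.special import spherical_jn

        k = 2 * np.pi / 0.5            # arbitrary wavenumber (wavelength 0.5, units illustrative)
        r = np.linspace(0.01, 1.0, 200)
        for n in (0, 1, 2):
            jn = spherical_jn(n, k * r)
            print(f"order {n}: max |j_n| over the radial range = {np.abs(jn).max():.3f}")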

  9. International Implementation of Best Practices for Mitigating Insider Threat: Analyses for India and Germany

    Science.gov (United States)

    2014-04-01

    Similar to some U.S. data breach law, the IT Act allows a negligent organization to be found liable for failing to take reasonable security... notification requirement for cybersecurity breaches. Data breach notification is already required for significant breaches of sensitive data...

  10. Implementation of particle analysers in the detection and description of biofilm formation

    Czech Academy of Sciences Publication Activity Database

    Kadlec, Robert; Plocková, Jana; Růžička, F.; Holá, V.

    2004-01-01

    Roč. 10, supl3 (2004), s. 102 ISSN 1198-743X. [14th ECCMID. European Congress of Clinical Microbiology and Infectious Diseases /14./. Praha, 01.05.2004-04.05.2004] Institutional research plan: CEZ:AV0Z4031919 Keywords: bacterial biofilm * particle size distribution * microparticles Subject RIV: EE - Microbiology, Virology Impact factor: 2.361, year: 2004

  11. Improving word coverage using unsupervised morphological analyser

    Indian Academy of Sciences (India)

    To enable a computer to process information in human languages, ... an unsupervised morphological analyser (UMA) would learn how to analyse a language just by looking ... result for English, but they did remarkably worse for Finnish and Turkish.

  12. Techniques for Analysing Problems in Engineering Projects

    DEFF Research Database (Denmark)

    Thorsteinsson, Uffe

    1998-01-01

    Description of how a CPM network can be used for analysing complex problems in engineering projects.
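
    A minimal sketch of the CPM idea, on an invented four-activity network, computes earliest and latest times and reads off the critical path:

        # Critical Path Method sketch on a toy activity network; data are invented.
        activities = {            # name: (duration, predecessors)
            "A": (3, []),
            "B": (2, ["A"]),
            "C": (4, ["A"]),
            "D": (1, ["B", "C"]),
        }

        # Forward pass: earliest start/finish.
        ES, EF = {}, {}
        for name in activities:               # dict order already respects this toy topology
            dur, preds = activities[name]
            ES[name] = max((EF[p] for p in preds), default=0)
            EF[name] = ES[name] + dur

        project_end = max(EF.values())

        # Backward pass: latest start/finish.
        LF, LS = {}, {}
        for name in reversed(list(activities)):
            dur, _ = activities[name]
            succs = [s for s, (_, ps) in activities.items() if name in ps]
            LF[name] = min((LS[s] for s in succs), default=project_end)
            LS[name] = LF[name] - dur

        critical = [n for n in activities if ES[n] == LS[n]]   # zero-float activities
        print(project_end, critical)   # 8 ['A', 'C', 'D']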

  13. IMS IN SMES - REASONS, ADVANTAGES AND BARRIERS ON IMPLEMENTATION

    Directory of Open Access Journals (Sweden)

    Dragan Rajković

    2008-09-01

    Full Text Available The appearance of a number of management systems with various, and sometimes divergent, demands has prompted a review of the optimal strategy for implementing these standards in small and medium-sized enterprises (SMEs) and of the attempt to integrate them into an integrated management system. The first question raised concerns the choice of standards and the reasons for implementing them. Management and employees expect benefits from the implementation, and they work through and minimize the implementation barriers. The basic concept of an integrated management system (IMS) in SMEs and an analysis of the reasons for, advantages of and barriers to IMS implementation are presented in this paper.

  14. Implementing function spreadsheets

    DEFF Research Database (Denmark)

    Sestoft, Peter

    2008-01-01

    : that of turning an expression into a named function. Hence they proposed a way to define a function in terms of a worksheet with designated input and output cells; we shall call it a function sheet. The goal of our work is to develop implementations of function sheets and study their application to realistic...... examples. Therefore, we are also developing a simple yet comprehensive spreadsheet core implementation for experimentation with this technology. Here we report briefly on our experiments with function sheets as well as other uses of our spreadsheet core implementation....
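
    The notion of a function sheet, a worksheet with designated input and output cells that behaves like a named function, can be caricatured as follows; the cell names and formulas are invented and this is not the authors' spreadsheet core:

        # Rough sketch of a "function sheet": a worksheet whose designated input and
        # output cells make it callable like an ordinary function. Cell names invented.
        def make_sheet_function(formulas, input_cells, output_cell):
            def fn(*args):
                cells = dict(zip(input_cells, args))
                pending = dict(formulas)
                # naive evaluation: keep recomputing until the output cell has a value
                while output_cell not in cells:
                    for cell, f in list(pending.items()):
                        try:
                            cells[cell] = f(cells)
                            del pending[cell]
                        except KeyError:
                            pass   # depends on a cell not computed yet
                return cells[output_cell]
            return fn

        # Sheet computing a triangle's hypotenuse: inputs A1, A2; output B2.
        hypot = make_sheet_function(
            formulas={"B1": lambda c: c["A1"] ** 2 + c["A2"] ** 2,
                      "B2": lambda c: c["B1"] ** 0.5},
            input_cells=("A1", "A2"),
            output_cell="B2",
        )
        print(hypot(3, 4))  # 5.0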

  15. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic prog...
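
    The incrementalization idea (cache per-module analysis results and recompute only what a change invalidates) can be pictured with a plain memoization sketch in Python rather than tabled Prolog; the modules and the "analysis" are invented:

        # Toy sketch of incremental analysis: results are cached per module and
        # recomputed only for modules whose source or dependencies changed.
        analysis_cache = {}   # module -> (cache key, result)

        def analyse(module, source, deps):
            key = hash((source, tuple(sorted(deps.items()))))
            cached = analysis_cache.get(module)
            if cached and cached[0] == key:
                return cached[1]                                 # reuse previous result
            result = f"analysed({module}, {len(source)} chars)"  # stand-in for a real analysis
            analysis_cache[module] = (key, result)
            return result

        print(analyse("m1", "def f(): pass", {}))      # computed
        print(analyse("m1", "def f(): pass", {}))      # served from cache
        print(analyse("m1", "def f(): return 1", {}))  # recomputed after a change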

  16. Towards interoperable and reproducible QSAR analyses: Exchange of datasets.

    Science.gov (United States)

    Spjuth, Ola; Willighagen, Egon L; Guha, Rajarshi; Eklund, Martin; Wikberg, Jarl Es

    2010-06-30

    QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets and hence work collectively, but
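
    The record does not reproduce the QSAR-ML schema, so the element and attribute names below are hypothetical; the sketch only illustrates the general idea of a dataset whose descriptor entries carry an ontology reference and an implementation version:

        # Hypothetical XML in the spirit of an interoperable QSAR dataset; the tag and
        # attribute names are NOT the actual QSAR-ML schema, and the XLogP value is invented.
        import xml.etree.ElementTree as ET

        dataset = ET.Element("dataset", name="example-qsar-set")
        ET.SubElement(dataset, "structure", id="mol1", inchi="InChI=1S/CH4/h1H4")
        ET.SubElement(dataset, "descriptor",
                      ontologyRef="http://example.org/descriptor#XLogP",  # placeholder URI
                      implementation="cdk", version="1.2.3")
        value = ET.SubElement(dataset, "value", structure="mol1", descriptor="XLogP")
        value.text = "0.63"

        print(ET.tostring(dataset, encoding="unicode"))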

  17. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    Directory of Open Access Journals (Sweden)

    Spjuth Ola

    2010-06-01

    Full Text Available Abstract Background QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. Results We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. This makes it easy to join

  18. Implementing OWL Defaults

    National Research Council Canada - National Science Library

    Kolovski, Vladimir; Parsia, Bijan; Katz, Yarden

    2006-01-01

    ...) have often requested some form of non-monotonic reasoning. In this paper, we present preliminary optimizations and an implementation of a restricted version of Reiter's default logic as an extension to the description logic fragment of OWL, OWL DL...

  19. Connecting Architecture and Implementation

    Science.gov (United States)

    Buchgeher, Georg; Weinreich, Rainer

    Software architectures are still typically defined and described independently from implementation. To avoid architectural erosion and drift, architectural representation needs to be continuously updated and synchronized with system implementation. Existing approaches for architecture representation like informal architecture documentation, UML diagrams, and Architecture Description Languages (ADLs) provide only limited support for connecting architecture descriptions and implementations. Architecture management tools like Lattix, SonarJ, and Sotoarc and UML-tools tackle this problem by extracting architecture information directly from code. This approach works for low-level architectural abstractions like classes and interfaces in object-oriented systems but fails to support architectural abstractions not found in programming languages. In this paper we present an approach for linking and continuously synchronizing a formalized architecture representation to an implementation. The approach is a synthesis of functionality provided by code-centric architecture management and UML tools and higher-level architecture analysis approaches like ADLs.

  20. TQM implementation for the healthcare sector.

    Science.gov (United States)

    Chiarini, Andrea; Vagnoni, Emidia

    2017-07-03

    Purpose The purpose of this paper is to enlarge the debate on total quality management (TQM) implementation in the healthcare sector and to evaluate how and whether leadership can affect TQM implementation. Design/methodology/approach This paper is based on findings from a literature review of TQM and leadership. The authors analysed these findings to categorise causes of a lack of leadership in TQM programme implementations. Findings The authors propose three categories of causes of a lack of leadership in TQM programme implementation. The first cause is well-known: a lack of senior managers' involvement and commitment. The second category is the "combined leadership" that occurs in large healthcare organisations; and the third category is the influence of an external "political leadership" on public healthcare. Research limitations/implications This paper presents researchers with three categories of causes of failure of leadership in TQM implementation that can be investigated. It also encourages reflections from practitioners concerning TQM leadership in the healthcare sector. Practical implications The authors request that practitioners reflect on ways to create or sustain a "monolithic" leadership, especially in large organisations, to ensure a common vision, values and attitude for unitary TQM governance. Originality/value In an original way, this paper analyses and proposes three categories of causes linked to a lack of TQM leadership in the healthcare sector.

  1. Data governance implementation concept

    OpenAIRE

    Ullrichová, Jana

    2016-01-01

    This master's thesis discusses a concept for implementing data governance. The theoretical part of the thesis covers data governance: it explains why data are important for a company and describes definitions of data governance, its history, its components, its principles and processes, and how it fits into a company. The theoretical part is supplemented with examples of data governance failures and banking specifics. The main goal of this thesis is to create a concept for implementing data governance and its...

  2. EDMS implementation challenge.

    Science.gov (United States)

    De La Torre, Marta

    2002-08-01

    The challenges faced by facilities wishing to implement an electronic medical record system are complex and overwhelming. Issues such as customer acceptance, basic computer skills, and a thorough understanding of how the new system will impact work processes must be considered and acted upon. Acceptance and active support are necessary from Senior Administration and key departments to enable this project to achieve measurable success. This article details one hospital's "journey" through design and successful implementation of an electronic medical record system.

  3. Altools: a user friendly NGS data analyser.

    Science.gov (United States)

    Camiolo, Salvatore; Sablok, Gaurav; Porceddu, Andrea

    2016-02-17

    Genotyping by re-sequencing has become a standard approach to estimate single nucleotide polymorphism (SNP) diversity, haplotype structure and the biodiversity and has been defined as an efficient approach to address geographical population genomics of several model species. To access core SNPs and insertion/deletion polymorphisms (indels), and to infer the phyletic patterns of speciation, most such approaches map short reads to the reference genome. Variant calling is important to establish patterns of genome-wide association studies (GWAS) for quantitative trait loci (QTLs), and to determine the population and haplotype structure based on SNPs, thus allowing content-dependent trait and evolutionary analysis. Several tools have been developed to investigate such polymorphisms as well as more complex genomic rearrangements such as copy number variations, presence/absence variations and large deletions. The programs available for this purpose have different strengths (e.g. accuracy, sensitivity and specificity) and weaknesses (e.g. low computation speed, complex installation procedure and absence of a user-friendly interface). Here we introduce Altools, a software package that is easy to install and use, which allows the precise detection of polymorphisms and structural variations. Altools uses the BWA/SAMtools/VarScan pipeline to call SNPs and indels, and the dnaCopy algorithm to achieve genome segmentation according to local coverage differences in order to identify copy number variations. It also uses insert size information from the alignment of paired-end reads and detects potential large deletions. A double mapping approach (BWA/BLASTn) identifies precise breakpoints while ensuring rapid elaboration. Finally, Altools implements several processes that yield deeper insight into the genes affected by the detected polymorphisms. Altools was used to analyse both simulated and real next-generation sequencing (NGS) data and performed satisfactorily in terms of
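
    One of the signals mentioned, unusually large paired-end insert sizes hinting at a deletion, can be caricatured as follows; the read pairs and threshold are invented and this is not Altools' implementation:

        # Conceptual sketch: flag positions where observed paired-end insert sizes are
        # far above the library mean, hinting at a deletion. All numbers are invented.
        import statistics

        expected_insert = 350                 # illustrative library mean insert size (bp)
        threshold = expected_insert * 2

        # (leftmost mapping position, observed insert size) for some read pairs
        pairs = [(1000, 340), (1050, 355), (1100, 900), (1120, 880), (1500, 360)]

        suspect = [pos for pos, insert in pairs if insert > threshold]
        if suspect:
            print(f"possible deletion near positions {min(suspect)}-{max(suspect)} "
                  f"(median insert {statistics.median(i for _, i in pairs):.0f} bp)")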

  4. Integrated Field Analyses of Thermal Springs

    Science.gov (United States)

    Shervais, K.; Young, B.; Ponce-Zepeda, M. M.; Rosove, S.

    2011-12-01

    A group of undergraduate researchers through the SURE internship offered by the Southern California Earthquake Center (SCEC) have examined thermal springs in southern Idaho, northern Utah as well as mud volcanoes in the Salton Sea, California. We used an integrated approach to estimate the setting and maximum temperature, including water chemistry, iPad-based image and database management, microbiology, and gas analyses with a modified Giggenbach sampler. All springs were characterized using GISRoam (tmCogent3D). We are performing geothermometry calculations as well as comparisons with temperature gradient data on the results while also analyzing biological samples. Analyses include water temperature, pH, electrical conductivity, and TDS measured in the field. Each sample is sealed and chilled and delivered to a water lab within 12 hours. Temperatures are continuously monitored with the use of Solinst Levelogger Juniors. Through partnership with a local community college geology club, we receive results on a monthly basis and are able to process initial data earlier in order to evaluate data over a longer time span. The springs and mudpots contained microbial organisms which were analyzed using methods of single colony isolation, polymerase chain reaction, and DNA sequencing, showing the impact of the organisms on the springs or vice versa. Soon we will collect gas samples at sites that show signs of gas. These will be taken using a hybrid of the Giggenbach method and our own methods. Drawing gas samples has proven a challenge; however, we devised a method to draw out gas samples utilizing the Giggenbach flask, transferring samples to glass blood sample tubes, replacing NaOH in the Giggenbach flask, and evacuating it in the field for multiple samples using a vacuum pump. We also use a floating platform devised to carry and lower a levelogger, as well as an in-line fuel filter from a tractor, in order to keep mud from contaminating the equipment. The use of raster

  5. An Apple II -based bidimensional pulse height analyser

    International Nuclear Information System (INIS)

    Bateman, J.E.; Flesher, A.C.; Honeyman, R.N.; Pritchard, T.E.; Price, W.P.R.

    1984-06-01

    The implementation of a pulse height analyser function in an Apple II microcomputer using minimal purpose-built hardware is described. Except for a small interface module the system consists of two suites of software, one giving a conventional one dimensional analysis on a span of 1024 channels, and the other a two dimensional analysis on a 128 x 128 image format. Using the recently introduced ACCELERATOR coprocessor card the system performs with a dead time per event of less than 50 μs. Full software facilities are provided for display, storage and processing of the data using standard Applesoft BASIC. (author)

  6. Applications of MIDAS regression in analysing trends in water quality

    Science.gov (United States)

    Penev, Spiridon; Leonte, Daniela; Lazarov, Zdravetz; Mann, Rob A.

    2014-04-01

    We discuss novel statistical methods in analysing trends in water quality. Such analysis uses complex data sets of different classes of variables, including water quality, hydrological and meteorological. We analyse the effect of rainfall and flow on trends in water quality utilising a flexible model called Mixed Data Sampling (MIDAS). This model arises because of the mixed frequency in the data collection. Typically, water quality variables are sampled fortnightly, whereas the rain data is sampled daily. The advantage of using MIDAS regression is in the flexible and parsimonious modelling of the influence of the rain and flow on trends in water quality variables. We discuss the model and its implementation on a data set from the Shoalhaven Supply System and Catchments in the state of New South Wales, Australia. Information criteria indicate that MIDAS modelling improves upon simplistic approaches that do not utilise the mixed data sampling nature of the data.
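
    A common way to realise the MIDAS weighting, for example an exponential Almon polynomial aggregating daily rainfall onto a fortnightly sampling date, is sketched below with arbitrary parameter values:

        # Sketch of MIDAS-style aggregation with exponential Almon lag weights.
        import numpy as np

        def almon_weights(n_lags, theta1, theta2):
            k = np.arange(n_lags)
            w = np.exp(theta1 * k + theta2 * k ** 2)
            return w / w.sum()

        def midas_regressor(daily_rain, n_lags=14, theta1=0.1, theta2=-0.05):
            """Weighted sum of the last `n_lags` daily values, most recent first."""
            w = almon_weights(n_lags, theta1, theta2)
            recent = np.asarray(daily_rain[-n_lags:])[::-1]   # most recent day gets lag 0
            return float(w @ recent)

        rng = np.random.default_rng(0)
        daily_rain = rng.gamma(shape=0.8, scale=5.0, size=60)   # synthetic daily rainfall (mm)
        x = midas_regressor(daily_rain)
        print(f"MIDAS-aggregated rainfall regressor: {x:.2f} mm")
        # In the full model, x enters a regression for the fortnightly water-quality
        # variable, and theta1/theta2 are estimated jointly with the coefficients.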

  7. Fracture analyses of WWER reactor pressure vessels

    International Nuclear Information System (INIS)

    Sievers, J.; Liu, X.

    1997-01-01

    In the paper first the methodology of fracture assessment based on finite element (FE) calculations is described and compared with simplified methods. The FE based methodology was verified by analyses of large scale thermal shock experiments in the framework of the international comparative study FALSIRE (Fracture Analyses of Large Scale Experiments) organized by GRS and ORNL. Furthermore, selected results from fracture analyses of different WWER type RPVs with postulated cracks under different loading transients are presented. 11 refs, 13 figs, 1 tab

  8. Fracture analyses of WWER reactor pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Sievers, J; Liu, X [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koeln (Germany)

    1997-09-01

    In the paper first the methodology of fracture assessment based on finite element (FE) calculations is described and compared with simplified methods. The FE based methodology was verified by analyses of large scale thermal shock experiments in the framework of the international comparative study FALSIRE (Fracture Analyses of Large Scale Experiments) organized by GRS and ORNL. Furthermore, selected results from fracture analyses of different WWER type RPVs with postulated cracks under different loading transients are presented. 11 refs, 13 figs, 1 tab.

  9. Program Implementation Plan

    International Nuclear Information System (INIS)

    1987-06-01

    The Program Implementation Plan (PIP) describes the US Department of Energy's (DOE's) current approaches for managing the permanent disposal of defense high-level waste (HLW), transuranic (TRU) waste, and low-level waste (LLW) from atomic energy defense activities. It documents the implementation of the HLW and TRU waste policies as stated in the Defense Waste Management Plan (DWMP) (DOE/DP-0015), dated June 1983, and also addresses the management of LLW. The narrative reflects both accomplishments and changes in the scope of activities. All cost tables and milestone schedules are current as of January 1987. The goals of the program, to provide safe processing and utilization, storage, and disposal of DOE radioactive waste and byproducts to support defense nuclear materials production activities, and to implement cost-effective improvements in all of its ongoing and planned activities, have not changed

  10. Protocol Implementation Generator

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.

    2010-01-01

    Users expect communication systems to guarantee, amongst others, privacy and integrity of their data. These can be ensured by using well-established protocols; the best protocol, however, is useless if not all parties involved in a communication have a correct implementation of the protocol and all ... necessary tools. In this paper, we present the Protocol Implementation Generator (PiG), a framework that can be used to add protocol generation to protocol negotiation, or to easily share and implement new protocols throughout a network. PiG enables the sharing, verification, and translation ... Generator framework based on the LySatool and a translator from the LySa language into C or Java.

  11. [Anne Arold. Kontrastive Analyse...] / Paul Alvre

    Index Scriptorium Estoniae

    Alvre, Paul, 1921-2008

    2001-01-01

    Review of: Arold, Anne. Kontrastive Analyse der Wortbildungsmuster im Deutschen und im Estnischen (am Beispiel der Aussehensadjektive). Tartu, 2000. (Dissertationes philologiae germanicae Universitatis Tartuensis)

  12. MGtoolkit: A python package for implementing metagraphs

    Directory of Open Access Journals (Sweden)

    D. Ranathunga

    2017-01-01

    Full Text Available In this paper we present MGtoolkit: an open-source Python package for implementing metagraphs - a first of its kind. Metagraphs are commonly used to specify and analyse business and computer-network policies alike. MGtoolkit can help verify such policies and promotes learning and experimentation with metagraphs. The package currently provides purely textual output for visualising metagraphs and their analysis results.

  13. Implementing Samba 4

    CERN Document Server

    Leal, Marcelo

    2014-01-01

    This book is an implementation tutorial covering step-by-step procedures, examples, and sample code, and has a practical approach to set up a Samba 4 Server as an Active Directory Domain Controller and also set up different Samba 4 server roles.This book is ideal for system administrators who are new to the Samba 4 software, and who are looking to get a good grounding in how to use Samba 4 to implement Active Directory Services. It's assumed that you will have some experience with general system administration, Active Directory, and GNU/Linux systems. Readers are expected to have some test mac

  14. Environmental protection implementation plan

    International Nuclear Information System (INIS)

    Holland, R.C.

    1998-03-01

    This Environmental Protection Implementation Plan is intended to ensure that the environmental program objectives of Department of Energy Order 5400.1 are achieved at SNL/California. This document states SNL/California's commitment to conduct its operations in an environmentally safe and responsible manner. The Environmental Protection Implementation Plan helps management and staff comply with applicable environmental responsibilities. SNL is committed to operating in full compliance with the letter and spirit of applicable environmental laws, regulations, and standards. Furthermore, SNL/California strives to go beyond compliance with legal requirements by making every effort practical to reduce impacts to the environment to levels as low as reasonably achievable

  15. Implementing the legislation

    International Nuclear Information System (INIS)

    Silverstrom, L.

    1982-01-01

    Leon Silverstrom explained how nuclear waste disposal legislation would be implemented. The legislation provides a framework that recognizes the tremendous number of views and opinions on the subject and provides a mechanism that will allow all these interests to be expressed before final decisions are reached. Implementing procedures are outlined for: (1) the final repository; (2) interim or last resort storage; (3) research and development; (4) the monitored retrievable storage phases. The whole process will involve: environmental assessments and licensing requirements for each phase; construction of a test and evaluation facility; provision for sharing information with the states and interested parties; and procedures for public hearings and state rejection of proposed sites

  16. The Implementation Leadership Scale (ILS): development of a brief measure of unit level implementation leadership.

    Science.gov (United States)

    Aarons, Gregory A; Ehrhart, Mark G; Farahnak, Lauren R

    2014-04-14

    In healthcare and allied healthcare settings, leadership that supports effective implementation of evidence-based practices (EBPs) is a critical concern. However, there are no empirically validated measures to assess implementation leadership. This paper describes the development, factor structure, and initial reliability and convergent and discriminant validity of a very brief measure of implementation leadership: the Implementation Leadership Scale (ILS). Participants were 459 mental health clinicians working in 93 different outpatient mental health programs in Southern California, USA. Initial item development was supported as part of two United States National Institutes of Health (NIH) studies focused on developing implementation leadership training and implementation measure development. Clinician work group/team-level data were randomly assigned to be utilized for an exploratory factor analysis (n = 229; k = 46 teams) or for a confirmatory factor analysis (n = 230; k = 47 teams). The confirmatory factor analysis controlled for the multilevel, nested data structure. Reliability and validity analyses were then conducted with the full sample. The exploratory factor analysis resulted in a 12-item scale with four subscales representing proactive leadership, knowledgeable leadership, supportive leadership, and perseverant leadership. Confirmatory factor analysis supported an a priori higher order factor structure with subscales contributing to a single higher order implementation leadership factor. The scale demonstrated excellent internal consistency reliability as well as convergent and discriminant validity. The ILS is a brief and efficient measure of unit level leadership for EBP implementation. The availability of the ILS will allow researchers to assess strategic leadership for implementation in order to advance understanding of leadership as a predictor of organizational context for implementation. The ILS also holds promise as a tool for

  17. The implementation leadership scale (ILS): development of a brief measure of unit level implementation leadership

    Science.gov (United States)

    2014-01-01

    Background In healthcare and allied healthcare settings, leadership that supports effective implementation of evidenced-based practices (EBPs) is a critical concern. However, there are no empirically validated measures to assess implementation leadership. This paper describes the development, factor structure, and initial reliability and convergent and discriminant validity of a very brief measure of implementation leadership: the Implementation Leadership Scale (ILS). Methods Participants were 459 mental health clinicians working in 93 different outpatient mental health programs in Southern California, USA. Initial item development was supported as part of a two United States National Institutes of Health (NIH) studies focused on developing implementation leadership training and implementation measure development. Clinician work group/team-level data were randomly assigned to be utilized for an exploratory factor analysis (n = 229; k = 46 teams) or for a confirmatory factor analysis (n = 230; k = 47 teams). The confirmatory factor analysis controlled for the multilevel, nested data structure. Reliability and validity analyses were then conducted with the full sample. Results The exploratory factor analysis resulted in a 12-item scale with four subscales representing proactive leadership, knowledgeable leadership, supportive leadership, and perseverant leadership. Confirmatory factor analysis supported an a priori higher order factor structure with subscales contributing to a single higher order implementation leadership factor. The scale demonstrated excellent internal consistency reliability as well as convergent and discriminant validity. Conclusions The ILS is a brief and efficient measure of unit level leadership for EBP implementation. The availability of the ILS will allow researchers to assess strategic leadership for implementation in order to advance understanding of leadership as a predictor of organizational context for implementation

  18. PROJECT IMPLEMENTATION IN ORGANISATIONS OF REPETITIVE ACTIVITIES

    Directory of Open Access Journals (Sweden)

    Marek WIRKUS

    2015-04-01

    Full Text Available The study presents the implementation of projects in organisations that achieve business objectives through the implementation of repetitive actions. Projects in these organisations are, on the one hand, treated as marginal activities, while the results of these projects have significant impact on the delivery of main processes, e.g. through the introduction of new products. Human capital and solutions in this field have an impact on the success of projects in these organisations, which is not always conducive to smooth implementation of projects. Conflict results from the nature of a project, which is a one-time and temporary process, so organisational solutions are also temporary. This influences the attitudes and commitment of the project contractors. The paper identifies and analyses factors which affect the success of the projects.

  19. Going above and beyond for implementation: the development and validity testing of the Implementation Citizenship Behavior Scale (ICBS).

    Science.gov (United States)

    Ehrhart, Mark G; Aarons, Gregory A; Farahnak, Lauren R

    2015-05-07

    In line with recent research on the role of the inner context of organizations in implementation effectiveness, this study extends research on organizational citizenship behavior (OCB) to the domain of evidence-based practice (EBP) implementation. OCB encompasses those behaviors that go beyond what is required for a given job that contribute to greater organizational effectiveness. The goal of this study was to develop and test a measure of implementation citizenship behavior (ICB) or those behaviors that employees perform that go above and beyond what is required in order to support EBP implementation. The primary participants were 68 supervisors from ten mental health agencies throughout California. Items measuring ICB were developed based on past research on OCB and in consultation with experts on EBP implementation in mental health settings. Supervisors rated 357 of their subordinates on ICB and implementation success. In addition, 292 of the subordinates provided data on self-rated performance, attitudes towards EBPs, work experience, and full-time status. The supervisor sample was randomly split, with half used for exploratory factor analyses and the other half for confirmatory factor analyses. The entire sample of supervisors and subordinates was utilized for analyses assessing the reliability and construct validity of the measure. Exploratory factor analyses supported the proposed two-factor structure of the Implementation Citizenship Behavior Scale (ICBS): (1) Helping Others and (2) Keeping Informed. Confirmatory factor analyses with the other half of the sample supported the factor structure. Additional analyses supported the reliability and construct validity for the ICBS. The ICBS is a pragmatic brief measure (six items) that captures critical behaviors employees perform to go above and beyond the call of duty to support EBP implementation, including helping their fellow employees on implementation-related activities and keeping informed about issues
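
    For illustration only (this is not the ICBS data), an exploratory factor analysis on synthetic item responses with a two-factor structure can be run as follows:

        # Illustrative exploratory factor analysis on synthetic item responses;
        # this is a sketch of the technique, not the authors' analysis.
        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(1)
        n = 300
        helping = rng.normal(size=n)    # latent "Helping Others"-like factor
        informed = rng.normal(size=n)   # latent "Keeping Informed"-like factor
        # six items: three loading on each latent factor, plus noise
        items = np.column_stack(
            [helping + 0.3 * rng.normal(size=n) for _ in range(3)]
            + [informed + 0.3 * rng.normal(size=n) for _ in range(3)]
        )

        # rotation='varimax' is available in recent scikit-learn versions
        fa = FactorAnalysis(n_components=2, rotation="varimax", random_state=0)
        fa.fit(items)
        print(np.round(fa.components_, 2))  # loadings: items built from the same factor load together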

  20. Implementation of lean leadership

    Directory of Open Access Journals (Sweden)

    Trenkner Małgorzata

    2016-12-01

    Full Text Available The Toyota case proves that lean leadership is of critical importance for the successful implementation and permanent functioning of Lean Production System. There is no ready formula for developing Toyota style lean leadership. However, one may gain inspiration from its experience.

  1. Distributed Energy Implementation Options

    Energy Technology Data Exchange (ETDEWEB)

    Shah, Chandralata N [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-13

    This presentation covers the options for implementing distributed energy projects. It distinguishes between options available for distributed energy that is government owned versus privately owned, with a focus on the privately owned options including Energy Savings Performance Contract Energy Sales Agreements (ESPC ESAs). The presentation covers the new ESPC ESA Toolkit and other Federal Energy Management Program resources.

  2. Implementing the NPT

    International Nuclear Information System (INIS)

    1971-01-01

    In his annual address to the General Assembly of the United Nations, in New York in early November, Dr. Eklund stressed his view that it is essential that the initial momentum in the implementation of the Treaty on the Non-Proliferation of Nuclear Weapons be regained; and he noted that there had been encouraging developments at the IAEA General Conference. (author)

  3. Implementing Modular A Levels.

    Science.gov (United States)

    Holding, Gordon

    This document, which is designed for curriculum managers at British further education (FE) colleges, presents basic information on the implementation and perceived benefits of the General Certificate of Education (GCE) modular A (Advanced) levels. The information was synthesized from a survey of 12 FE colleges that introduced the modular A levels…

  4. Outage scheduling and implementation

    International Nuclear Information System (INIS)

    Allison, J.E.; Segall, P.; Smith, R.R.

    1986-01-01

    Successful preparation and implementation of an outage schedule and completion of scheduled and emergent work within an identified critical path time frame is a result of careful coordination by Operations, Work Control, Maintenance, Engineering, Planning and Administration and others. At the Fast Flux Test Facility (FFTF) careful planning has been responsible for meeting all scheduled outage critical paths

  5. Mojave Base Station Implementation

    Science.gov (United States)

    Koscielski, C. G.

    1984-01-01

    A 12.2 meter diameter X-Y mount antenna was reconditioned for use by the crustal dynamic project as a fixed base station. System capabilities and characteristics and key performance parameters for subsystems are presented. The implementation is completed.

  6. Implementering af evidensbaseret praksis

    DEFF Research Database (Denmark)

    Scheel, Linda Schumann; Tewes, Marianne; Petersen, Preben Ulrich

    2015-01-01

    The project concerns the implementation of the clinical guideline on identification, prevention and treatment of delirium – in this context related to cardiac patients. The project takes place at two hospitals in the Heart Centre at Rigshospitalet, an intensive care unit and a ward, in close...

  7. Implementering af evidensbaseret praksis

    DEFF Research Database (Denmark)

    Scheel, Linda Schumann; Hansen, Ida Rode

    2014-01-01

    The project concerns the implementation of the clinical guideline on identification, prevention and treatment of delirium – in this context related to cardiac patients. The project takes place at two hospitals in the Heart Centre at Rigshospitalet, an intensive care unit and a ward, in close...

  8. EPBD implementation in Denmark

    DEFF Research Database (Denmark)

    Wittchen, Kim Bjarne; Thomsen, Kirsten Engelund; Malmsteen, Margit

    2013-01-01

    This report presents an overview of the current status of the implementation of the Directive on the Energy Performance of Buildings (EPBD) in Denmark, as well as plans for its evolution. It addresses the energy requirements, as well as the certification and inspection systems, including quality ...

  9. MONA Implementation Secrets

    DEFF Research Database (Denmark)

    Klarlund, Nils; Møller, Anders; Schwartzbach, Michael Ignatieff

    2002-01-01

    a period of six years. Compared to the first naive version, the present tool is faster by several orders of magnitude. This speedup is obtained from many different contributions working on all levels of the compilation and execution of formulas. We present a selection of implementation "secrets" that have...

  10. MONA Implementation Secrets

    DEFF Research Database (Denmark)

    Klarlund, Nils; Møller, Anders; Schwartzbach, Michael Ignatieff

    2001-01-01

    a period of six years. Compared to the first naive version, the present tool is faster by several orders of magnitude. This speedup is obtained from many different contributions working on all levels of the compilation and execution of formulas. We present a selection of implementation “secrets” that have...

  11. DATA WAREHOUSES SECURITY IMPLEMENTATION

    Directory of Open Access Journals (Sweden)

    Burtescu Emil

    2009-05-01

    Full Text Available Data warehouses were initially implemented and developed by the big firms and they were used for working out the managerial problems and for making decisions. Later on, because of the economic tendencies and of the technological progress, the data warehou

  12. Random error in cardiovascular meta-analyses

    DEFF Research Database (Denmark)

    Albalawi, Zaina; McAlister, Finlay A; Thorlund, Kristian

    2013-01-01

    BACKGROUND: Cochrane reviews are viewed as the gold standard in meta-analyses given their efforts to identify and limit systematic error which could cause spurious conclusions. The potential for random error to cause spurious conclusions in meta-analyses is less well appreciated. METHODS: We exam...

  13. Diversity of primary care systems analysed.

    NARCIS (Netherlands)

    Kringos, D.; Boerma, W.; Bourgueil, Y.; Cartier, T.; Dedeu, T.; Hasvold, T.; Hutchinson, A.; Lember, M.; Oleszczyk, M.; Pavlick, D.R.

    2015-01-01

    This chapter analyses differences between countries and explains why countries differ regarding the structure and process of primary care. The components of primary care strength that are used in the analyses are health policy-making, workforce development and in the care process itself (see Fig.

  14. Approximate analyses of inelastic effects in pipework

    International Nuclear Information System (INIS)

    Jobson, D.A.

    1983-01-01

    This presentation shows figures concerned with analyses of inelastic effects in pipework as follows: comparison of experimental and calculated (simplified-analysis) results for free end rotation and for circumferential strain; interrupted stress relaxation; regenerated relaxation caused by reversed yield; buckling of straight pipe under combined bending and torsion; results of fatigue tests of pipe bends

  15. Genome-wide DNA polymorphism analyses using VariScan

    Directory of Open Access Journals (Sweden)

    Vilella Albert J

    2006-09-01

    Full Text Available Abstract Background DNA sequence polymorphism analysis can provide valuable information on the evolutionary forces shaping nucleotide variation, and provides an insight into the functional significance of genomic regions. The recent ongoing genome projects will radically improve our capabilities to detect specific genomic regions shaped by natural selection. Currently available methods and software, however, are unsatisfactory for such genome-wide analysis. Results We have developed methods for the analysis of DNA sequence polymorphisms at the genome-wide scale. These methods, which have been tested on coalescent-simulated and actual data files from mouse and human, have been implemented in the VariScan software package version 2.0. Additionally, we have also incorporated a graphical user interface. The main features of this software are: (i) exhaustive population-genetic analyses including those based on the coalescent theory; (ii) analysis adapted to the shallow data generated by the high-throughput genome projects; (iii) use of genome annotations to conduct comprehensive analyses separately for different functional regions; (iv) identification of relevant genomic regions by the sliding-window and wavelet-multiresolution approaches; (v) visualization of the results integrated with current genome annotations in commonly available genome browsers. Conclusion VariScan is a powerful and flexible suite of software for the analysis of DNA polymorphisms. The current version implements new algorithms, methods, and capabilities, providing an important tool for an exhaustive exploratory analysis of genome-wide DNA polymorphism data.
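
    As a rough illustration of the sliding-window idea (not VariScan's implementation; the alignment and window size are invented), nucleotide diversity can be computed per window as the mean pairwise difference per site:

        # Sliding-window nucleotide diversity (pi) sketch; the alignment is invented.
        from itertools import combinations

        alignment = [
            "ACGTACGTACGTTTGCA",
            "ACGAACGTACGTTTGCA",
            "ACGTACGTACCTTTGAA",
        ]

        def pi(columns):
            """Mean pairwise difference per site over the given alignment slice."""
            pairs = list(combinations(columns, 2))
            diffs = sum(sum(a != b for a, b in zip(s1, s2)) for s1, s2 in pairs)
            return diffs / (len(pairs) * len(columns[0]))

        window, step = 8, 4
        for start in range(0, len(alignment[0]) - window + 1, step):
            segment = [seq[start:start + window] for seq in alignment]
            print(f"window {start}-{start + window}: pi = {pi(segment):.3f}")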

  16. Level II Ergonomic Analyses, Dover AFB, DE

    Science.gov (United States)

    1999-02-01

    Report IERA-RS-BR-TR-1999-0002, United States Air Force IERA: Level II Ergonomic Analyses, Dover AFB, DE, by Andrew Marcotte and Marilyn Joyce (The Joyce...). Contents fragments: 1.0 Introduction; 1.1 Purpose of the Level II Ergonomic Analyses; 1.2 Approach; 1.2.1 Initial Shop Selection and Administration of the...

  17. Cost-Benefit Analyses of Transportation Investments

    DEFF Research Database (Denmark)

    Næss, Petter

    2006-01-01

    This paper discusses the practice of cost-benefit analyses of transportation infrastructure investment projects from the meta-theoretical perspective of critical realism. Such analyses are based on a number of untenable ontological assumptions about social value, human nature and the natural......-to-pay investigations. Accepting the ontological and epistemological assumptions of cost-benefit analysis involves an implicit acceptance of the ethical and political values favoured by these assumptions. Cost-benefit analyses of transportation investment projects tend to neglect long-term environmental consequences...

  18. Orchestrating Lean Implementation

    DEFF Research Database (Denmark)

    Riis, Jens Ove; Mikkelsen, Hans; Andersen, Jesper Rank

    2008-01-01

    The notion of Lean Manufacturing is not merely confined to a set of well defined techniques, but represents a broad approach to managing a company. Working with lean entails many aspects, such as production planning and control, production engineering, product development, supply chain......, and organizational issues. To become effective, many functional areas and departments must be involved. At the same time companies are embedded in a dynamic environment. The aim of the paper is to propose a comprehensive approach to better implementation of lean initiatives, based on two empirical studies. The paper...... will discuss how a concerted effort can be staged taking into account the interdependencies among individual improvement initiatives. The notion of orchestration will be introduced, and several means for orchestration will be presented. Critical behavioral issues for lean implementation will be discussed....

  19. IT-implementering

    DEFF Research Database (Denmark)

    Kjeldsen, Lars Peter

    2003-01-01

    The purpose is to describe and examine how an IT implementation in an organisation can be supported by the development and learning principles that can be drawn from theories of the learning organisation. The empirical material is based on an action research project intended to observe and conceptualise communic...

  20. Caching Patterns and Implementation

    Directory of Open Access Journals (Sweden)

    Octavian Paul ROTARU

    2006-01-01

    Full Text Available Repetitious access to remote resources, usually data, constitutes a bottleneck for many software systems. Caching is a technique that can drastically improve the performance of any database application, by avoiding multiple read operations for the same data. This paper addresses the caching problems from a pattern perspective. Both caching and caching strategies, like primed and on demand, are presented as patterns, and a pattern-based flexible caching implementation is proposed. The Caching pattern provides a method for avoiding the reacquisition of expensive resources. The Primed Cache pattern is applied in situations in which the set of required resources, or at least a part of it, can be predicted, while the Demand Cache pattern is applied whenever the required resource set cannot be predicted or is unfeasible to buffer. The advantages and disadvantages of all the caching patterns presented are also discussed, and the lessons learned are applied in the implementation of the pattern-based flexible caching solution proposed.
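
    A minimal sketch of the two strategies named in the record, a demand (lazy) cache that fills on first access and a primed cache that pre-loads a predictable key set, with a simulated expensive lookup:

        # Sketch of Demand Cache vs Primed Cache; the expensive lookup is simulated.
        import time

        def expensive_lookup(key):
            time.sleep(0.01)             # stand-in for a database or remote call
            return f"value-for-{key}"

        class DemandCache:
            def __init__(self):
                self._store = {}
            def get(self, key):
                if key not in self._store:           # fetched only when first requested
                    self._store[key] = expensive_lookup(key)
                return self._store[key]

        class PrimedCache(DemandCache):
            def __init__(self, predictable_keys):
                super().__init__()
                for key in predictable_keys:         # pre-load the resources we expect to need
                    self._store[key] = expensive_lookup(key)

        demand = DemandCache()
        primed = PrimedCache(["config", "user-roles"])
        print(demand.get("config"))     # pays the lookup cost on first access
        print(primed.get("config"))     # already resident, no extra lookup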

  1. The challenge of implementation

    DEFF Research Database (Denmark)

    Andersen, Karen Heide Hauge

    2016-01-01

    ... these concepts in daily teaching, as it is strongly encouraged by policy makers and educations. This paper aims to discuss how lecturers experience the challenge of teaching their own discipline while being imposed to embrace and promote innovation and entrepreneurship teaching. Through a single study case... of the BATCoM education at VIA University College, Denmark, the paper shows that the knowledge, use and implementation of the concepts is far from anchored in the lecturers’ daily practices. Through qualitative interviews the paper highlights different aspects considered, to determine the research question... of the paper: which factors influence the degree of implementation of innovation and entrepreneurship in the individual lecturers’ daily teaching? The paper questions the common approach taken by higher educational institutions whereby lecturers are urged to teach innovation and entrepreneurship with minor...

  2. Radiological control implementation guide

    International Nuclear Information System (INIS)

    Hamley, S.A.

    1993-01-01

    A manual is being developed to explain to line managers how radiological controls are designed and implemented. The manual also fills a gap in the Health Physics literature between textbooks and on-the-floor procedures. It may be helpful to new Health Physicists with little practical experience and to those wishing to improve self-assessment, audit, and appraisal processes. Many audits, appraisals, and evaluations have indicated a need for cultural change, increased vigor and example, and more effective oversight by line management. Inadequate work controls are a frequent and recurring problem identified in occurrence reports and accident investigations. Closer study frequently indicates that many line managers are willing to change and want to achieve excellence, but no effective guidance exists that will enable them to understand and implement a modern radiological control program

  3. Safeguards Implementation at KAERI

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Juang; Lee, Sung Ho; Lee, Byung-Doo; Kim, Hyun-Sook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    The main objective of the safeguards implementation activities is to assure that there are no diversions of declared nuclear material and/or no undeclared activity. The purpose of these activities is to assist facility operators in meeting the safeguards criteria set forth by the Atomic Energy Safety Acts and Regulations. In addition, the nuclear material and technology control team has acted as a contact point for domestic and international safeguards inspection activities and for the relevant safeguards cooperation. Domestic inspections were successfully carried out at the KAERI nuclear facilities pursuant to the domestic laws and regulations in parallel with the IAEA safeguards inspections. It is expected that safeguards work will increase due to the pyro-related facilities such as PRIDE, ACPF and DUPIC, for which the IAEA is making an effort to establish a safeguards approach. KAERI will actively cope with the plan of the NSSC by changing its domestic inspection regulations on the accounting and control of nuclear materials.

  4. Clinical implementation of pharmacogenetics.

    Science.gov (United States)

    García-González, Xandra; Cabaleiro, Teresa; Herrero, María José; McLeod, Howard; López-Fernández, Luis A

    2016-03-01

    In the last decade, pharmacogenetic research has been performed in different fields. However, the application of pharmacogenetic findings to clinical practice has not been as fast as desirable. The current situation of clinical implementation of pharmacogenetics is discussed. This review focuses on the advances of pharmacogenomics to individualize cancer treatments, the relationship between pharmacogenetics and pharmacodynamics in the clinical course of transplant patients receiving a combination of immunosuppressive therapy, the needs and barriers facing pharmacogenetic clinical application, and the situation of pharmacogenetic testing in Spain. It is based on lectures presented by speakers of the Clinical Implementation of Pharmacogenetics Symposium at the VII Conference of the Spanish Pharmacogenetics and Pharmacogenomics Society, held on April 20, 2015.

  5. "Implementation and Social Influence"

    OpenAIRE

    Hitoshi Matsushima

    2008-01-01

    This paper incorporates social psychology into implementation theory. Real individuals care not only about their material benefits but also about their social influence in terms of obedience and conformity. Using a continuous time horizon, we demonstrate a method of manipulating the decision-making process, according to which, an uninformed principal utilizes her/his power of social influence to incentivize multiple informed agents to make honest announcements. Following this method, we show ...

  6. Implementing Sustainable Development

    OpenAIRE

    Rydin, Y.

    2002-01-01

    This paper highlights the scope for making progress towards sustainable development through changes in current practices and decision-making processes that do not need international agreements. It outlines seven key areas for improving implementation, including: using monitoring and evaluation (and the information these produce) to change attitudes and behaviour; participation that involves the public constructively; better use of “soft” instruments of persuasion and communication; and ensuri...

  7. Leading change: 3--implementation.

    Science.gov (United States)

    Kerridge, Joanna

    The potential for all staff to contribute to service improvement, irrespective of discipline, role or function, is outlined in the 2011 NHS leadership framework. This advocates developing the skills of the entire workforce to create a climate of continuous service improvement. As nurses are often required to take the lead in managing change in clinical practice, this final article in a three-part series focuses on implementing and reviewing change.

  8. Reframing implementation as an organisational behaviour problem.

    Science.gov (United States)

    Clay-Williams, Robyn; Braithwaite, Jeffrey

    2015-01-01

    The purpose of this paper is to report on a process evaluation of a randomised controlled trial (RCT) intervention study that tested the effectiveness of classroom- and simulation-based crew resource management courses, alone and in combination, and identifies organisational barriers and facilitators to implementation of team training programmes in healthcare. The RCT design consisted of a before and after study with a team training intervention. Quantitative data were gathered on utility and affective reactions to training, and on teamwork knowledge, attitudes, and behaviours of the learners. A sample of participants was interviewed at the conclusion of the study. Interview responses were analysed, alongside qualitative elements of the classroom course critique, to search for evidence, context, and facilitation clues to the implementation process. The RCT method provided scientifically robust data that supported the benefits of classroom training. Qualitative data identified a number of facilitators to implementation of team training, and shed light on some of the ways that learning was diffused throughout the organisation. Barriers to successful implementation were also identified, including hospital time and resource constraints and poor organisational communication. Quantitative randomised methods have intermittently been used to evaluate team training interventions in healthcare. Despite two decades of team training trials, however, the authors do not know as well as the authors would like what goes on inside the "black box" of such RCTs. While results are usually centred on outcomes, this study also provides insight into the context and mechanisms associated with those outcomes and identifies barriers and facilitators to successful intervention implementation.

  9. Comparison with Russian analyses of meteor impact

    Energy Technology Data Exchange (ETDEWEB)

    Canavan, G.H.

    1997-06-01

    The inversion model for meteor impacts is used to discuss Russian analyses and compare principal results. For common input parameters, the models produce consistent estimates of impactor parameters. Directions for future research are discussed and prioritized.

  10. 7 CFR 94.102 - Analyses available.

    Science.gov (United States)

    2010-01-01

    ... analyses for total ash, fat by acid hydrolysis, moisture, salt, protein, beta-carotene, catalase... glycol, SLS, and zeolex. There are also tests for starch, total sugars, sugar profile, whey, standard...

  11. Anthocyanin analyses of Vaccinium fruit dietary supplements

    Science.gov (United States)

    Vaccinium fruit ingredients within dietary supplements were identified by comparisons with anthocyanin analyses of known Vaccinium profiles (demonstration of anthocyanin fingerprinting). Available Vaccinium supplements were purchased and analyzed; their anthocyanin profiles (based on HPLC separation...

  12. Analyse of Maintenance Cost in ST

    CERN Document Server

    Jenssen, B W

    2001-01-01

    An analysis has been carried out in ST concerning the total costs for the division. Even though the target was the maintenance costs in ST, the global budget overall has been analysed. This has been done since there is a close relation between investments & consolidation and the required level of maintenance. The purpose of the analysis was to focus on maintenance cost in ST as a ratio of total maintenance costs over the replacement value of the equipment, and to make some comparisons with other industries and laboratories. Families of equipment have been defined and their corresponding ratios calculated. This first approach gives us some "quantitative" measurements. This analysis should be combined with performance indicators (more "qualitative" measurements) that tell us how well we are performing. This will help us defend our budget, make better priorities, and satisfy the requirements from our external auditors.
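
    A hedged numerical sketch of the ratio described above, maintenance cost divided by the replacement value of each equipment family; the family names and figures are hypothetical, not taken from the ST analysis.

    # Hypothetical equipment families: (annual maintenance cost, replacement value), same currency.
    families = {
        "cooling and ventilation": (120_000, 4_000_000),
        "cranes and handling":     ( 45_000, 1_500_000),
        "electrical distribution": (300_000, 9_000_000),
    }

    for name, (maintenance, replacement) in families.items():
        ratio = maintenance / replacement
        print(f"{name:25s} maintenance/replacement = {ratio:.1%}")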

  13. Expectations from implementers

    International Nuclear Information System (INIS)

    Biurrun, E.; Zuidema, P.

    2008-01-01

    Enrique Biurrun (DBE) presented the expectations from the implementer. He explained that the implementer needs a framework to successfully develop a repository which means the definition of requirements and guidance (for repository system development, analysis, licences, etc.) as well as the decision-making process (stepwise approach, roles of different players, etc.). He also needs a reasonable stability of the regulatory system. The regulatory framework should be developed in a clear, reasonable and consistent manner. In the context of the long duration of the project (100 years) there will be technological progress. In that context E. Biurrun asked what is the meaning of best practice. How can one deal with judgmental issues in a step-wise approach? Regulatory criteria and guidance must deal with the repository system for which an iterative process is necessary where dialogue is needed with the regulator despite the need to maintain his independence. The safety case, which is a periodic documentation of the status of the project, must provide a synthesis of the underlying scientific understanding and evidence and becomes part of the design process through feedback. E. Biurrun pointed out that safety is not calculated or assessed, but designed and built into the repository system (by geological and engineered barriers). He stressed the importance of the operational aspects since the implementer has to build and operate the repository safely. He asked the question: is it 'ethical' to buy 'peace of mind' of some stakeholders with casualties of the implementer's staff because of mining accidents if the repository is left open during a phase of reversibility. The implementer needs dependable criteria, legal security and investment security. He interpreted the 'Precautionary principle' as meaning 'do it now'. Long-lasting solutions are very uncertain. Will we have the money and the technology to do it later? He made some reflections regarding the ethical need to

  14. Analysing policy interactions for promoting energy efficiency in the Hellenic sectors of buildings and transport

    OpenAIRE

    Dr. Popi KONIDARI; Mrs. Anna FLESSA; Ms. Aliki-Nefeli MAVRAKI; Ms. Eleni-Danai MAVRAKI

    2016-01-01

    Policy interactions are important parameters for the successful implementation of policies, measures and policy instruments. The parallel implementation of a number of policy instruments has the potential to create synergies or conflicts that maximize or prevent the achievement of their anticipated outcomes. This paper analyses three cases of policy interactions between two policy instruments for further promoting energy efficiency outcomes in Greece in two sectors, buildings and trans...

  15. A History of Rotorcraft Comprehensive Analyses

    Science.gov (United States)

    Johnson, Wayne

    2013-01-01

    A history of the development of rotorcraft comprehensive analyses is presented. Comprehensive analyses are digital computer programs that calculate the aeromechanical behavior of the rotor and aircraft, bringing together the most advanced models of the geometry, structure, dynamics, and aerodynamics available in rotary wing technology. The development of the major codes of the last five decades from industry, government, and universities is described. A number of common themes observed in this history are discussed.

  16. Safety analyses for reprocessing and waste processing

    International Nuclear Information System (INIS)

    1983-03-01

    Presentation of an incident analysis of process steps of the RP, simplified considerations concerning safety, and safety analyses of the storage and solidification facilities of the RP. A release tree method is developed and tested. An incident analysis of process steps, the evaluation of the SRL-study and safety analyses of the storage and solidification facilities of the RP are performed in particular. (DG) [de

  17. Risk analyses of nuclear power plants

    International Nuclear Information System (INIS)

    Jehee, J.N.T.; Seebregts, A.J.

    1991-02-01

    Probabilistic risk analyses of nuclear power plants are carried out by systematically analyzing the possible consequences of a broad spectrum of causes of accidents. The risk can be expressed in the probabilities for meltdown, radioactive releases, or harmful effects for the environment. Following risk policies for chemical installations as expressed in the mandatory nature of External Safety Reports (EVRs) or, e.g., the publication "How to deal with risks", probabilistic risk analyses are required for nuclear power plants.

  18. GIS-based Approaches to Catchment Area Analyses of Mass Transit

    DEFF Research Database (Denmark)

    Andersen, Jonas Lohmann Elkjær; Landex, Alex

    2009-01-01

    Catchment area analyses of stops or stations are used to investigate the potential number of travelers to public transportation. These analyses are considered a strong decision tool in the planning process of mass transit, especially railroads. Catchment area analyses are GIS-based buffer and overlay analyses with different approaches depending on the desired level of detail. A simple but straightforward approach to implement is the Circular Buffer Approach, where catchment areas are circular. A more detailed approach is the Service Area Approach, where catchment areas are determined by a street network search to simulate the actual walking distances. A refinement of the Service Area Approach is to implement additional time resistance in the network search to simulate obstacles in the walking environment. This paper reviews and compares the different GIS-based catchment area approaches, their level...
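
    A minimal sketch of the two approaches, assuming projected coordinates in metres and using shapely and networkx; the stop location, walking distance and toy street graph are hypothetical placeholders for real station and street data.

    from shapely.geometry import Point
    import networkx as nx

    stop = Point(725_000, 6_175_000)      # station location in a projected CRS (metres)
    radius = 600.0                        # assumed maximum walking distance

    # Circular Buffer Approach: the catchment is a simple Euclidean circle around the stop.
    circular_catchment = stop.buffer(radius)
    print(round(circular_catchment.area))              # roughly pi * 600**2 square metres

    # Service Area Approach: the catchment follows the street network.
    streets = nx.Graph()                  # build from real street data in practice
    streets.add_edge("stop", "a", length=250.0)
    streets.add_edge("a", "b", length=300.0)
    streets.add_edge("b", "c", length=400.0)

    reachable = nx.single_source_dijkstra_path_length(
        streets, "stop", cutoff=radius, weight="length")
    print(sorted(reachable))              # nodes within 600 m of walking, e.g. ['a', 'b', 'stop']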

  19. Priming a Pilot Implementation

    DEFF Research Database (Denmark)

    Hansen, Magnus; Ie Pedersen, Maria

    Abstract. We report on the initial findings of an action research study about effects specifications. It is part of a larger IS pilot implementation project conducted in the Danish healthcare sector. Through interviews and a workshop we have identified and specified the main effects that comprise the basis for the evaluation of the project. The study indicates that cross-organisational effects specifications cause a significant number of effects. To further prioritize these we argue that both interview and workshop must be facilitated as mutual learning processes between interviewer and interviewee.

  20. Vicious and virtuous cycles in ERP implementation : a case study of interrelations between critical success factors

    NARCIS (Netherlands)

    Akkermans, H.A.; Helden, van K.

    2002-01-01

    ERP implementations are complex undertakings. Recent research has provided us with plausible critical success factors (CSFs) for such implementations. This article describes how one list of CSFs (Somers & Nelson, 2001) was used to analyse and explain project performance in one ERP implementation in

  1. Analysing Feature Model Changes using FMDiff

    NARCIS (Netherlands)

    Dintzner, N.J.R.; Van Deursen, A.; Pinzger, M.

    2015-01-01

    Evolving a large-scale, highly variable system is a challenging task. For such a system, evolution operations often require consistent updates to both the implementation and the feature model. In this context, the evolution of the feature model closely follows the evolution of the system.

  2. Mass separated neutral particle energy analyser

    International Nuclear Information System (INIS)

    Takeuchi, Hiroshi; Matsuda, Toshiaki; Miura, Yukitoshi; Shiho, Makoto; Maeda, Hikosuke; Hashimoto, Kiyoshi; Hayashi, Kazuo.

    1983-09-01

    A mass separated neutral particle energy analyser which could simultaneously measure hydrogen and deuterium atoms emitted from a tokamak plasma was constructed. The analyser was calibrated for energy and mass separation in the energy range from 0.4 keV to 9 keV. In order to investigate the behavior of deuterons and protons in the JFT-2 tokamak plasma heated with ion cyclotron waves and neutral beam injection, this analyser was installed on the JFT-2 tokamak. It was found that the energy spectrum could be determined with sufficient accuracy. The ion temperature and the ratio of deuteron and proton density obtained from the energy spectrum were in good agreement with the values deduced from the Doppler broadening of the TiXIV line and the line intensities of Hα and Dα, respectively. (author)

  3. Thermal and stress analyses with ANSYS program

    International Nuclear Information System (INIS)

    Kanoo, Iwao; Kawaguchi, Osamu; Asakura, Junichi.

    1975-03-01

    Some analyses of the heat conduction and elastic/inelastic stresses, carried out in Power Reactor and Nuclear Fuel Development Corporation (PNC) in fiscal 1973 using ANSYS (Engineering Analysis System) program, are summarized. In chapter I, the present state of structural analysis programs available for a FBR (fast breeder reactor) in PNC is explained. Chapter II is a brief description of the ANSYS current status. In chapter III are presented 8 examples of the steady-state and transient thermal analyses for fast-reactor plant components, and in chapter IV 5 examples of the inelastic structural analysis. With the advance in the field of finite element method, its applications in design study should extend progressively in the future. The present report, it is hoped, will contribute as references in similar analyses and at the same time help to understand the deformation and strain behaviors of structures. (Mori, K.)

  4. Periodic safety analyses; Les essais periodiques

    Energy Technology Data Exchange (ETDEWEB)

    Gouffon, A; Zermizoglou, R

    1990-12-01

    The IAEA Safety Guide 50-SG-S8, devoted to 'Safety Aspects of Foundations of Nuclear Power Plants', indicates that the operator of an NPP should establish a program for inspection of safe operation during construction, start-up and the service life of the plant, in order to obtain the data needed for estimating the lifetime of structures and components. At the same time, the program should ensure that the safety margins are appropriate. Periodic safety analyses are an important part of the safety inspection program. Periodic safety reports are a method for testing the whole system, or a part of the safety system, following precise criteria. Periodic safety analyses are not meant for qualification of the plant components. Separate analyses are devoted to: start-up, qualification of components and materials, and aging. All these analyses are described in this presentation. The last chapter describes the experience obtained for PWR-900 and PWR-1300 units from 1986-1989.

  5. A Simple, Reliable Precision Time Analyser

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, B. V.; Nargundkar, V. R.; Subbarao, K.; Kamath, M. S.; Eligar, S. K. [Atomic Energy Establishment Trombay, Bombay (India)

    1966-06-15

    A 30-channel time analyser is described. The time analyser was designed and built for pulsed neutron research but can be applied to other uses. Most of the logic is performed by means of ferrite memory core and transistor switching circuits. This leads to great versatility, low power consumption, extreme reliability and low cost. The analyser described provides channel widths from 10 μs to 10 ms; arbitrarily wider channels are easily obtainable. It can handle counting rates up to 2000 counts/min in each channel with less than 1% dead time loss. There is a provision for an initial delay equal to 100 channel widths. An input pulse de-randomizer unit using tunnel diodes ensures exactly equal channel widths. A brief description of the principles involved in core switching circuitry is given. The core-transistor transfer loop is compared with the usual core-diode loops and is shown to be more versatile and better adapted to the making of a time analyser. The circuits derived from the basic loop are described. These include the scale of ten, the frequency dividers and the delay generator. The current drivers developed for driving the cores are described. The crystal-controlled clock which controls the width of the time channels and synchronizes the operation of the various circuits is described. The detector pulse de-randomizer unit using tunnel diodes is described. The scheme of the time analyser is then described, showing how the various circuits can be integrated together to form a versatile time analyser. (author)
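
    The channel logic described above (equal-width channels following an initial delay) can be illustrated with a short arithmetic sketch; the channel width, channel count, delay and event times below are hypothetical examples, not instrument data.

    # Hypothetical parameters: 30 channels of 10 microseconds after an initial delay of 100 channel widths.
    channel_width = 10e-6          # seconds
    n_channels = 30
    initial_delay = 100 * channel_width

    def channel_of(t):
        """Return the channel index for an event at time t (seconds), or None if outside the window."""
        if t < initial_delay:
            return None
        index = int((t - initial_delay) // channel_width)
        return index if index < n_channels else None

    counts = [0] * n_channels
    for t in (1.005e-3, 1.012e-3, 1.013e-3, 1.29e-3, 2.0e-3):   # example event times in seconds
        ch = channel_of(t)
        if ch is not None:
            counts[ch] += 1
    print(counts[:3], counts[29])   # channels 0, 1 and 29 collect the in-window events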

  6. Fundamental data analyses for measurement control

    International Nuclear Information System (INIS)

    Campbell, K.; Barlich, G.L.; Fazal, B.; Strittmatter, R.B.

    1987-02-01

    A set of measurement control data analyses was selected for use by analysts responsible for maintaining the measurement quality of nuclear materials accounting instrumentation. The analyses consist of control charts for bias and precision and statistical tests used as analytic supplements to the control charts. They provide the desired detection sensitivity and yet can be interpreted locally, quickly, and easily. The control charts provide for visual inspection of data and enable an alert reviewer to spot problems, possibly before statistical tests detect them. The statistical tests are useful for automating the detection of departures from the controlled state or from the underlying assumptions (such as normality). 8 refs., 3 figs., 5 tabs
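
    A minimal sketch of the control-chart logic described above: estimate bias and precision from measurements of a control standard taken during an in-control baseline period, derive three-sigma limits, and flag later points that fall outside them. All values below are hypothetical.

    import statistics

    reference = 10.00
    baseline = [10.02, 9.98, 10.05, 9.97, 10.01, 9.99, 10.03]   # assumed in-control period
    new_points = [10.00, 10.06, 10.40]                          # subsequent control measurements

    center = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    bias = center - reference                        # estimated systematic error
    upper, lower = center + 3 * sd, center - 3 * sd  # Shewhart-style control limits

    flagged = [x for x in new_points if not (lower <= x <= upper)]
    print(f"bias={bias:+.3f}, precision (1 sd)={sd:.3f}, flagged={flagged}")   # 10.40 falls outside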

  7. A theoretical framework for analysing preschool teaching

    DEFF Research Database (Denmark)

    Chaiklin, Seth

    2014-01-01

    This article introduces a theoretical framework for analysing preschool teaching as a historically-grounded societal practice. The aim is to present a unified framework that can be used to analyse and compare both historical and contemporary examples of preschool teaching practice within and across...... national traditions. The framework has two main components, an analysis of preschool teaching as a practice, formed in relation to societal needs, and an analysis of the categorical relations which necessarily must be addressed in preschool teaching activity. The framework is introduced and illustrated...

  8. Power System Oscillatory Behaviors: Sources, Characteristics, & Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Follum, James D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tuffner, Francis K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dosiek, Luke A. [Union College, Schenectady, NY (United States); Pierre, John W. [Univ. of Wyoming, Laramie, WY (United States)

    2017-05-17

    This document is intended to provide a broad overview of the sources, characteristics, and analyses of natural and forced oscillatory behaviors in power systems. These aspects are necessarily linked. Oscillations appear in measurements with distinguishing characteristics derived from the oscillation’s source. These characteristics determine which analysis methods can be appropriately applied, and the results from these analyses can only be interpreted correctly with an understanding of the oscillation’s origin. To describe oscillations both at their source within a physical power system and within measurements, a perspective from the boundary between power system and signal processing theory has been adopted.
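
    As one concrete example of the analysis methods surveyed in such a report, a Welch periodogram is a common way to locate an oscillation's frequency in a synchrophasor-like measurement; the sketch below uses a synthetic signal with an assumed 30 samples/s reporting rate, not real grid data.

    import numpy as np
    from scipy.signal import welch

    fs = 30.0                                        # assumed reporting rate (samples/s)
    t = np.arange(0, 300, 1 / fs)                    # five minutes of synthetic data
    rng = np.random.default_rng(2)
    signal = (0.05 * np.sin(2 * np.pi * 0.7 * t)     # 0.7 Hz natural (inter-area-like) mode
              + 0.02 * np.sin(2 * np.pi * 5.0 * t)   # 5 Hz forced-like oscillation
              + 0.01 * rng.normal(size=t.size))      # measurement noise

    freqs, psd = welch(signal, fs=fs, nperseg=1024)
    print("dominant frequency (Hz):", freqs[np.argmax(psd)])   # expected near 0.7 Hz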

  9. Implementation and Validation of Artificial Intelligence Techniques for Robotic Surgery

    OpenAIRE

    Aarshay Jain; Deepansh Jagotra; Vijayant Agarwal

    2014-01-01

    The primary focus of this study is the implementation of an Artificial Intelligence (AI) technique for developing an inverse kinematics solution for the Raven-II surgical research robot [1]. First, the kinematic model of the Raven-II robot was analysed along with the proposed analytical solution [2] for the inverse kinematics problem. Next, an Artificial Neural Network (ANN) technique was implemented. The training data for the network was carefully selected by keeping manipulability constraints in mind...
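
    The general idea of learning an inverse kinematics mapping can be sketched on a toy planar two-link arm; this stand-in model, the joint limits and the scikit-learn network below are illustrative assumptions, not the Raven-II solution. Joint angles are sampled, end-effector positions are computed with the forward kinematics, and a small network is fitted to map positions back to angles.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Toy stand-in for the real manipulator: a planar 2-link arm with unit link lengths.
    def forward_kinematics(q):
        q1, q2 = q[..., 0], q[..., 1]
        x = np.cos(q1) + np.cos(q1 + q2)
        y = np.sin(q1) + np.sin(q1 + q2)
        return np.stack([x, y], axis=-1)

    rng = np.random.default_rng(0)
    # Assumed joint limits; restricting q2 > 0 keeps a single (elbow-up) solution branch.
    q_train = rng.uniform([0.0, 0.2], [np.pi / 2, np.pi - 0.2], size=(5000, 2))
    x_train = forward_kinematics(q_train)

    # The network maps workspace positions back to joint angles (approximate inverse kinematics).
    ik_net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
    ik_net.fit(x_train, q_train)

    q_pred = ik_net.predict(forward_kinematics(np.array([[0.8, 1.1]])))
    print(q_pred)   # should be roughly [0.8, 1.1] within the sampled workspace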

  10. Implementing a nationwide criteria-based emergency medical dispatch system

    DEFF Research Database (Denmark)

    Andersen, Mikkel S; Johnsen, Søren Paaske; Sørensen, Jan Nørtved

    2013-01-01

    A criteria-based nationwide Emergency Medical Dispatch (EMD) system was recently implemented in Denmark. We described the system and studied its ability to triage patients according to the severity of their condition by analysing hospital admission and case-fatality risks.
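
    The kind of analysis described above can be sketched as a grouped risk calculation; the dispatch records, emergency-level labels and outcomes below are hypothetical, purely to show the shape of the computation.

    import pandas as pd

    # Hypothetical dispatch records: assigned emergency level, hospital admission, death during follow-up.
    records = pd.DataFrame({
        "level":    ["A", "A", "A", "B", "B", "C", "C", "C", "C"],
        "admitted": [1,   1,   0,   1,   0,   0,   0,   1,   0],
        "died":     [1,   0,   0,   0,   0,   0,   0,   0,   0],
    })

    risks = records.groupby("level").agg(
        n=("admitted", "size"),
        admission_risk=("admitted", "mean"),
        case_fatality_risk=("died", "mean"),
    )
    print(risks)   # higher-acuity levels should show higher admission and fatality risks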

  11. The Implementation of European Regulation of the Financial Sector

    DEFF Research Database (Denmark)

    Jensen, Camilla Hørby; Legind, Nina Dietz

    2009-01-01

    The object of the article is to analyse how the ways of national implementation of different EU rules - especially MiFID and the consumer credit rules - affect consumer protection and how this harmonises with the objectives of the rules.

  12. Reasons for Implementing Movement in Kinetic Architecture

    Science.gov (United States)

    Cudzik, Jan; Nyka, Lucyna

    2017-10-01

    The paper gives insights into different forms of movement in contemporary architecture and examines them based on the reasons for their implementation. The main objective of the paper is to determine: the degree to which the complexity of kinematic architecture results from functional and spatial needs and what other motivations there are. The method adopted to investigate these questions involves theoretical studies and comparative analyses of architectural objects with different forms of movement imbedded in their structure. Using both methods allowed delving into reasons that lie behind the implementation of movement in contemporary kinetic architecture. As research shows, there is a constantly growing range of applications with kinematic solutions inserted in buildings’ structures. The reasons for their implementation are manifold and encompass pursuits of functional qualities, environmental performance, spatial effects, social interactions and new aesthetics. In those early projects based on simple mechanisms, the main motives were focused on functional values and in later experiments - on improving buildings’ environmental performance. Additionally, in recent proposals, a significant quest could be detected toward kinematic solutions that are focused on factors related to alternative aesthetics and innovative spatial effects. Research reveals that the more complicated form of movement, the more often the reason for its implementation goes beyond the traditionally understood “function”. However, research also shows that the effects resulting from investigations on spatial qualities of architecture and new aesthetics often appear to provide creative insights into new functionalities in architecture.

  13. Probabilistic and Nonprobabilistic Sensitivity Analyses of Uncertain Parameters

    Directory of Open Access Journals (Sweden)

    Sheng-En Fang

    2014-01-01

    Full Text Available Parameter sensitivity analyses have been widely applied to industrial problems for evaluating parameter significance, effects on responses, uncertainty influence, and so forth. In the interest of simple implementation and computational efficiency, this study has developed two sensitivity analysis methods corresponding to the situations with or without sufficient probability information. The probabilistic method is established with the aid of the stochastic response surface, and the mathematical derivation proves that the coefficients of the first-order terms embody the parameter main effects on the response. Simultaneously, a nonprobabilistic interval-analysis-based method is put forward for the circumstance in which the parameter probability distributions are unknown. The two methods have been verified against a numerical beam example, with their accuracy compared to that of a traditional variance-based method. The analysis results have demonstrated the reliability and accuracy of the developed methods, and their suitability for different situations has also been discussed.
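
    The two situations can be sketched with numpy on a toy response function (the function, sample size and parameter bounds below are illustrative assumptions, not the beam example): with samples available, a least-squares first-order surface is fitted and its coefficients are read off as main effects; without distributions, the parameter box is scanned to bound the response.

    import numpy as np

    # Toy model standing in for the structural response.
    def response(x):
        return 3.0 * x[..., 0] - 0.5 * x[..., 1] + 0.1 * x[..., 0] * x[..., 1]

    # Probabilistic case: first-order response surface; its first-order coefficients are the main effects.
    rng = np.random.default_rng(1)
    samples = rng.normal(0.0, 1.0, size=(500, 2))
    y = response(samples)
    design = np.column_stack([np.ones(len(samples)), samples])
    coeffs, *_ = np.linalg.lstsq(design, y, rcond=None)
    print("main effects (x1, x2):", coeffs[1:])          # roughly [3.0, -0.5]

    # Nonprobabilistic case: interval analysis by scanning the parameter box [-1, 1] x [-1, 1].
    lows, highs = np.array([-1.0, -1.0]), np.array([1.0, 1.0])
    grid = np.array(np.meshgrid(*[np.linspace(l, h, 11) for l, h in zip(lows, highs)]))
    values = response(np.moveaxis(grid, 0, -1))
    print("response interval:", values.min(), values.max())   # about [-3.6, 3.4] for this toy model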

  14. Adaption of the PARCS Code for Core Design Audit Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyong Chol; Lee, Young Jin; Uhm, Jae Beop; Kim, Hyunjik [Nuclear Safety Evaluation, Daejeon (Korea, Republic of); Jeong, Hun Young; Ahn, Seunghoon; Woo, Swengwoong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2013-05-15

    The eigenvalue calculation also includes quasi-static core depletion analyses. PARCS has implemented a variety of features and has been qualified as a regulatory audit code in conjunction with other NRC thermal-hydraulic codes such as TRACE or RELAP5. In this study, as an adaptation effort for audit applications, PARCS is applied to an audit analysis of a reload core design. The lattice physics code HELIOS is used for cross section generation. The PARCS-HELIOS code system has been established as a core analysis tool. Calculation results have been compared over a wide spectrum of quantities such as power distribution, critical soluble boron concentration, and rod worth. A reasonable agreement between the audit calculation and the reference results has been found.

  15. Analyses of bundle experiment data using MATRA-h

    Energy Technology Data Exchange (ETDEWEB)

    Lim, In Cheol; Chea, Hee Taek [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-06-01

    When the construction and operation license for HANARO was renewed in 1995, a 25% CHF penalty was imposed. The reason for this was that the validation work related to the CHF design calculation was not sufficient to assure the CHF margin. As part of the work to recover this CHF penalty, MATRA-h was developed by implementing new correlations for heat transfer, CHF prediction, and subcooled void into MATRA-a, which is KAERI's modified version of COBRA-IV-I. Using MATRA-h, subchannel analyses of the bundle experiment data were performed. From the comparison of the code predictions with the experimental results, it was found that the code gives conservative predictions as far as CHF in the bundle geometry is concerned. (author). 12 refs., 25 figs., 16 tabs.

  16. Environmental Implementation Plan

    International Nuclear Information System (INIS)

    1993-01-01

    The purpose of the Environmental Implementation Plan (EIP) is to show the current and future (five years) environmental plans from individual site organizations and divisions, as well as site environmental programs and initiatives which are designed to protect the environment and meet or exceed changing environmental/regulatory requirements. Communicating with site organizations, departments, and committees is essential in making the site's environmental-planning process work. The EIP gives the site the what, when, how, and why for environmental requirements. Through teamwork and proactive planning, a partnership for environmental excellence is formed to achieve the site vision for SRS to become the recognized model for Environmental Excellence in the Department of Energy's Nuclear Weapons Complex

  17. Implementing Demons and Ratchets

    Directory of Open Access Journals (Sweden)

    Peter M. Orem

    2017-01-01

    Full Text Available Experimental results show that ratchets may be implemented in semiconductor and chemical systems, bypassing the second law and opening up huge gains in energy production. This paper summarizes or describes experiments and results on systems that effect demons and ratchets operating in chemical or electrical domains. One creates temperature differences that can be harvested by a heat engine. A second produces light with only heat input. A third produces harvestable electrical potential directly. These systems share creating particles in one location, destroying them in another and moving them between locations by diffusion (Brownian motion. All absorb ambient heat as they produce other energy forms. None requires an external hot and cold side. The economic and social impacts of these conversions of ambient heat to work are, of course, well-understood and huge. The experimental results beg for serious work on the chance that they are valid.

  18. Implementing Software Defined Radio

    CERN Document Server

    Grayver, Eugene

    2013-01-01

    Software Defined Radio makes wireless communications easier, more efficient, and more reliable. This book bridges the gap between academic research and practical implementation. When beginning a project, practicing engineers, technical managers, and graduate students can save countless hours by considering the concepts presented in these pages. The author covers the myriad options and trade-offs available when selecting an appropriate hardware architecture. As demonstrated here, the choice between hardware- and software-centric architecture can mean the difference between meeting an aggressive schedule and bogging down in endless design iterations. Because of the author’s experience overseeing dozens of failed and successful developments, he is able to present many real-life examples. Some of the key concepts covered are: Choosing the right architecture for the market – laboratory, military, or commercial Hardware platforms – FPGAs, GPPs, specialized and hybrid devices Standardization efforts to ens...

  19. Priming a Pilot Implementation

    DEFF Research Database (Denmark)

    Hansen, Magnus Rotvit Perlt; Pedersen, Maria Ie

    2011-01-01

    Abstract. We report on the initial findings of an exploratory action research study about effects specifications using the systems development method Effects-driven IT development. It is part of a larger IS pilot implementation project conducted in the Danish healthcare sector. Through interviews and a workshop we have identified and specified effects that comprise the basis for an evaluation of the project between several organisational agents with diverse professional backgrounds. Gathering organisational participants at a workshop with a common goal of discussing and prioritizing a finished list of effects has proved to be a valuable approach to creating a mutual learning process amongst the participants and the facilitators of the workshop. The challenges we experienced during the effects specification process were to balance a dichotomous focus between on one hand the visions of the participants...

  20. Implementation of Emerging Technologies

    DEFF Research Database (Denmark)

    Barba, F. J.; Orlien, Vibeke; Mota, Maria J.

    2016-01-01

    Novel processing technologies have been gaining interest among food researchers due to their lower impact on nutritional and sensory properties of the products compared to the conventional thermal techniques. In this chapter some of the most well-studied emerging technologies (eg, high-pressure processing, pulsed electric fields, ohmic heating, microwave, and ultrasound) are briefly reviewed. Most of these technologies have found niche applications in the food industry, replacing or complementing conventional preservation technologies. Thereby, data on commercialization, energy, and microbial safety are presented and discussed with an ultimate goal to explore strategies for their implementation in the food industry. Novel thermal and nonthermal technologies have shown clear environmental benefits by improving the overall energy efficiency of the process and reducing the use of nonrenewable...

  1. Implementing AIDS Education

    Directory of Open Access Journals (Sweden)

    Grace C. Huerta

    1996-08-01

    Full Text Available The world has been challenged by the AIDS epidemic for 15 years. In 1985, the U.S. Department of Health and Human Services, Centers for Disease Control, allocated funds to all state departments of education to assist schools in the development of AIDS education policies and programs. Yet, these policies do not ensure that all students receive effective AIDS education. On September 21, 1991, the Arizona Legislature passed Senate Bill 1396, which requires public schools to annually provide AIDS education in grades K-12. The bill was rescinded in 1995. With prohibitive curriculum guidelines, limited teacher training opportunities and tremendous instructional demands, this educational policy was implemented in disparate forms. By examining the perspectives of the Arizona educators (representing three school districts), this qualitative study reveals how teachers ultimately controlled the delivery and nature of AIDS instruction based upon personal values, views of teacher roles, and their interpretation of the mandate itself.

  2. Environmental Implementation Plan

    Energy Technology Data Exchange (ETDEWEB)

    1993-03-15

    The purpose of the Environmental Implementation Plan (EIP) is to show the current and future (five years) environmental plans from individual site organizations and divisions, as well as site environmental programs and initiatives which are designed to protect the environment and meet or exceed changing environmental/regulatory requirements. Communicating with site organizations, departments, and committees is essential in making the site's environmental-planning process work. The EIP gives the site the what, when, how, and why for environmental requirements. Through teamwork and proactive planning, a partnership for environmental excellence is formed to achieve the site vision for SRS to become the recognized model for Environmental Excellence in the Department of Energy's Nuclear Weapons Complex.

  4. 10 CFR 61.13 - Technical analyses.

    Science.gov (United States)

    2010-01-01

    ... air, soil, groundwater, surface water, plant uptake, and exhumation by burrowing animals. The analyses... processes such as erosion, mass wasting, slope failure, settlement of wastes and backfill, infiltration through covers over disposal areas and adjacent soils, and surface drainage of the disposal site. The...

  5. Analysing Simple Electric Motors in the Classroom

    Science.gov (United States)

    Yap, Jeff; MacIsaac, Dan

    2006-01-01

    Electromagnetic phenomena and devices such as motors are typically unfamiliar to both teachers and students. To better visualize and illustrate the abstract concepts (such as magnetic fields) underlying electricity and magnetism, we suggest that students construct and analyse the operation of a simply constructed Johnson electric motor. In this…

  6. En kvantitativ metode til analyse af radio

    Directory of Open Access Journals (Sweden)

    Christine Lejre

    2014-06-01

    Full Text Available In the Danish as well as the international radio literature, proposed methods for analysing the radio medium are sparse. This is presumably because the radio medium is difficult to analyse: it is a medium that is not visualised in the form of images or supported by printed text. The purpose of this article is to describe a new quantitative method for the analysis of radio that takes particular account of the modality of the radio medium, namely sound structured as a linear progression in time. The method thereby supports the radio medium both as a medium in time and as a blind medium. The method was developed in connection with a comparative analysis of cultural programmes on P1 and Radio24syv carried out for Danmarks Radio. The article argues that the method is well suited to the analysis not only of radio but also of other media platforms and various journalistic subject areas.

  7. Analysing User Lifetime in Voluntary Online Collaboration

    DEFF Research Database (Denmark)

    McHugh, Ronan; Larsen, Birger

    2010-01-01

    This paper analyses persuasion in online collaboration projects. It introduces a set of heuristics that can be applied to such projects and combines these with a quantitative analysis of user activity over time. Two example sites are studied, Open Street Map and The Pirate Bay. Results show that ...

  8. Analyses of hydraulic performance of velocity caps

    DEFF Research Database (Denmark)

    Christensen, Erik Damgaard; Degn Eskesen, Mark Chr.; Buhrkall, Jeppe

    2014-01-01

    The hydraulic performance of a velocity cap has been investigated. Velocity caps are often used in connection with offshore intakes. CFD (computational fluid dynamics) examined the flow through the cap openings and further down into the intake pipes. This was combined with dimension analyses...

  9. Quantitative analyses of shrinkage characteristics of neem ...

    African Journals Online (AJOL)

    Quantitative analyses of shrinkage characteristics of neem (Azadirachta indica A. Juss.) wood were carried out. Forty five wood specimens were prepared from the three ecological zones of north eastern Nigeria, viz: sahel savanna, sudan savanna and guinea savanna for the research. The results indicated that the wood ...

  10. UMTS signal measurements with digital spectrum analysers

    International Nuclear Information System (INIS)

    Licitra, G.; Palazzuoli, D.; Ricci, A. S.; Silvi, A. M.

    2004-01-01

    The launch of the Universal Mobile Telecommunications System (UMTS), the most recent mobile telecommunications standard, has imposed the requirement of updating measurement instrumentation and methodologies. In order to define the most reliable measurement procedure, which is aimed at assessing the exposure to electromagnetic fields, modern spectrum analysers' features for correct signal characterisation have been reviewed. (authors)

  11. Hybrid Logical Analyses of the Ambient Calculus

    DEFF Research Database (Denmark)

    Bolander, Thomas; Hansen, Rene Rydhof

    2010-01-01

    In this paper, hybrid logic is used to formulate three control flow analyses for Mobile Ambients, a process calculus designed for modelling mobility. We show that hybrid logic is very well-suited to express the semantic structure of the ambient calculus and how features of hybrid logic can...

  12. Micromechanical photothermal analyser of microfluidic samples

    DEFF Research Database (Denmark)

    2014-01-01

    The present invention relates to a micromechanical photothermal analyser of microfluidic samples comprising an oblong micro-channel extending longitudinally from a support element, the micro-channel is made from at least two materials with different thermal expansion coefficients, wherein...

  13. Systematic review and meta-analyses

    DEFF Research Database (Denmark)

    Dreier, Julie Werenberg; Andersen, Anne-Marie Nybo; Berg-Beckhoff, Gabriele

    2014-01-01

    1990 were excluded. RESULTS: The available literature supported an increased risk of adverse offspring health in association with fever during pregnancy. The strongest evidence was available for neural tube defects, congenital heart defects, and oral clefts, in which meta-analyses suggested between a 1...

  14. Secundaire analyses organisatiebeleid psychosociale arbeidsbelasting (PSA)

    NARCIS (Netherlands)

    Kraan, K.O.; Houtman, I.L.D.

    2016-01-01

    The central questions of this study are what organisational policy on psychosocial workload (PSA) looks like in 2014 and how it relates to other policies and to outcome measures. The results of these in-depth analyses can benefit the ongoing campaign 'Check je

  15. Exergoeconomic and environmental analyses of CO

    NARCIS (Netherlands)

    Mosaffa, A. H.; Garousi Farshi, L; Infante Ferreira, C.A.; Rosen, M. A.

    2016-01-01

    Exergoeconomic and environmental analyses are presented for two CO2/NH3 cascade refrigeration systems equipped with (1) two flash tanks and (2) a flash tank along with a flash intercooler with indirect subcooler. A comparative study is performed for the proposed systems, and

  16. Meta-analyses on viral hepatitis

    DEFF Research Database (Denmark)

    Gluud, Lise L; Gluud, Christian

    2009-01-01

    This article summarizes the meta-analyses of interventions for viral hepatitis A, B, and C. Some of the interventions assessed are described in small trials with unclear bias control. Other interventions are supported by large, high-quality trials. Although attempts have been made to adjust...

  17. Multivariate differential analyses of adolescents' experiences of ...

    African Journals Online (AJOL)

    Aggression is reasoned to be dependent on aspects such as self-concept, moral reasoning, communication, frustration tolerance and family relationships. To analyse the data from questionnaires of 101 families (95 adolescents, 95 mothers and 91 fathers) Cronbach Alpha, various consecutive first and second order factor ...

  18. Chromosomal evolution and phylogenetic analyses in Tayassu ...

    Indian Academy of Sciences (India)

    Chromosome preparation and karyotype description. The material analysed consists of chromosome preparations of the tayassuid species T. pecari (three individuals) and. P. tajacu (four individuals) and were made from short-term lymphocyte cultures of whole blood samples using standard protocols (Chaves et al. 2002).

  19. Grey literature in meta-analyses.

    Science.gov (United States)

    Conn, Vicki S; Valentine, Jeffrey C; Cooper, Harris M; Rantz, Marilyn J

    2003-01-01

    In meta-analysis, researchers combine the results of individual studies to arrive at cumulative conclusions. Meta-analysts sometimes include "grey literature" in their evidential base, which includes unpublished studies and studies published outside widely available journals. Because grey literature is a source of data that might not employ peer review, critics have questioned the validity of its data and the results of meta-analyses that include it. To examine evidence regarding whether grey literature should be included in meta-analyses and strategies to manage grey literature in quantitative synthesis. This article reviews evidence on whether the results of studies published in peer-reviewed journals are representative of results from broader samplings of research on a topic as a rationale for inclusion of grey literature. Strategies to enhance access to grey literature are addressed. The most consistent and robust difference between published and grey literature is that published research is more likely to contain results that are statistically significant. Effect size estimates of published research are about one-third larger than those of unpublished studies. Unfunded and small sample studies are less likely to be published. Yet, importantly, methodological rigor does not differ between published and grey literature. Meta-analyses that exclude grey literature likely (a) over-represent studies with statistically significant findings, (b) inflate effect size estimates, and (c) provide less precise effect size estimates than meta-analyses including grey literature. Meta-analyses should include grey literature to fully reflect the existing evidential base and should assess the impact of methodological variations through moderator analysis.

  20. 78 FR 24290 - Furlough Implementation

    Science.gov (United States)

    2013-04-24

    ... American public and aviation industry of the FAA's Aviation Safety Office's (AVS) furlough implementation... to implement furloughs. AVS and its Services/Offices will implement the required 11 days of furlough beginning April 21, 2013 and continuing through September 30, 2013. AVS will continue to focus resources on...

  1. Material model for non-linear finite element analyses of large concrete structures

    NARCIS (Netherlands)

    Engen, Morten; Hendriks, M.A.N.; Øverli, Jan Arve; Åldstedt, Erik; Beushausen, H.

    2016-01-01

    A fully triaxial material model for concrete was implemented in a commercial finite element code. The only required input parameter was the cylinder compressive strength. The material model was suitable for non-linear finite element analyses of large concrete structures. The importance of including

  2. Educational Equality in China: Analysing Educational Policies for Migrant Children in Beijing

    Science.gov (United States)

    Liu, Shuiyun; Liu, Fuxing; Yu, Yafeng

    2017-01-01

    This paper focuses on the education of migrant children in Beijing. As of the late 1990s, the Chinese Government has developed several policies to address educational issues among migrant children. The present study analyses data from interviews with key education personnel in Beijing to explore the outcomes of the implementation of such migrant…

  3. Implementation of BNCT treatment planning procedures

    International Nuclear Information System (INIS)

    Capala, J.; Ma, R.; Diaz, A.Z.; Chanana, A.D.; Coderre, J.A.

    2001-01-01

    Estimation of radiation doses delivered during boron neutron capture therapy (BNCT) requires combining data on spatial distribution of both the thermal neutron fluence and the 10B concentration, as well as the relative biological effectiveness of various radiation dose components in the tumor and normal tissues. Using the treatment planning system created at Idaho National Engineering and Environmental Laboratory and the procedures we had developed for clinical trials, we were able to optimize the treatment position, safely deliver the prescribed BNCT doses, and carry out retrospective analyses and reviews. In this paper we describe the BNCT treatment planning process and its implementation in the ongoing dose escalation trials at Brookhaven National Laboratory. (author)
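
    A worked toy example of the dose combination described above: each absorbed dose component is multiplied by a biological weighting factor, and the boron component is scaled by the measured-to-planned 10B concentration ratio. The component doses, weighting factors and concentrations below are hypothetical illustration values, not clinical data.

    # Hypothetical component doses (Gy) and biological weighting factors.
    components = {                                   # name: (absorbed dose, weighting factor)
        "boron (10B capture)":            (5.0, 3.8),
        "thermal neutron (14N capture)":  (0.8, 3.2),
        "fast neutron":                   (0.6, 3.2),
        "gamma":                          (1.2, 1.0),
    }

    # Boron dose scales with the measured 10B concentration relative to the planning assumption.
    planned_b10, measured_b10 = 15.0, 18.0           # micrograms of 10B per gram of tissue (assumed)
    dose_b, w_b = components["boron (10B capture)"]
    components["boron (10B capture)"] = (dose_b * measured_b10 / planned_b10, w_b)

    weighted_dose = sum(d * w for d, w in components.values())
    print(f"total weighted dose = {weighted_dose:.1f} Gy-equivalent")   # 28.5 for these toy numbers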

  4. Thermal analyses. Information on the expected baking process; Thermische analyses. Informatie over een te verwachten bakgedrag

    Energy Technology Data Exchange (ETDEWEB)

    Van Wijck, H. [Stichting Technisch Centrum voor de Keramische Industrie TCKI, Velp (Netherlands)

    2009-09-01

    The design process and the drying process for architectural ceramics and pottery partly determine the characteristics of the final product, but the largest changes occur during the baking process. An overview is provided of the different thermal analyses and how the information from these analyses can predict the process in practice. (mk)

  5. Implementational issues in CACSD

    DEFF Research Database (Denmark)

    Torp, Steffen; Nørgård, Peter Magnus; Christensen, Anders

    1994-01-01

    The paper describes design considerations for a program for real-time testing of control algorithms in a laboratory environment. The algorithms are developed and tested using simulation in the MATLAB environment. The real-time code is built from the structure of the MATLAB script file using a matrix library with interface functions to MATLAB data files. Three real-time hardware platforms are analysed with respect to deriving a device-independent program structure, facilitating portability among the three platforms and supporting portability to new platforms. The three platforms are: a transputer-based system, an ADSP21020-based DSP system, and an MC 68030-based VME-bus system. The programming language is ANSI C...

  6. Assessing organizational implementation context in the education sector: confirmatory factor analysis of measures of implementation leadership, climate, and citizenship.

    Science.gov (United States)

    Lyon, Aaron R; Cook, Clayton R; Brown, Eric C; Locke, Jill; Davis, Chayna; Ehrhart, Mark; Aarons, Gregory A

    2018-01-08

    A substantial literature has established the role of the inner organizational setting on the implementation of evidence-based practices in community contexts, but very little of this research has been extended to the education sector, one of the most common settings for the delivery of mental and behavioral health services to children and adolescents. The current study examined the factor structure, psychometric properties, and interrelations of an adapted set of pragmatic organizational instruments measuring key aspects of the organizational implementation context in schools: (1) strategic implementation leadership, (2) strategic implementation climate, and (3) implementation citizenship behavior. The Implementation Leadership Scale (ILS), Implementation Climate Scale (ICS), and Implementation Citizenship Behavior Scale (ICBS) were adapted by a research team that included the original scale authors and experts in the implementation of evidence-based practices in schools. These instruments were then administered to a geographically representative sample (n = 196) of school-based mental/behavioral health consultants to assess the reliability and structural validity via a series of confirmatory factor analyses. Overall, the original factor structures for the ILS, ICS, and ICBS were confirmed in the current sample. The one exception was poor functioning of the Rewards subscale of the ICS, which was removed in the final ICS model. Correlations among the revised measures, evaluated as part of an overarching model of the organizational implementation context, indicated both unique and shared variance. The current analyses suggest strong applicability of the revised instruments to implementation of evidence-based mental and behavioral practices in the education sector. The one poorly functioning subscale (Rewards on the ICS) was attributed to typical educational policies that do not allow for individual financial incentives to personnel. Potential directions for

  7. [Implementing evidence and implementation research: two different and prime realities].

    Science.gov (United States)

    Rumbo Prieto, José María; Martínez Ques, Ángel Alfredo; Sobrido Prieto, María; Raña Lama, Camilo Daniel; Vázquez Campo, Miriam; Braña Marcos, Beatriz

    Scientific research can contribute to more efficient health care, enhance care quality and safety of persons. In order for this to happen, the knowledge gained must be put into practice. Implementation is known as the introduction of a change or innovation to daily practice, which requires effective communication and the elimination of barriers that hinder this process. Best practice implementation experiences are being used increasingly in the field of nursing. The difficulty in identifying the factors that indicate the success or failure of implementation has led to increased studies to build a body of differentiated knowledge, recognized as implementation science or implementation research. Implementation research is the scientific study whose objective is the adoption and systematic incorporation of research findings into clinical practice to improve the quality and efficiency of health services. The purpose of implementation research is to improve the health of the population through equitable and effective implementation of rigorously evaluated scientific knowledge, which involves gathering the evidence that has a positive impact on the health of the community. In this text, we set out the characteristics of nursing implementation research, providing a synthesis of different methods, theories, key frameworks and implementation strategies, along with the terminology proposed for greater conceptual clarity. Copyright © 2016 Elsevier España, S.L.U. All rights reserved.

  8. Understanding implementation in complex public organizations – implication for practice

    Directory of Open Access Journals (Sweden)

    Gry Cecilie Høiland

    2016-10-01

    Full Text Available The effective implementation of politically initiated public service innovations to the front-lines of the public service organization, where the innovation is to be applied, is a challenge that both practitioners and researchers struggle to solve. We highlight the importance of analysing contextual factors at several levels of the implementation system, as well as the importance of considering how the practical everyday work situations of the front-line workers influence their application of the innovation in question. We illustrate this by exploring the implementation process of a specific work inclusion measure, looking at its wider context and some of its implementation outcomes at a specific public agency. The intention is to illustrate the significance of considering the contextual complexity influencing implementation work as a reminder for practitioners to take this into account in their planning and practices.

  9. Analyses and characterization of double shell tank

    Energy Technology Data Exchange (ETDEWEB)

    1994-10-04

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily for an evaluation of its suitability to be safely processed through the evaporator. Such analyses should provide sufficient information regarding the waste composition to confidently determine whether constituent concentrations are within not only safe operating limits, but should also be relevant to functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams.

  10. DCH analyses using the CONTAIN code

    International Nuclear Information System (INIS)

    Hong, Sung Wan; Kim, Hee Dong

    1996-08-01

    This report describes CONTAIN analyses performed during participation in the project 'DCH issue resolution for ice condenser plants', which is sponsored by the NRC at SNL. Even though the calculations were performed for the ice condenser plant, the CONTAIN code has been used for analyses of many phenomena in PWR containments, and the DCH module can be applied to any plant type. The present ice condenser issue resolution effort was intended to provide guidance as to what might be needed to resolve DCH for ice condenser plants. It includes both a screening analysis and a scoping study if the screening analysis cannot provide a complete resolution. The following are the results concerning DCH loads, in descending order. 1. Availability of ignition sources prior to vessel breach 2. Availability and effectiveness of ice in the ice condenser 3. Loads modeling uncertainties related to co-ejected RPV water 4. Other loads modeling uncertainties 10 tabs., 3 figs., 14 refs. (Author)

  11. DCH analyses using the CONTAIN code

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Sung Wan; Kim, Hee Dong [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-08-01

    This report describes CONTAIN analyses performed during participation in the project 'DCH issue resolution for ice condenser plants', which is sponsored by the NRC at SNL. Even though the calculations were performed for the ice condenser plant, the CONTAIN code has been used for analyses of many phenomena in PWR containments, and the DCH module can be applied to any plant type. The present ice condenser issue resolution effort was intended to provide guidance as to what might be needed to resolve DCH for ice condenser plants. It includes both a screening analysis and a scoping study if the screening analysis cannot provide a complete resolution. The following are the results concerning DCH loads, in descending order. 1. Availability of ignition sources prior to vessel breach 2. Availability and effectiveness of ice in the ice condenser 3. Loads modeling uncertainties related to co-ejected RPV water 4. Other loads modeling uncertainties 10 tabs., 3 figs., 14 refs. (Author).

  12. Analyses and characterization of double shell tank

    International Nuclear Information System (INIS)

    1994-01-01

    Evaporator candidate feed from tank 241-AP-108 (108-AP) was sampled under prescribed protocol. Physical, inorganic, and radiochemical analyses were performed on tank 108-AP. Characterization of evaporator feed tank waste is needed primarily for an evaluation of its suitability to be safely processed through the evaporator. Such analyses should provide sufficient information regarding the waste composition to confidently determine whether constituent concentrations are within not only safe operating limits, but should also be relevant to functional limits for operation of the evaporator. Characterization of tank constituent concentrations should provide data which enable a prediction of where the types and amounts of environmentally hazardous waste are likely to occur in the evaporator product streams

  13. Soil analyses by ICP-MS (Review)

    International Nuclear Information System (INIS)

    Yamasaki, Shin-ichi

    2000-01-01

    Soil analyses by inductively coupled plasma mass spectrometry (ICP-MS) are reviewed. The first half of the paper is devoted to the development of techniques applicable to soil analyses, where diverse analytical parameters are carefully evaluated. However, the choice of soil samples is somewhat arbitrary, and only a limited number of samples (mostly reference materials) are examined. In the second half, efforts are mostly concentrated on the introduction of reports, where a large number of samples and/or very precious samples have been analyzed. Although the analytical techniques used in these reports are not necessarily novel, valuable information concerning such topics as background levels of elements in soils, chemical forms of elements in soils and behavior of elements in soil ecosystems and the environment can be obtained. The major topics discussed are total elemental analysis, analysis of radionuclides with long half-lives, speciation, leaching techniques, and isotope ratio measurements. (author)

  14. Sorption analyses in materials science: selected oxides

    International Nuclear Information System (INIS)

    Fuller, E.L. Jr.; Condon, J.B.; Eager, M.H.; Jones, L.L.

    1981-01-01

    Physical adsorption studies have been shown to be extremely valuable in studying the chemistry and structure of dispersed materials. Many processes rely on the access to the large amount of surface made available by the high degree of dispersion. Conversely, there are many applications where consolidation of the dispersed solids is required. Several systems (silica gel, alumina catalysts, mineralogic alumino-silicates, and yttrium oxide plasters) have been studied to show the type and amount of chemical and structural information that can be obtained. Some review of current theories is given and additional concepts are developed based on statistical and thermodynamic arguments. The results are applied to sorption data to show that detailed sorption analyses are extremely useful and can provide valuable information that is difficult to obtain by any other means. Considerable emphasis has been placed on data analyses and interpretation of a nonclassical nature to show the potential of such studies that is often not recognized nor utilized

  15. Standardized analyses of nuclear shipping containers

    International Nuclear Information System (INIS)

    Parks, C.V.; Hermann, O.W.; Petrie, L.M.; Hoffman, T.J.; Tang, J.S.; Landers, N.F.; Turner, W.D.

    1983-01-01

    This paper describes improved capabilities for analyses of nuclear fuel shipping containers within SCALE -- a modular code system for Standardized Computer Analyses for Licensing Evaluation. Criticality analysis improvements include the new KENO V, a code which contains an enhanced geometry package, and a new control module which uses KENO V and allows a criticality search on optimum pitch (maximum k-effective) to be performed. The SAS2 sequence is a new shielding analysis module which couples fuel burnup, source term generation, and radial cask shielding. The SAS5 shielding sequence allows a multidimensional Monte Carlo analysis of a shipping cask with code-generated biasing of the particle histories. The thermal analysis sequence (HTAS1) provides an easy-to-use tool for evaluating a shipping cask response to accident conditions. Together these sequences demonstrate the capability of the SCALE system to provide the cask designer or evaluator with a computational system offering automated procedures and easy-to-understand input that lead to standardization

  16. Quantitative Analyse und Visualisierung der Herzfunktionen

    Science.gov (United States)

    Sauer, Anne; Schwarz, Tobias; Engel, Nicole; Seitel, Mathias; Kenngott, Hannes; Mohrhardt, Carsten; Loßnitzer, Dirk; Giannitsis, Evangelos; Katus, Hugo A.; Meinzer, Hans-Peter

    Computer-assisted, image-based analysis of cardiac function is now standard in cardiology. The available products usually require a high degree of user interaction and therefore an increased expenditure of time. This work presents an approach that offers the cardiologist a largely automatic analysis of cardiac function from MRI image data, thereby saving time. All relevant cardio-physiological parameters are computed and visualised by means of diagrams and graphs. These computations are evaluated by comparing the derived values with manually measured ones. The resulting mean error of 2.85 mm for wall thickness and 1.61 mm for wall thickening is still within the range of one pixel size of the images used.
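
    As a simple illustration of one of the parameters mentioned above, the sketch below computes myocardial wall thickness and percentage wall thickening from segmented endocardial and epicardial contour radii; the numbers and the radial-sampling scheme are hypothetical, not taken from the cited work.

        # Illustrative wall-thickness / wall-thickening calculation (hypothetical data).
        import numpy as np

        def wall_thickness(endo_radii, epi_radii):
            """Thickness per radial sampling direction: epicardial minus endocardial radius."""
            return np.asarray(epi_radii) - np.asarray(endo_radii)

        # Hypothetical contour radii (mm) sampled along the same radial directions.
        endo_dia, epi_dia = np.array([22.0, 23.1, 21.5]), np.array([31.0, 32.4, 30.2])  # end-diastole
        endo_sys, epi_sys = np.array([16.5, 17.0, 16.1]), np.array([29.8, 30.9, 29.0])  # end-systole

        wt_dia = wall_thickness(endo_dia, epi_dia)
        wt_sys = wall_thickness(endo_sys, epi_sys)
        thickening_pct = 100.0 * (wt_sys - wt_dia) / wt_dia   # regional wall thickening (%)

        print("wall thickness (ED), mm:", wt_dia)
        print("wall thickening, %:", np.round(thickening_pct, 1))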

  17. Exergetic and thermoeconomic analyses of power plants

    International Nuclear Information System (INIS)

    Kwak, H.-Y.; Kim, D.-J.; Jeon, J.-S.

    2003-01-01

    Exergetic and thermoeconomic analyses were performed for a 500-MW combined cycle plant. In these analyses, mass and energy conservation laws were applied to each component of the system. Quantitative balances of the exergy and exergetic cost for each component, and for the whole system, were carefully considered. The exergoeconomic model, which represents the productive structure of the system considered, was used to visualize the cost formation process and the productive interaction between components. The computer program developed in this study can determine the production costs of power plants, such as gas- and steam-turbine plants and gas-turbine cogeneration plants. The program can also be used to study plant characteristics, namely, thermodynamic performance and sensitivity to changes in process and/or component design variables
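
    The component-level exergy balance underlying such analyses can be sketched numerically. The example below computes the specific flow exergy e = (h - h0) - T0 (s - s0) and the exergy destruction of a single adiabatic steady-flow component from its inlet and outlet states; the state values are hypothetical and kinetic and potential terms are neglected.

        # Flow-exergy and exergy-destruction sketch for one adiabatic steady-flow
        # component (hypothetical state data; kinetic/potential exergy neglected).
        T0 = 298.15          # dead-state temperature, K

        def flow_exergy(h, s, h0, s0, T0=T0):
            """Specific flow exergy e = (h - h0) - T0*(s - s0), in kJ/kg."""
            return (h - h0) - T0 * (s - s0)

        # Dead state and hypothetical inlet/outlet states (kJ/kg, kJ/kg.K).
        h0, s0 = 104.9, 0.367
        h_in,  s_in  = 3230.9, 6.921   # e.g. turbine inlet
        h_out, s_out = 2675.5, 7.360   # e.g. turbine outlet

        m_dot = 100.0                                  # mass flow, kg/s
        e_in  = flow_exergy(h_in,  s_in,  h0, s0)
        e_out = flow_exergy(h_out, s_out, h0, s0)
        w_dot = m_dot * (h_in - h_out)                 # shaft power, kW (adiabatic)
        ex_destroyed = m_dot * (e_in - e_out) - w_dot  # exergy destruction, kW

        print(f"power output      : {w_dot:.0f} kW")
        print(f"exergy destruction: {ex_destroyed:.0f} kW")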

  18. Pratique de l'analyse fonctionelle

    CERN Document Server

    Tassinari, Robert

    1997-01-01

    Developing a product or service that is perfectly suited to the customer's needs and requirements is essential for a company. To leave nothing to chance, a rigorous methodology must be followed: that of functional analysis. This book defines the method precisely, as well as its fields of application. It describes the most effective methods in terms of product design and the pursuit of quality, and introduces the notion of internal functional analysis. A key work for optimising product design processes within a company. -- Key ideas, by Business Digest

  19. Kinetic stability analyses in a bumpy cylinder

    International Nuclear Information System (INIS)

    Dominguez, R.R.; Berk, H.L.

    1981-01-01

    Recent interest in the ELMO Bumpy Torus (EBT) has prompted a number of stability analyses of both the hot electron rings and the toroidal plasma. Typically these works employ the local approximation, neglecting radial eigenmode structure and ballooning effects to perform the stability analysis. In the present work we develop a fully kinetic formalism for performing nonlocal stability analyses in a bumpy cylinder. We show that the Vlasov-Maxwell integral equations (with one ignorable coordinate) are self-adjoint and hence amenable to analysis using numerical techniques developed for self-adjoint systems of equations. The representation we obtain for the kernel of the Vlasov-Maxwell equations is a differential operator of arbitrarily high order. This form leads to a manifestly self-adjoint system of differential equations for long wavelength modes

  20. Sectorial Group for Incident Analyses (GSAI)

    International Nuclear Information System (INIS)

    Galles, Q.; Gamo, J. M.; Jorda, M.; Sanchez-Garrido, P.; Lopez, F.; Asensio, L.; Reig, J.

    2013-01-01

    In 2008, the UNESA Nuclear Energy Committee (CEN) proposed the creation of a working group formed by experts from all Spanish NPPs with the purpose of jointly analyzing relevant incidents that occurred in each of the plants. This initiative was a response to a historical situation in which the exchange of information on incidents between the Spanish NPPs was below the desired level. In June 2009, UNESA's Guide CEN-29 established the performance criteria for the so-called Sectorial Group for Incident Analyses (GSAI), whose activity would be coordinated by UNESA's Group of Operating Experience, under the Operations Commission (COP). (Author)

  1. Analyses of cavitation instabilities in ductile metals

    DEFF Research Database (Denmark)

    Tvergaard, Viggo

    2007-01-01

    Cavitation instabilities have been predicted for a single void in a ductile metal stressed under high triaxiality conditions. In experiments for a ceramic reinforced by metal particles a single dominant void has been observed on the fracture surface of some of the metal particles bridging a crack......, and also tests for a thin ductile metal layer bonding two ceramic blocks have indicated rapid void growth. Analyses for these material configurations are discussed here. When the void radius is very small, a nonlocal plasticity model is needed to account for observed size-effects, and recent analyses......, while the surrounding voids are represented by a porous ductile material model in terms of a field quantity that specifies the variation of the void volume fraction in the surrounding metal....

  2. Analysing organic transistors based on interface approximation

    International Nuclear Information System (INIS)

    Akiyama, Yuto; Mori, Takehiko

    2014-01-01

    Temperature-dependent characteristics of organic transistors are analysed thoroughly using interface approximation. In contrast to amorphous silicon transistors, it is characteristic of organic transistors that the accumulation layer is concentrated on the first monolayer, and it is appropriate to consider interface charge rather than band bending. On the basis of this model, observed characteristics of hexamethylenetetrathiafulvalene (HMTTF) and dibenzotetrathiafulvalene (DBTTF) transistors with various surface treatments are analysed, and the trap distribution is extracted. In turn, starting from a simple exponential distribution, we can reproduce the temperature-dependent transistor characteristics as well as the gate voltage dependence of the activation energy, so we can investigate various aspects of organic transistors self-consistently under the interface approximation. Small deviation from such an ideal transistor operation is discussed assuming the presence of an energetically discrete trap level, which leads to a hump in the transfer characteristics. The contact resistance is estimated by measuring the transfer characteristics up to the linear region
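
    One routinely extracted quantity in such temperature-dependent analyses is the activation energy of the channel current. A minimal sketch of an Arrhenius fit, assuming hypothetical current-versus-temperature data at a fixed gate voltage, is given below; it is not the authors' interface-approximation code.

        # Arrhenius fit of temperature-dependent drain current: I = I0 * exp(-Ea / kT).
        # Hypothetical data; illustrates extraction of an activation energy only.
        import numpy as np

        k_B = 8.617e-5                                          # Boltzmann constant, eV/K
        T = np.array([200.0, 225.0, 250.0, 275.0, 300.0])       # temperature, K
        I = np.array([2.1e-9, 8.4e-9, 2.6e-8, 6.6e-8, 1.4e-7])  # drain current, A

        # Linear fit of ln(I) against 1/kT; the slope is -Ea.
        x = 1.0 / (k_B * T)
        slope, intercept = np.polyfit(x, np.log(I), 1)
        Ea = -slope                                             # activation energy, eV
        I0 = np.exp(intercept)

        print(f"activation energy Ea = {Ea*1000:.0f} meV, prefactor I0 = {I0:.2e} A")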

  3. New environmental metabarcodes for analysing soil DNA

    DEFF Research Database (Denmark)

    Epp, Laura S.; Boessenkool, Sanne; Bellemain, Eva P.

    2012-01-01

    Metabarcoding approaches use total and typically degraded DNA from environmental samples to analyse biotic assemblages and can potentially be carried out for any kinds of organisms in an ecosystem. These analyses rely on specific markers, here called metabarcodes, which should be optimized... for taxonomic resolution, minimal bias in amplification of the target organism group and short sequence length. Using bioinformatic tools, we developed metabarcodes for several groups of organisms: fungi, bryophytes, enchytraeids, beetles and birds. The ability of these metabarcodes to amplify the target groups... was systematically evaluated by (i) in silico PCRs using all standard sequences in the EMBL public database as templates, (ii) in vitro PCRs of DNA extracts from surface soil samples from a site in Varanger, northern Norway and (iii) in vitro PCRs of DNA extracts from permanently frozen sediment samples of late...
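
    The in silico PCR step mentioned in (i) amounts to scanning reference sequences for near-exact matches to both primers. The sketch below shows the idea with a naive mismatch-tolerant search; the primer and template sequences are hypothetical toy examples, and real evaluations use dedicated tools (e.g. ecoPCR) against the full EMBL database.

        # Naive in-silico PCR sketch: find primer binding sites allowing a few mismatches.
        # Primer and template sequences are hypothetical toy examples.

        def match_positions(template, primer, max_mismatches=2):
            """Return start positions where the primer binds the template with <= max_mismatches."""
            hits = []
            for i in range(len(template) - len(primer) + 1):
                window = template[i:i + len(primer)]
                mismatches = sum(a != b for a, b in zip(window, primer))
                if mismatches <= max_mismatches:
                    hits.append(i)
            return hits

        template = "ACGTTGCATCGGATCCGGTTAACGGATCGTTACGGAATTCCGA"
        fwd_primer = "GCATCGGATC"     # hypothetical forward primer
        rev_primer = "GGAATTCCGA"     # hypothetical reverse primer (already reverse-complemented)

        fwd_hits = match_positions(template, fwd_primer)
        rev_hits = match_positions(template, rev_primer)
        if fwd_hits and rev_hits:
            amplicon_len = (rev_hits[-1] + len(rev_primer)) - fwd_hits[0]
            print(f"predicted amplicon of {amplicon_len} bp")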

  4. Visuelle Analyse von E-mail-Verkehr

    OpenAIRE

    Mansmann, Florian

    2003-01-01

    This work describes methods for the visual geographic analysis of e-mail traffic. Host addresses and IP addresses can be filtered out of the header of an e-mail. Using a database, geographic coordinates are assigned to these host and IP addresses. A visualisation displays several thousand e-mail routes in a clearly arranged manner. In addition, interactive manipulation facilities were presented, which allow a visual exploration of the data...

  5. En Billig GPS Data Analyse Platform

    DEFF Research Database (Denmark)

    Andersen, Ove; Christiansen, Nick; Larsen, Niels T.

    2011-01-01

    This article presents a complete software platform for the analysis of GPS data. The platform is built exclusively from open-source components. The individual components of the platform are described in detail. Advantages and disadvantages of using open source are discussed, including which IT-policy initiatives... organisations with a digital road map and GPS data can begin to carry out traffic analyses on these data. It is a requirement that adequate IT competences are present in the organisation...

  6. Neuronal network analyses: premises, promises and uncertainties

    OpenAIRE

    Parker, David

    2010-01-01

    Neuronal networks assemble the cellular components needed for sensory, motor and cognitive functions. Any rational intervention in the nervous system will thus require an understanding of network function. Obtaining this understanding is widely considered to be one of the major tasks facing neuroscience today. Network analyses have been performed for some years in relatively simple systems. In addition to the direct insights these systems have provided, they also illustrate some of the difficulties...

  7. Modelling and analysing oriented fibrous structures

    International Nuclear Information System (INIS)

    Rantala, M; Lassas, M; Siltanen, S; Sampo, J; Takalo, J; Timonen, J

    2014-01-01

    A mathematical model for fibrous structures using a direction dependent scaling law is presented. The orientation of fibrous nets (e.g. paper) is analysed with a method based on the curvelet transform. The curvelet-based orientation analysis has been tested successfully on real data from paper samples: the major directions of fibre orientation can apparently be recovered. Similar results are achieved in tests on data simulated by the new model, allowing a comparison with ground truth
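
    The orientation analysis described above uses the curvelet transform; as a rough stand-in that requires no specialised library, the sketch below estimates the dominant fibre direction of an image from the angular distribution of its 2-D Fourier spectrum. This is a simplification under stated assumptions, not the authors' curvelet-based method.

        # Crude orientation estimate from the angular energy distribution of the 2-D FFT
        # (a simple stand-in for the curvelet-based analysis described in the abstract).
        import numpy as np

        def dominant_orientation(image, n_bins=180):
            """Return the dominant stripe/fibre orientation of a 2-D array, in degrees."""
            spec = np.abs(np.fft.fftshift(np.fft.fft2(image - image.mean()))) ** 2
            ny, nx = image.shape
            y, x = np.indices((ny, nx))
            angles = np.degrees(np.arctan2(y - ny // 2, x - nx // 2)) % 180.0
            hist, edges = np.histogram(angles, bins=n_bins, range=(0, 180), weights=spec)
            peak = 0.5 * (edges[:-1] + edges[1:])[np.argmax(hist)]
            # Spectral energy lies along the wave vector, perpendicular to the stripes.
            return (peak + 90.0) % 180.0

        # Synthetic test image: stripes running at roughly 30 degrees
        # (wave vector at 120 degrees, wavelength 12 pixels).
        yy, xx = np.mgrid[0:256, 0:256]
        k = np.radians(120.0)
        img = np.sin(2 * np.pi * (xx * np.cos(k) + yy * np.sin(k)) / 12.0)
        print(f"estimated dominant orientation: {dominant_orientation(img):.1f} degrees")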

  8. Kinematic gait analyses in healthy Golden Retrievers

    OpenAIRE

    Silva, Gabriela C.A.; Cardoso, Mariana Trés; Gaiad, Thais P.; Brolio, Marina P.; Oliveira, Vanessa C.; Assis Neto, Antonio; Martins, Daniele S.; Ambrósio, Carlos E.

    2014-01-01

    Kinematic analysis concerns the relative movement between rigid bodies and finds application in gait analysis and other body movements; the interpretation of its data when a change is present guides the choice of treatment to be instituted. The objective of this study was to standardise the gait of healthy Golden Retriever dogs to assist in the diagnosis and treatment of musculoskeletal disorders. We used a kinematic analysis system to analyse the gait of seven female Golden Retriever dogs,...

  9. Evaluation of periodic safety status analyses

    International Nuclear Information System (INIS)

    Faber, C.; Staub, G.

    1997-01-01

    In order to carry out the evaluation of safety status analyses by the safety assessor within the periodic safety reviews of nuclear power plants, safety-goal-oriented requirements have been formulated together with complementary evaluation criteria. Their application in an interdisciplinary cooperation covering the subject areas involved facilitates a complete safety-goal-oriented assessment of the plant status. The procedure is outlined briefly by an example for the safety goal 'reactivity control' for BWRs. (orig.) [de

  10. Environmental Implementation Plan

    International Nuclear Information System (INIS)

    1994-02-01

    The Environmental Implementation Plan (EIP) is a dynamic long-range environmental-protection plan for SRS. The EIP communicates the current and future (five year) environmental plans from individual organizations and divisions as well as site environmental initiatives which are designed to protect the environment and meet or exceed compliance with changing environmental/ regulatory requirements. Communication with all site organizations is essential for making the site environmental planning process work. Demonstrating environmental excellence is a high priority embodied in DOE and WSRC policy. Because of your support and participation in the three EIP initiatives; Reflections, Sectional Revision, and Integrated Planning, improvements are being made to the EIP and SRS environmental protection programs. I appreciate the ''Partnership in Environmental Excellence'' formed by the environmental coordinators and professionals who work daily toward our goal of compliance and environmental excellence. I look forward to seeing continued success and improvement in our environmental protection programs through combined efforts of all site organizations to protect our employees, the public health, and the environment. Together, we will achieve our site vision for SRS to be the recognized model for Environmental Excellence in the DOE Nuclear Weapons Complex

  11. Future of fusion implementation

    International Nuclear Information System (INIS)

    Beardsworth, E.; Powell, J.R.

    1978-01-01

    For fusion to become available for commercial use in the 21st century, R and D must be undertaken now. But it is hard to justify these expenditures with a cost/benefit-oriented assessment methodology, because of both the time-frame and the uncertainty of the future benefits. Focusing on the factors most relevant for current consideration of fusion's commercial prospects, i.e., consumption levels and the outcomes for fission, solar, and coal, many possible futures of the US energy system are posited and analyzed under various assumptions about costs. The Reference Energy System approach was modified to establish both an appropriate degree of detail and explicit time dependence, and a computer code was used to organize the relevant data and to perform calculations of system cost (annual and discounted present value), resource use, and residuals that are implied by the consumption levels and technology mix in each scenario. Scenarios that are not unreasonable indicate benefits, in the form of direct cost savings attributable to the implementation of fusion, which may well exceed R and D costs
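
    The scenario comparison described above rests on standard discounting of annual system costs. A minimal sketch of the discounted-present-value calculation used to compare such scenarios is shown below; the cost streams and discount rate are hypothetical.

        # Discounted present value of an annual system-cost stream (hypothetical numbers).
        def present_value(annual_costs, discount_rate, base_year=0):
            """Sum of annual costs discounted back to the base year."""
            return sum(c / (1.0 + discount_rate) ** (t - base_year)
                       for t, c in enumerate(annual_costs, start=base_year))

        # Hypothetical annual energy-system costs (billion $) for two scenarios.
        with_fusion    = [100, 102, 103, 103, 104]
        without_fusion = [100, 103, 105, 107, 110]

        r = 0.05
        saving = present_value(without_fusion, r) - present_value(with_fusion, r)
        print(f"discounted saving attributable to fusion: {saving:.1f} billion $")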

  12. BUC implementation in Slovakia

    International Nuclear Information System (INIS)

    Chrapciak, V.; Vaclav, J.

    2009-01-01

    Improved calculation methods allow one to take credit for the reactivity reduction associated with fuel burnup. This means reducing the analysis conservatism while maintaining an adequate criticality safety margin. Application of burnup credit requires knowledge of the reactivity state of the irradiated fuel for which credit is taken. The isotopic inventory and reactivity have to be calculated with validated codes. In Slovakia we use Gd2 fuel with a maximal fuel-pin enrichment of 4.4%. Our transport and storage basket KZ-48 with boron steel is licensed for fresh fuel with an enrichment of 4.4%. In the near future (2011 or 2012) we will use a new fuel with a maximal fuel-pin enrichment of 4.9%. For this fuel we plan to use the existing KZ-48 with application of burnup credit. In cooperation with the Slovak Nuclear Regulatory Authority we started, several years ago, the process of burnup credit implementation in Slovakia for WWER-440 reactors. We have already prepared a methodology in line with the IAEA methodology. We have validated computational systems (SCALE 5.1 already, SCALE 6 in progress). The Slovak Nuclear Regulatory Authority will prepare a regulation on burnup credit application in Slovakia. The last item is the preparation of safety reports (for transport and storage) for the new fuel with an average enrichment of 4.87% in the KZ-48 basket with application of burnup credit. (Authors)

  13. BUC implementation in Slovakia

    International Nuclear Information System (INIS)

    Chrapciak, V.; Vaclav, J.

    2009-01-01

    Improved calculation methods allow one to take credit for the reactivity reduction associated with fuel burnup. This means reducing the analysis conservatism while maintaining an adequate criticality safety margin. Application of burnup credit (BUC) requires knowledge of the reactivity state of the irradiated fuel for which BUC is taken. The isotopic inventory and reactivity have to be calculated with validated codes. In Slovakia we use Gd2 fuel with a maximal fuel-pin enrichment of 4.4%. Our transport and storage basket KZ-48 with boron steel is licensed for fresh fuel with an enrichment of 4.4%. In the near future (2011 or 2012) we will use a new fuel with a maximal fuel-pin enrichment of 4.9%. For this fuel we plan to use the existing KZ-48 with BUC application. In cooperation with the Slovak Nuclear Regulatory Authority (UJD) we started, several years ago, the process of BUC implementation in Slovakia for VVER-440 reactors. We have already prepared a methodology in line with the IAEA methodology. We have validated computational systems (SCALE 5.1 already, SCALE 6 in progress). UJD will prepare a regulation on BUC application in Slovakia. The last item is the preparation of safety reports (for transport and storage) for the new fuel with an average enrichment of 4.87% in the KZ-48 basket with BUC application.

  14. Interim district energy implementation

    Energy Technology Data Exchange (ETDEWEB)

    Fearnley, R.; Susak, W. [City of Vancouver, BC (Canada); Johnstone, I. [BCG Services Inc., Vancouver, BC (Canada)

    2001-07-01

    The concept of district energy was introduced in the City of North Vancouver, a city of 45,000, in 1997. A preliminary study was completed in 1997, followed by a tour of some district energy facilities in Finland in the same year. In 1999 a large district energy study was completed by a consultant. The study indicated the need for an investment of $15 million to implement district heating in the City. Owing to a lack of sufficient financial resources and of immediately connectable heat load, the project was considered a non-starter. Some of the other factors leading to shelving the project included no current significant pricing advantages over competing energy sources and no current opportunity for cogeneration, given the low price that BC Hydro is willing to pay for independently produced power. The project, although shelved for the moment, has not been discarded. Planning and exploration are continuing, aided by the City's commitment to energy efficiency and conservation, its long-term planning horizon and its significant influence over the development of some prime real estate.

  15. Environmental Implementation Plan

    Energy Technology Data Exchange (ETDEWEB)

    1994-02-01

    The Environmental Implementation Plan (EIP) is a dynamic long-range environmental-protection plan for SRS. The EIP communicates the current and future (five year) environmental plans from individual organizations and divisions as well as site environmental initiatives which are designed to protect the environment and meet or exceed compliance with changing environmental/ regulatory requirements. Communication with all site organizations is essential for making the site environmental planning process work. Demonstrating environmental excellence is a high priority embodied in DOE and WSRC policy. Because of your support and participation in the three EIP initiatives; Reflections, Sectional Revision, and Integrated Planning, improvements are being made to the EIP and SRS environmental protection programs. I appreciate the "Partnership in Environmental Excellence" formed by the environmental coordinators and professionals who work daily toward our goal of compliance and environmental excellence. I look forward to seeing continued success and improvement in our environmental protection programs through combined efforts of all site organizations to protect our employees, the public health, and the environment. Together, we will achieve our site vision for SRS to be the recognized model for Environmental Excellence in the DOE Nuclear Weapons Complex.

  16. Application of RUNTA code in flood analyses

    International Nuclear Information System (INIS)

    Perez Martin, F.; Benitez Fonzalez, F.

    1994-01-01

    Flood probability analyses carried out to date indicate the need to evaluate a large number of flood scenarios. This necessity is due to a variety of reasons, the most important of which include: a large number of potential flood sources; a wide variety of characteristics of flood sources; a large possibility of flood-affected areas becoming interlinked, depending on the location of the potential flood sources; a diversity of flood flows from one flood source, depending on the size of the rupture and mode of operation; the isolation times applicable; uncertainties in respect of the structural resistance of doors, penetration seals and floors; and the applicable degrees of obstruction of the floor drainage system. Consequently, a tool which carries out the large number of calculations usually required in flood analyses, with speed and flexibility, is considered necessary. The RUNTA code enables the range of possible scenarios to be calculated numerically, in accordance with all those parameters which, as a result of previous flood analyses, it is necessary to take into account in order to cover all the possible floods associated with each flood area

  17. An analyser for power plant operations

    International Nuclear Information System (INIS)

    Rogers, A.E.; Wulff, W.

    1990-01-01

    Safe and reliable operation of power plants is essential. Power plant operators need a forecast of what the plant will do when its current state is disturbed. The in-line plant analyser provides precisely this information at relatively low cost. The plant analyser scheme uses a mathematical model of the dynamic behaviour of the plant to establish a numerical simulation. Over a period of time, the simulation is calibrated with measurements from the particular plant in which it is used. The analyser then provides a reference against which to evaluate the plant's current behaviour. It can be used to alert the operator to any atypical excursions or combinations of readings that indicate malfunction or off-normal conditions that, as the Three Mile Island event suggests, are not easily recognised by operators. In a look-ahead mode, it can forecast the behaviour resulting from an intended change in settings or operating conditions. Then, when such changes are made, the plant's behaviour can be tracked against the forecast in order to assure that the plant is behaving as expected. It can be used to investigate malfunctions that have occurred and test possible adjustments in operating procedures. Finally, it can be used to consider how far from the limits of performance the elements of the plant are operating. Then by adjusting settings, the required power can be generated with as little stress as possible on the equipment. (6 figures) (Author)

  18. A review of multivariate analyses in imaging genetics

    Directory of Open Access Journals (Sweden)

    Jingyu eLiu

    2014-03-01

    Full Text Available Recent advances in neuroimaging technology and molecular genetics provide the unique opportunity to investigate genetic influence on the variation of brain attributes. Since the year 2000, when the initial publication on brain imaging and genetics was released, imaging genetics has been a rapidly growing research approach with increasing publications every year. Several reviews have been offered to the research community focusing on various study designs. In addition to study design, analytic tools and their proper implementation are also critical to the success of a study. In this review, we survey recent publications using data from neuroimaging and genetics, focusing on methods capturing multivariate effects accommodating the large number of variables from both imaging data and genetic data. We group the analyses of genetic or genomic data into either an a priori-driven or a data-driven approach, including gene-set enrichment analysis, multifactor dimensionality reduction, principal component analysis, independent component analysis (ICA), and clustering. For the analyses of imaging data, ICA and extensions of ICA are the most widely used multivariate methods. Given detailed reviews of multivariate analyses of imaging data available elsewhere, we provide a brief summary here that includes a recently proposed method known as independent vector analysis. Finally, we review methods focused on bridging the imaging and genetic data by establishing multivariate and multiple genotype-phenotype associations, including sparse partial least squares, sparse canonical correlation analysis, sparse reduced rank regression and parallel ICA. These methods are designed to extract latent variables from both genetic and imaging data, which become new genotypes and phenotypes, and the links between the new genotype-phenotype pairs are maximized using different cost functions. The relationship between these methods along with their assumptions, advantages, and
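
    Several of the data-driven methods listed above are available in standard libraries. The sketch below runs ICA on a hypothetical, randomly generated imaging matrix and a plain (non-sparse) CCA between imaging and genetic matrices using scikit-learn; it illustrates the kind of decomposition discussed, not the specific sparse variants reviewed.

        # ICA on an "imaging" matrix and CCA between imaging and "genetic" matrices,
        # using scikit-learn on randomly generated placeholder data.
        import numpy as np
        from sklearn.decomposition import FastICA
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(0)
        n_subjects = 100
        imaging = rng.standard_normal((n_subjects, 500))   # e.g. voxel/ROI features
        genetic = rng.standard_normal((n_subjects, 200))   # e.g. SNP dosages

        # Independent component analysis of the imaging data.
        ica = FastICA(n_components=10, random_state=0)
        imaging_sources = ica.fit_transform(imaging)       # subject-by-component loadings

        # Plain (non-sparse) canonical correlation between imaging and genetic data.
        cca = CCA(n_components=2)
        img_scores, gen_scores = cca.fit_transform(imaging, genetic)
        r1 = np.corrcoef(img_scores[:, 0], gen_scores[:, 0])[0, 1]
        print(f"first canonical correlation: {r1:.2f}")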

  19. Multicentre evaluation of the new ORTHO VISION® analyser.

    Science.gov (United States)

    Lazarova, E; Scott, Y; van den Bos, A; Wantzin, P; Atugonza, R; Solkar, S; Carpio, N

    2017-10-01

    Implementation of fully automated analysers has become a crucial security step in the blood bank; it reduces human errors, allows standardisation and improves turnaround time (TAT). We aimed at evaluating the ease of use and the efficiency of the ORTHO VISION ® Analyser (VISION) in comparison to the ORTHO AutoVue ® Innova System (AutoVue) in six different laboratories. After initial training and system configuration, VISION was used in parallel to AutoVue following the daily workload, both analysers being based on ORTHO BioVue ® System column agglutination technology. Each participating laboratory provided data and scored the training, system configuration, quality control, maintenance and system efficiency. A total of 1049 individual samples were run: 266 forward and reverse grouping and antibody screens with 10 urgent samples, 473 ABD forward grouping and antibody screens with 22 urgent samples, 160 ABD forward grouping, 42 antibody screens and a series of 108 specific case profiles. The VISION instrument was more rapid than the AutoVue with a mean performing test time of 27·9 min compared to 36 min; for various test type comparisons, the TAT data obtained from VISION were shorter than those from AutoVue. Moreover, VISION analysed urgent STAT samples faster. Regarding the ease of use, VISION was intuitive and user friendly. VISION is a robust, reproducible system performing most types of analytical determinations needed for pre-transfusion testing today, thus accommodating a wide range of clinical needs. VISION brings appreciated new features that could further secure blood transfusions. © 2017 The Authors. Transfusion Medicine published by John Wiley & Sons Ltd on behalf of British Blood Transfusion Society.

  20. Implementing enhanced recovery pathways: a literature review with realist synthesis.

    Science.gov (United States)

    Coxon, Astrid; Nielsen, Karina; Cross, Jane; Fox, Chris

    2017-10-01

    Enhanced Recovery Pathways (ERPs) are an increasingly popular, evidenced-based approach to surgery, designed to improve patient outcomes and reduce costs. Despite evidence demonstrating the benefits of these pathways, implementation and adherence have been inconsistent. Using realist synthesis, this review explored the current literature surrounding the implementation of ERPs in the UK. Knowledge consolidation between authors and consulting with field experts helped to guide the search strategy. Relevant medical and social science databases were searched from 2000 to 2016, as well as a general web search. A total of 17 papers were identified, including original research, reviews, case studies and guideline documents. Full texts were analysed, cross-examined, and data extracted and synthesised. Several implementation strategies were identified, including the contexts in which these operated, the subsequent mechanisms of action that were triggered, and the outcome patterns they produced. Context-Mechanism-Outcome (CMO) configurations were generated, tested, and refined. These were grouped to develop two programme theories concerning ERP implementation, one related to the strategy of consulting with staff, the other with appointing a change agent to coordinate and drive the implementation process. These theories highlight instances in which implementation could be improved. Current literature in ERP research is primarily focussed on measuring patient outcomes and cost effectiveness, and as a result, important detail regarding the implementation process is often not reported or described robustly. This review not only provides recommendations for future improvements in ERP implementation, but also highlights specific areas of focus for furthering ERP implementation research.

  1. Interdisciplinary analysis of the chances of implementation of an energy conservation and air pollution abatement policy. Adverse and favourable conditions for rational energy use in private households and their responsible decision-makers from an economic and social psychological perspective; Interdisziplinaere Analyse der Umsetzungschancen einer Energiespar- und Klimaschutzpolitik. Hemmende und foerdernde Bedingungen der rationellen Energienutzung fuer private Haushalte und ihr Akteursumfeld aus oekonomischer und sozialpsychologischer Perspektive

    Energy Technology Data Exchange (ETDEWEB)

    Hennicke, P.; Jochem, E.; Prose, F.

    1997-08-01

    The investigation attempted an analysis of the causes and remedies of the gap between ecological and climate-relevant requirements on the one hand and public and political action on the other hand. It presents a theoretical concept for an ecologically oriented social restructuring process which will support rational energy use and the appropriate research and development measures more strongly than it does now. While the Federal government intends a 25 % reduction of carbon dioxide emissions by 2005, an 80 % reduction is required on a long-term basis (e.g. by 2050). In view of this, the contribution of the German government is interpreted as a necessary but still insufficient contribution to sustainability. [German] This study contributes to the analysis of the causes of, and possible solutions to, the contradiction between environmentally and climate-relevant needs for action on the one hand and society's deficits in acting on the other. Beyond this, it provides a theoretical-conceptual approach to the ecological restructuring of industrial society, thereby contributing concretely to supporting, better than before, the policy of rational energy use, including research and development policy. The starting point is the energy and climate policy goal of the Federal Government to reduce CO2 emissions by 25% by the year 2005 relative to the 1990 reference year. In the long term, by about the year 2050, a CO2 reduction of 80% is required according to the recommendation of the Climate Enquete Commission. This contribution of the Federal Republic to international climate protection can be interpreted as a necessary, though not yet sufficient, precondition for 'sustainable development' (sustainability) in the Federal Republic. (orig.)

  2. Implementing health promotion tools in Australian Indigenous primary health care.

    Science.gov (United States)

    Percival, Nikki A; McCalman, Janya; Armit, Christine; O'Donoghue, Lynette; Bainbridge, Roxanne; Rowley, Kevin; Doyle, Joyce; Tsey, Komla

    2018-02-01

    In Australia, significant resources have been invested in producing health promotion best practice guidelines, frameworks and tools (herein referred to as health promotion tools) as a strategy to improve Indigenous health promotion programmes. Yet, there has been very little rigorous implementation research about whether or how health promotion tools are implemented. This paper theorizes the complex processes of health promotion tool implementation in Indigenous comprehensive primary healthcare services. Data were derived from published and grey literature about the development and the implementation of four Indigenous health promotion tools. Tools were theoretically sampled to account for the key implementation types described in the literature. Data were analysed using the grounded-theory methods of coding and constant comparison to construct a theoretical implementation model. An Indigenous Health Promotion Tool Implementation Model was developed. Implementation is a social process, whereby researchers, practitioners and community members collectively interacted in creating culturally responsive health promotion toward the common purpose of facilitating empowerment. The implementation of health promotion tools was influenced by the presence of change agents; a commitment to reciprocity; and organizational governance and resourcing. The Indigenous Health Promotion Tool Implementation Model assists in explaining how health promotion tools are implemented and the conditions that influence these actions. Rather than simply developing more health promotion tools, our study suggests that continuous investment in developing conditions that support empowering implementation processes is required to maximize the beneficial impacts and effectiveness of health promotion tools. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  3. Evidence of "Implemented Anticipation" in Mathematising by Beginning Modellers

    Science.gov (United States)

    Stillman, Gloria; Brown, Jill P.

    2014-01-01

    Data from open modelling sessions for year 10 and 11 students at an extracurricular modelling event and from a year 9 class participating in a programme of structured modelling of real situations were analysed for evidence of Niss's theoretical construct, "implemented anticipation," during mathematisation. Evidence was found for all…

  4. Empirical Evidence for Niss' "Implemented Anticipation" in Mathematising Realistic Situations

    Science.gov (United States)

    Stillman, Gloria; Brown, Jill P.

    2012-01-01

    Mathematisation of realistic situations is an on-going focus of research. Classroom data from a Year 9 class participating in a program of structured modelling of real situations was analysed for evidence of Niss's theoretical construct, implemented anticipation, during mathematisation. Evidence was found for two of three proposed aspects. In…

  5. Managing Change: the people side of implementing CRM processes

    OpenAIRE

    Hann, David

    2006-01-01

    This report has been produced with the remit of analysing the people side of change management with regard to a Customer Relationship Management (CRM) process implementation at Jupiter Design (Jupiter). An increasing churn of clients and 12 years of growth have made Jupiter realise that it must maximise revenues from existing clients. The adoption of a CRM approach has been suggested as a possible solution.

  6. Sustainable tourism development on Curacao - the implementation challenge

    NARCIS (Netherlands)

    Dinica, V.; Brebbia, C.A; Pineda, F.D.

    2006-01-01

    In 1997, a comprehensive policy program for sustainable tourism was adopted by the Netherlands Antilles government. This paper is empirically oriented and analyses the implementation of two measures of this policy on one of the five islands, Curaçao, for the period 1998-2005. It investigates the

  7. Implementation and effectiveness of antiretroviral therapy in Greenland

    DEFF Research Database (Denmark)

    Lohse, N.; Ladefoged, K.; Obel, N.

    2008-01-01

    Analyses from the Danish HIV Cohort Study showed that, despite comparable economic means and general education of healthcare personnel, antiretroviral treatment of HIV in Greenland began later and has been implemented at a slower pace with lower therapeutic effectiveness than in Denmark. However...

  8. Comparison of elastic and inelastic analyses

    International Nuclear Information System (INIS)

    Ammerman, D.J.; Heinstein, M.W.; Wellman, G.W.

    1992-01-01

    The use of inelastic analysis methods instead of the traditional elastic analysis methods in the design of radioactive material (RAM) transport packagings leads to a better understanding of the response of the package to mechanical loadings. Thus, a better assessment of the containment, thermal protection, and shielding integrity of the package after a structural accident event can be made. A more accurate prediction of the package response can lead to enhanced safety and also allow for a more efficient use of materials, possibly leading to a package with higher capacity or lower weight. This paper discusses the advantages and disadvantages of using inelastic analysis in the design of RAM shipping packages. The use of inelastic analysis presents several problems to the package designer. When using inelastic analysis the entire nonlinear response of the material must be known, including the effects of temperature changes and strain rate. Another problem is that there is currently no acceptance criterion for this type of analysis that is approved by regulatory agencies. Inelastic analysis acceptance criteria based on failure stress, failure strain, or plastic energy density could be developed. For both elastic and inelastic analyses it is also important to include other sources of stress in the analyses, such as fabrication stresses, thermal stresses, stresses from bolt preloading, and contact stresses at material interfaces. Offsetting these added difficulties is the improved knowledge of the package behavior. This allows for incorporation of a more uniform margin of safety, which can result in weight savings and a higher level of confidence in the post-accident configuration of the package. In this paper, comparisons between elastic and inelastic analyses are made for a simple ring structure and for a package to transport a large quantity of RAM by rail (rail cask) with lead gamma shielding to illustrate the differences in the two analysis techniques
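
    As an illustration of one possible inelastic acceptance quantity mentioned above, the sketch below integrates a hypothetical uniaxial stress-strain history to obtain the plastic energy density absorbed; it is not a regulatory criterion, only an example of how such a measure could be evaluated.

        # Plastic (absorbed) energy density from a hypothetical uniaxial stress-strain history:
        # w = integral of sigma d(epsilon), evaluated with the trapezoidal rule.
        import numpy as np

        strain = np.array([0.000, 0.002, 0.010, 0.030, 0.060, 0.100])   # total strain (-)
        stress = np.array([0.0,   400.0, 420.0, 450.0, 470.0, 480.0])   # stress (MPa)

        energy_density = np.trapz(stress, strain)        # MPa = MJ/m^3
        elastic_part = stress[-1] ** 2 / (2 * 200e3)     # recoverable part, E ~ 200 GPa assumed
        plastic_energy_density = energy_density - elastic_part

        print(f"total absorbed energy density  : {energy_density:.1f} MJ/m^3")
        print(f"plastic energy density (approx): {plastic_energy_density:.1f} MJ/m^3")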

  9. IDEA: Interactive Display for Evolutionary Analyses.

    Science.gov (United States)

    Egan, Amy; Mahurkar, Anup; Crabtree, Jonathan; Badger, Jonathan H; Carlton, Jane M; Silva, Joana C

    2008-12-08

    The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.
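
    Behind the graphical interface, a codeml run is driven by a plain-text control file. The sketch below writes a minimal control file and invokes codeml through Python; it assumes PAML is installed and on the PATH, the option values shown are common defaults rather than IDEA's own settings, and the input file names are placeholders.

        # Write a minimal codeml control file and run it (assumes PAML's codeml is on PATH;
        # file names and option values are illustrative placeholders, not IDEA's settings).
        import subprocess
        from pathlib import Path

        ctl = """
              seqfile = alignment.phy   * codon alignment
             treefile = species.tre     * Newick tree
              outfile = results.txt

              seqtype = 1               * 1 = codon sequences
            CodonFreq = 2               * F3x4 codon frequencies
                model = 0               * one omega ratio for all branches
              NSsites = 0 1 2           * site models M0, M1a, M2a
            fix_omega = 0
                omega = 0.4             * initial dN/dS
        """
        Path("codeml.ctl").write_text(ctl)
        subprocess.run(["codeml", "codeml.ctl"], check=True)
        print(Path("results.txt").read_text()[:500])   # inspect the start of the output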

  10. IDEA: Interactive Display for Evolutionary Analyses

    Directory of Open Access Journals (Sweden)

    Carlton Jane M

    2008-12-01

    Full Text Available Abstract Background The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. Results We have developed IDEA (Interactive Display for Evolutionary Analyses, an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. Conclusion IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  11. Bayesian uncertainty analyses of probabilistic risk models

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1989-01-01

    Applications of Bayesian principles to the uncertainty analyses are discussed in the paper. A short review of the most important uncertainties and their causes is provided. An application of the principle of maximum entropy to the determination of Bayesian prior distributions is described. An approach based on so-called probabilistic structures is presented in order to develop a method of quantitative evaluation of modelling uncertainties. The method is applied to a small example case. Ideas for application areas for the proposed method are discussed
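
    As a standard textbook illustration of the maximum-entropy step mentioned above (not necessarily the construction used in the paper), the prior on x >= 0 that maximises differential entropy subject only to a known mean mu is the exponential density:

        % maximise entropy subject to normalisation and a fixed mean \mu (textbook result)
        \max_{p}\; -\int_{0}^{\infty} p(x)\,\ln p(x)\,dx
        \quad\text{subject to}\quad
        \int_{0}^{\infty} p(x)\,dx = 1,\qquad
        \int_{0}^{\infty} x\,p(x)\,dx = \mu,
        \qquad\Longrightarrow\qquad
        p(x) = \frac{1}{\mu}\,e^{-x/\mu}.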

  12. Safety analyses for high-temperature reactors

    International Nuclear Information System (INIS)

    Mueller, A.

    1978-01-01

    The safety evaluation of HTRs may be based on the three methods presented here: The licensing procedure, the probabilistic risk analysis, and the damage extent analysis. Thereby all safety aspects - from normal operation to the extreme (hypothetical) accidents - of the HTR are covered. The analyses within the licensing procedure of the HTR-1160 have shown that for normal operation and for the design basis accidents the radiation exposures remain clearly below the maximum permissible levels as prescribed by the radiation protection ordinance, so that no real hazard for the population will arise from them. (orig./RW) [de

  13. Introduction: Analysing Emotion and Theorising Affect

    Directory of Open Access Journals (Sweden)

    Peta Tait

    2016-08-01

    Full Text Available This discussion introduces ideas of emotion and affect for a volume of articles demonstrating the scope of approaches used in their study within the humanities and creative arts. The volume offers multiple perspectives on emotion and affect within 20th-century and 21st-century texts, arts and organisations and their histories. The discussion explains how emotion encompasses the emotions, emotional feeling, sensation and mood and how these can be analysed particularly in relation to literature, art and performance. It briefly summarises concepts of affect theory within recent approaches before introducing the articles.

  14. Applications of neural network to numerical analyses

    International Nuclear Information System (INIS)

    Takeda, Tatsuoki; Fukuhara, Makoto; Ma, Xiao-Feng; Liaqat, Ali

    1999-01-01

    Applications of a multi-layer neural network to numerical analyses are described. We are mainly concerned with computed tomography and the solution of differential equations. In both cases, as the objective functions for the training process of the neural network, we employed residuals of the integral equation or the differential equations. This is different from conventional neural network training, where the sum of the squared errors of the output values is adopted as the objective function. For model problems both methods gave satisfactory results, and the methods are considered promising for some kinds of problems. (author)
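
    A minimal sketch of the residual-based training idea, for the test equation u'(x) = -u(x) with u(0) = 1 on [0, 1], is given below. It assumes the PyTorch library and a trial solution u(x) = 1 + x*N(x) so that the boundary condition holds by construction; it illustrates the objective-function choice only, not the authors' implementation.

        # Train a small network so that the ODE residual u'(x) + u(x) = 0 is minimised,
        # with trial solution u(x) = 1 + x * N(x) enforcing u(0) = 1 (assumes PyTorch).
        import torch

        torch.manual_seed(0)
        net = torch.nn.Sequential(torch.nn.Linear(1, 16), torch.nn.Tanh(), torch.nn.Linear(16, 1))
        opt = torch.optim.Adam(net.parameters(), lr=1e-2)

        x = torch.linspace(0, 1, 64).reshape(-1, 1).requires_grad_(True)

        for step in range(2000):
            u = 1.0 + x * net(x)                       # trial solution, satisfies u(0) = 1
            du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
            loss = ((du + u) ** 2).mean()              # squared residual of u' + u = 0
            opt.zero_grad()
            loss.backward()
            opt.step()

        exact = torch.exp(-x)
        err = (1.0 + x * net(x) - exact).abs().max().item()
        print(f"max |u - exp(-x)| = {err:.1e}")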

  15. Komparativ analyse - Scandinavian Airlines & Norwegian Air Shuttle

    OpenAIRE

    Kallesen, Martin Nystrup; Singh, Ravi Pal; Boesen, Nana Wiaberg

    2017-01-01

    The project is based on a consideration of how a company the size of Scandinavian Airlines or Norwegian Air Shuttle uses its finances and how it sees its external environment. This has led us to research the relationship between the companies and their finances as well as their external environment, and how they differ in both. To do this we have utilised a myriad of different methods to analyse the companies, including PESTEL, SWOT, TOWS, DCF, risk analysis, sensitivity analysis and Porter's ...

  16. The phaco machine: analysing new technology.

    Science.gov (United States)

    Fishkind, William J

    2013-01-01

    The phaco machine is frequently overlooked as the crucial surgical instrument it is. Understanding how to set parameters begins with understanding fundamental concepts of machine function. This study analyses the critical concepts of partial occlusion phaco, occlusion phaco and pump technology. In addition, phaco energy categories as well as variations of phaco energy production are explored. Contemporary power modulations and pump controls allow for the enhancement of partial occlusion phacoemulsification. These significant changes in anterior chamber dynamics produce a balanced environment for phaco, fewer complications, and improved patient outcomes.

  17. Nuclear analyses of the Pietroasa gold hoard

    International Nuclear Information System (INIS)

    Cojocaru, V.; Besliu, C.

    1999-01-01

    By means of nuclear analyses the concentrations of Au, Ag, Cu, Ir, Os, Pt, Co and Hg were measured in the 12 artifacts of the gold hoard discovered in 1837 at Pietroasa, Buzau county, in Romania. The concentrations of the first four elements were used to compare different stylistic groups assumed by historians. Comparisons with gold nuggets from the old Dacian territory and gold Roman imperial coins were also made. A good agreement was found with the oldest hypothesis, which considers that the hoard is represented by three styles appropriated mainly by the Goths. (author)

  18. An evaluation of the Olympus "Quickrate" analyser.

    Science.gov (United States)

    Williams, D G; Wood, R J; Worth, H G

    1979-02-01

    The Olympus "Quickrate", a photometer built for both kinetic and end point analysis was evaluated in this laboratory. Aspartate transaminase, lactate dehydrogenase, hydroxybutyrate dehydrogenase, creatine kinase, alkaline phosphatase and gamma glutamyl transpeptidase were measured in the kinetic mode and glucose, urea, total protein, albumin, bilirubin, calcium and iron in the end point mode. Overall, good correlation was observed with routine methodologies and the precision of the methods was acceptable. An electrical evaluation was also performed. In our hands, the instrument proved to be simple to use and gave no trouble. It should prove useful for paediatric and emergency work, and as a back up for other analysers.

  19. User Participation in Pilot Implementation

    DEFF Research Database (Denmark)

    Torkilsheyggi, Arnvør Martinsdóttir á; Hertzum, Morten

    2014-01-01

    Pilot implementations provide users with real-work experiences of how a system will affect their daily work before the design of the system is finalized. On the basis of a pilot implementation of a system for coordinating the transport of patients by hospital porters, we investigate pilot implementation as a method for participatory design. We find that to foster participation and learning about user needs a pilot implementation must create a space for reflecting on use, in addition to the space for using the pilot system. The space for reflection must also exist during the activities preparing the use of the pilot system, because the porters and nurses learned about their needs throughout the pilot implementation, not just during use. Finally, we discuss how the scope and duration of a pilot implementation influence the conditions for participation.

  20. Implementing SO2 Emission Trading in China

    International Nuclear Information System (INIS)

    Schreifels, J.; Yang, J.

    2003-01-01

    Over the past 10 years, the Chinese State Environmental Protection Administration (SEPA) has actively investigated the potential to use emission trading to reduce sulphur dioxide (SO2) emissions from electricity generators and industrial sources. In 1999, SEPA partnered with the U.S. Environmental Protection Agency (U.S. EPA) to cooperate on a study to assess the feasibility of implementing SO2 emission trading in China. SEPA has also pursued emission trading pilot projects in several cities and provinces. The authors, using information from the feasibility study and pilot projects, introduce the circumstances necessary for SO2 emission trading in China, outline the experience to date, and analyse implementation opportunities and barriers in China. The contents of the paper are: (1) SO2 emission control policies in China; (2) institutional requirements and the basis for introducing SO2 emission trading in China; (3) case studies of emission trading in China; (4) opportunities and barriers to implementing emission trading in China; (5) recommendations to transition from pilot projects to a nationwide SO2 emission trading program; and (6) conclusions and suggestions

  1. Clarkesville Green Infrastructure Implementation Strategy

    Science.gov (United States)

    The report outlines the 2012 technical assistance for Clarkesville, GA to develop a Green Infrastructure Implementation Strategy, which provides the basic building blocks for a green infrastructure plan:

  2. Improving the implementation of marine monitoring in the northeast Atlantic.

    Science.gov (United States)

    Turrell, W R

    2018-03-01

    Marine monitoring in the northeast Atlantic is delivered within identifiable monitoring themes, established through time and defined by the geographical area and policy drivers they serve, the sampling methodologies they use, their assessment methodologies, their funding and governance structures and the people or organisations involved in their implementation. Within a monitoring theme, essential components for effective monitoring are governance, strategy and work plan, sampling protocols, quality assurance, and data and assessment structures. This simple framework is used to analyse two monitoring theme case studies; national ecosystem health monitoring, and regional fish stock monitoring. Such essential component analyses, within marine monitoring themes, can help improve monitoring implementation by identifying gaps and overlaps. Once monitoring themes are recognised, explicitly defined and streamlined, travel towards integrated monitoring may be made easier as the current lack of clarity in thematic marine monitoring implementation is one barrier to integration at both national and regional scales. Copyright © 2018 The Author. Published by Elsevier Ltd. All rights reserved.

  3. Performance of neutron kinetics models for ADS transient analyses

    International Nuclear Information System (INIS)

    Rineiski, A.; Maschek, W.; Rimpault, G.

    2002-01-01

    Within the framework of the SIMMER code development, neutron kinetics models for simulating transients and hypothetical accidents in advanced reactor systems, in particular in Accelerator Driven Systems (ADSs), have been developed at FZK/IKET in cooperation with CE Cadarache. SIMMER is a fluid-dynamics/thermal-hydraulics code, coupled with a structure model and a space-, time- and energy-dependent neutronics module for analyzing transients and accidents. The advanced kinetics models have also been implemented into KIN3D, a module of the VARIANT/TGV code (stand-alone neutron kinetics) for broadening application and for testing and benchmarking. In the paper, a short review of the SIMMER and KIN3D neutron kinetics models is given. Some typical transients related to ADS perturbations are analyzed. The general models of SIMMER and KIN3D are compared with more simple techniques developed in the context of this work to get a better understanding of the specifics of transients in subcritical systems and to estimate the performance of different kinetics options. These comparisons may also help in elaborating new kinetics models and extending existing computation tools for ADS transient analyses. The traditional point-kinetics model may give rather inaccurate transient reaction rate distributions in an ADS even if the material configuration does not change significantly. This inaccuracy is not related to the problem of choosing a 'right' weighting function: the point-kinetics model with any weighting function cannot take into account pronounced flux shape variations related to possible significant changes in the criticality level or to fast beam trips. To improve the accuracy of the point-kinetics option for slow transients, we have introduced a correction factor technique. The related analyses give a better understanding of 'long-timescale' kinetics phenomena in the subcritical domain and help to evaluate the performance of the quasi-static scheme in a particular case. One
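
    For orientation, the standard point-kinetics equations with an external source term, which are the kind of model whose limitations for source-driven subcritical cores are discussed above, read (textbook form, not necessarily the exact formulation used in SIMMER or KIN3D):

```latex
\frac{dn(t)}{dt} = \frac{\rho(t)-\beta}{\Lambda}\,n(t) + \sum_{i=1}^{6}\lambda_i C_i(t) + S(t),
\qquad
\frac{dC_i(t)}{dt} = \frac{\beta_i}{\Lambda}\,n(t) - \lambda_i C_i(t),
```

    where n is the neutron amplitude, ρ the reactivity, β_i and λ_i the delayed-neutron fractions and decay constants, Λ the generation time and S the external source. Because the flux shape is assumed fixed, local reaction rates can be misrepresented when the shape actually changes (e.g. after a beam trip or a change in criticality level), which is the motivation for the correction-factor and quasi-static techniques mentioned above.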

  4. Analyses of containment structures with corrosion damage

    International Nuclear Information System (INIS)

    Cherry, J.L.

    1997-01-01

    Corrosion damage that has been found in a number of nuclear power plant containment structures can degrade the pressure capacity of the vessel. This has prompted concerns regarding the capacity of corroded containments to withstand accident loadings. To address these concerns, finite element analyses have been performed for a typical PWR Ice Condenser containment structure. Using ABAQUS, the pressure capacity was calculated for a typical vessel with no corrosion damage. Multiple analyses were then performed with the location of the corrosion and the amount of corrosion varied in each analysis. Using a strain-based failure criterion, a "lower bound", "best estimate", and "upper bound" failure level was predicted for each case. These limits were established by: determining the amount of variability that exists in material properties of typical containments, estimating the amount of uncertainty associated with the level of modeling detail and modeling assumptions, and estimating the effect of corrosion on the material properties.

  5. Passive safety injection experiments and analyses (PAHKO)

    International Nuclear Information System (INIS)

    Tuunanen, J.

    1998-01-01

    PAHKO project involved experiments on the PACTEL facility and computer simulations of selected experiments. The experiments focused on the performance of Passive Safety Injection Systems (PSIS) of Advanced Light Water Reactors (ALWRs) in Small Break Loss-Of-Coolant Accident (SBLOCA) conditions. The PSIS consisted of a Core Make-up Tank (CMT) and two pipelines (Pressure Balancing Line, PBL, and Injection Line, IL). The examined PSIS worked efficiently in SBLOCAs although the flow through the PSIS stopped temporarily if the break was very small and the hot water filled the CMT. The experiments demonstrated the importance of the flow distributor in the CMT to limit rapid condensation. The project included validation of three thermal-hydraulic computer codes (APROS, CATHARE and RELAP5). The analyses showed the codes are capable to simulate the overall behaviour of the transients. The detailed analyses of the results showed some models in the codes still need improvements. Especially, further development of models for thermal stratification, condensation and natural circulation flow with small driving forces would be necessary for accurate simulation of the PSIS phenomena. (orig.)

  6. Used Fuel Management System Interface Analyses - 13578

    Energy Technology Data Exchange (ETDEWEB)

    Howard, Robert; Busch, Ingrid [Oak Ridge National Laboratory, P.O. Box 2008, Bldg. 5700, MS-6170, Oak Ridge, TN 37831 (United States); Nutt, Mark; Morris, Edgar; Puig, Francesc [Argonne National Laboratory (United States); Carter, Joe; Delley, Alexcia; Rodwell, Phillip [Savannah River National Laboratory (United States); Hardin, Ernest; Kalinina, Elena [Sandia National Laboratories (United States); Clark, Robert [U.S. Department of Energy (United States); Cotton, Thomas [Complex Systems Group (United States)

    2013-07-01

    Preliminary system-level analyses of the interfaces between at-reactor used fuel management, consolidated storage facilities, and disposal facilities, along with the development of supporting logistics simulation tools, have been initiated to provide the U.S. Department of Energy (DOE) and other stakeholders with information regarding the various alternatives for managing used nuclear fuel (UNF) generated by the current fleet of light water reactors operating in the United States. An important UNF management system interface consideration is the need for ultimate disposal of UNF assemblies contained in waste packages that are sized to be compatible with different geologic media. Thermal analyses indicate that waste package sizes for the geologic media under consideration by the Used Fuel Disposition Campaign may be significantly smaller than the canisters being used for on-site dry storage by the nuclear utilities. Therefore, at some point along the UNF disposition pathway, there could be a need to repackage fuel assemblies already loaded and being loaded into the dry storage canisters currently in use. The implications of where and when the packaging or repackaging of commercial UNF will occur are key questions being addressed in this evaluation. The analysis demonstrated that thermal considerations will have a major impact on the operation of the system and that acceptance priority, rates, and facility start dates have significant system implications. (authors)

  7. Sensitivity in risk analyses with uncertain numbers.

    Energy Technology Data Exchange (ETDEWEB)

    Tucker, W. Troy; Ferson, Scott

    2006-06-01

    Sensitivity analysis is a study of how changes in the inputs to a model influence the results of the model. Many techniques have recently been proposed for use when the model is probabilistic. This report considers the related problem of sensitivity analysis when the model includes uncertain numbers that can involve both aleatory and epistemic uncertainty and the method of calculation is Dempster-Shafer evidence theory or probability bounds analysis. Some traditional methods for sensitivity analysis generalize directly for use with uncertain numbers, but, in some respects, sensitivity analysis for these analyses differs from traditional deterministic or probabilistic sensitivity analyses. A case study of a dike reliability assessment illustrates several methods of sensitivity analysis, including traditional probabilistic assessment, local derivatives, and a "pinching" strategy that hypothetically reduces the epistemic uncertainty or aleatory uncertainty, or both, in an input variable to estimate the reduction of uncertainty in the outputs. The prospects for applying the methods to black box models are also considered.
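
    As a toy illustration of the "pinching" idea described above (not the dike assessment itself): an epistemic input known only within an interval is hypothetically fixed to a point value, and the resulting reduction in the spread of the output is measured. All numbers below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: a load-like response with an aleatory input (sampled from a
# distribution) and an epistemic input (known only within an interval).
def response(water_level, soil_strength):
    return water_level / soil_strength

water = rng.normal(loc=5.0, scale=0.5, size=10_000)   # aleatory uncertainty
strength_lo, strength_hi = 8.0, 12.0                   # epistemic interval

def output_spread(lo, hi):
    """Envelope of the response: high percentile at the weakest value minus
    low percentile at the strongest value of the epistemic input."""
    upper = np.percentile(response(water, lo), 97.5)
    lower = np.percentile(response(water, hi), 2.5)
    return upper - lower

base_spread = output_spread(strength_lo, strength_hi)
pinched_spread = output_spread(10.0, 10.0)             # pinch to a point value

print(f"spread before pinching: {base_spread:.3f}")
print(f"spread after pinching:  {pinched_spread:.3f}")
print(f"uncertainty reduction:  {1 - pinched_spread / base_spread:.0%}")
```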

  8. Fractal and multifractal analyses of bipartite networks

    Science.gov (United States)

    Liu, Jin-Long; Wang, Jian; Yu, Zu-Guo; Xie, Xian-Hua

    2017-03-01

    Bipartite networks have attracted considerable interest in various fields. Fractality and multifractality of unipartite (classical) networks have been studied in recent years, but there has been no work studying these properties of bipartite networks. In this paper, we try to unfold the self-similarity structure of bipartite networks by performing fractal and multifractal analyses for a variety of real-world bipartite network data sets and models. First, we find fractality in some bipartite networks, including the CiteULike, Netflix, MovieLens (ml-20m) and Delicious data sets and the (u, v)-flower model. Meanwhile, we observe shifted power-law or exponential behavior in several other networks. We then focus on the multifractal properties of bipartite networks. Our results indicate that multifractality exists in those bipartite networks possessing fractality. To capture the inherent attributes of bipartite networks with two different types of nodes, we assign different weights to the nodes of the different classes, and show the existence of multifractality in these node-weighted bipartite networks. In addition, for the data sets with ratings, we modify the two existing algorithms for fractal and multifractal analyses of edge-weighted unipartite networks to study the self-similarity of the corresponding edge-weighted bipartite networks. The results show that our modified algorithms are feasible and can effectively uncover the self-similarity structure of these edge-weighted bipartite networks and their corresponding node-weighted versions.

  9. Special analyses reveal coke-deposit structure

    International Nuclear Information System (INIS)

    Albright, L.F.

    1988-01-01

    A scanning electron microscope (SEM) and an energy dispersive X-ray analyzer (EDAX) have been used to obtain information that clarifies the three mechanisms of coke formation in ethylene furnaces, and to analyze the metal condition at the exit of the furnace. The results can be used to examine furnace operations and develop improved ethylene plant practices. In this first of four articles on the analyses of coke and metal samples, the coking mechanisms and coke deposits in a section of tube from an actual ethylene furnace (Furnace A) from a plant on the Texas Gulf Coast are discussed. The second article in the series will analyze the condition of the tube metal in the same furnace. To show how coke deposition and metal condition depend on the operating parameters of an ethylene furnace, the third article in the series will show the coke deposition in a Texas Gulf Coast furnace tube (Furnace B) that operated at a shorter residence time. The fourth article discusses the metal condition in that furnace. Some recommendations, based on the analyses and findings, are offered in the fourth article that could help extend the life of ethylene furnace tubes, and also improve overall ethylene plant operations

  10. Overview of cooperative international piping benchmark analyses

    International Nuclear Information System (INIS)

    McAfee, W.J.

    1982-01-01

    This paper presents an overview of an effort initiated in 1976 by the International Working Group on Fast Reactors (IWGFR) of the International Atomic Energy Agency (IAEA) to evaluate detailed and simplified inelastic analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IAEA countries descriptions of tests and test results for piping systems or bends (with emphasis on high temperature inelastic tests), to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analyses results. Of the problem descriptions submitted three were selected to be used: a 90°-elbow at 600 °C with an in-plane transverse force; a 90°-elbow with an in-plane moment; and a 180°-elbow at room temperature with a reversed, cyclic, in-plane transverse force. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this paper. 15 figures

  11. Ethics of cost analyses in medical education.

    Science.gov (United States)

    Walsh, Kieran

    2013-11-01

    Cost analyses in medical education are rarely straightforward, and rarely lead to clear-cut conclusions. Occasionally they do lead to clear conclusions but even when that happens, some stakeholders will ask difficult but valid questions about what to do following cost analyses-specifically about distributive justice in the allocation of resources. At present there are few or no debates about these issues and rationing decisions that are taken in medical education are largely made subconsciously. Distributive justice 'concerns the nature of a socially just allocation of goods in a society'. Inevitably there is a large degree of subjectivity in the judgment as to whether an allocation is seen as socially just or ethical. There are different principles by which we can view distributive justice and which therefore affect the prism of subjectivity through which we see certain problems. For example, we might say that distributive justice at a certain institution or in a certain medical education system operates according to the principle that resources must be divided equally amongst learners. Another system may say that resources should be distributed according to the needs of learners or even of patients. No ethical system or model is inherently right or wrong, they depend on the context in which the educator is working.

  12. CIEMAT analyses of transition fuel cycle scenarios

    International Nuclear Information System (INIS)

    Alvarez-Velarde, F.; Gonzalez-Romero, E.M.

    2010-01-01

    The efficient design of strategies for the long-term sustainability of nuclear energy, or for the phase-out of this technology, is possible after the study of transition scenarios from the current fuel cycle to a future one with advanced technologies and concepts. CIEMAT has participated in numerous fuel cycle scenario studies for more than a decade and, in recent years, special attention has been paid to the study of transition scenarios. In this paper, the main characteristics of each studied transition scenario are described. The main results and partial conclusions of each scenario are also analyzed. As general conclusions of the transition studies, we highlight that the advantages of advanced technologies in transition scenarios can be obtained by countries or regions with sufficiently large nuclear parks, with a long-term implementation of the strategy. For small countries, these advantages are also accessible at an affordable cost, by means of regional collaboration over several decades. (authors)

  13. International comparative analyses of healthcare risk management.

    Science.gov (United States)

    Sun, Niuyun; Wang, Li; Zhou, Jun; Yuan, Qiang; Zhang, Zongjiu; Li, Youping; Liang, Minghui; Cheng, Lan; Gao, Guangming; Cui, Xiaohui

    2011-02-01

    Interpretation of the growing body of global literature on health care risk is compromised by a lack of common understanding and language. This series of articles aims to comprehensively compare laws and regulations, institutional management, and administration of incidence reporting systems on medical risk management in the United Kingdom, the United States, Canada, Australia, and Taiwan, so as to provide evidence and recommendations for health care risk management policy in China. We searched the official websites of the healthcare risk management agencies of the four countries and one district for laws, regulatory documents, research reports, reviews and evaluation forms concerned with healthcare risk management and assessment. Descriptive comparative analysis was performed on relevant documents. A total of 146 documents were included in this study, including 2 laws (1.4%), 17 policy documents (11.6%), 41 guidance documents (28.1%), 37 reviews (25.3%), and 49 documents giving general information (33.6%). The United States government implemented one law and one rule of patient safety management, while the United Kingdom and Australia each issued professional guidance on patient safety improvement. The four countries implemented patient safety management policy on four different levels: national, state/province, hospital, and non-governmental organization. The four countries and one district adopted four levels of patient safety management, and the administration modes can be divided into an "NGO-led mode" represented by the United States and Canada and a "government-led mode" represented by the United Kingdom, Australia, and Taiwan. © 2011 Blackwell Publishing Asia Pty Ltd and Chinese Cochrane Center, West China Hospital of Sichuan University.

  14. The role of Stakeholders on implementing Universal Services in Vietnam

    DEFF Research Database (Denmark)

    Do Manh, Thai; Falch, Morten; Williams, Idongesit

    2015-01-01

    This paper looks at the universal services policy in Vietnam (2005-2010) by analysing stakeholders in order to clarify how they exerted influence and how they implemented the policy. Stakeholder theory is employed to identify and categorize the stakeholders who participated in performing the policy. The authors examine stakeholders such as the national government, international organizations, policy intermediaries, companies, and customers/citizens, applying a qualitative method to gather data and analyse secondary documents. Interviews with some officials were also conducted. The results demonstrate that stakeholders had a huge impact on the success of the universal service policy.

  15. Design and Implementation of Company Tailored Automated Material Handling

    DEFF Research Database (Denmark)

    Langer, Gilad; Bilberg, Arne

    1996-01-01

    This article focuses on the problems of analysing automation of material handling systems in order to develop an efficient automated solution that is specifically tailored to the company. The research has resulted in the development of new methods for evaluating factory automation from design to implementation. The goals of the research were to analyse and evaluate automation in order to obtain an advantageous combination of human and automated resources. The idea is to assess different solutions in a virtual environment, where experiments and analyses can be performed so that the company can justify ... for their application with computer aided information processing tools. The framework is named the "Automated Material Handling (AMH) Preference GuideLine". The research has been carried out in close co-operation with Danish and European industry, where implementations of automation can be referred to. It is our...

  16. Implementation and Rejection of Industrial Steam System Energy Efficiency Measures

    Energy Technology Data Exchange (ETDEWEB)

    Therkelesen, Peter [Environmental Energy Technologies Division Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States); McKane, Aimee [Environmental Energy Technologies Division Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States)

    2013-05-01

    Steam systems consume approximately one third of the energy applied at U.S. industrial facilities. To reduce energy consumption, steam system energy assessments have been conducted on a wide range of industry types over the course of five years through the Energy Savings Assessment (ESA) program administered by the U.S. Department of Energy (U.S. DOE). ESA energy assessments result in energy efficiency measure recommendations that are given potential energy and energy cost savings and potential implementation cost values. Savings and cost metrics that measure the impact recommended measures will have at facilities, described as percentages of facility baseline energy and energy cost, are developed from ESA data and used in analyses. Developed savings and cost metrics are examined along with implementation and rejection rates of recommended steam system energy efficiency measures. Based on analyses, implementation of steam system energy efficiency measures is driven primarily by cost metrics: payback period and measure implementation cost as a percentage of facility baseline energy cost (implementation cost percentage). Stated reasons for rejecting recommended measures are primarily based upon economic concerns. Additionally, implementation rates of measures are not only functions of savings and cost metrics, but time as well.
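
    As a worked illustration of the two cost metrics named above, with invented numbers rather than ESA data:

```python
# Toy numbers only; the ESA program reports these metrics per recommendation.
baseline_energy_cost = 2_000_000.0   # facility annual energy cost, $/yr (assumed)
measure_cost = 150_000.0             # implementation cost of the measure, $ (assumed)
annual_savings = 60_000.0            # energy cost savings from the measure, $/yr (assumed)

payback_years = measure_cost / annual_savings
implementation_cost_pct = 100.0 * measure_cost / baseline_energy_cost

print(f"simple payback: {payback_years:.1f} years")
print(f"implementation cost: {implementation_cost_pct:.1f}% of baseline energy cost")
```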

  17. Pathway analyses implicate glial cells in schizophrenia.

    Directory of Open Access Journals (Sweden)

    Laramie E Duncan

    Full Text Available The quest to understand the neurobiology of schizophrenia and bipolar disorder is ongoing, with multiple lines of evidence indicating abnormalities of glia, mitochondria, and glutamate in both disorders. Despite high heritability estimates of 81% for schizophrenia and 75% for bipolar disorder, compelling links between findings from neurobiological studies and findings from large-scale genetic analyses are only beginning to emerge. Ten publicly available gene sets (pathways) related to glia, mitochondria, and glutamate were tested for association to schizophrenia and bipolar disorder using MAGENTA as the primary analysis method. To determine the robustness of associations, secondary analyses were performed with ALIGATOR, INRICH, and Set Screen. Data from the Psychiatric Genomics Consortium (PGC) were used for all analyses. There were 1,068,286 SNP-level p-values for schizophrenia (9,394 cases/12,462 controls) and 2,088,878 SNP-level p-values for bipolar disorder (7,481 cases/9,250 controls). The Glia-Oligodendrocyte pathway was associated with schizophrenia, after correction for multiple tests, according to the primary analysis (MAGENTA p = 0.0005, 75% requirement for individual gene significance) and also achieved nominal levels of significance with INRICH (p = 0.0057) and ALIGATOR (p = 0.022). For bipolar disorder, Set Screen yielded nominally and method-wide significant associations to all three glial pathways, with the strongest association to the Glia-Astrocyte pathway (p = 0.002). Consistent with findings of white matter abnormalities in schizophrenia by other methods of study, the Glia-Oligodendrocyte pathway was associated with schizophrenia in our genomic study. These findings suggest that the abnormalities of myelination observed in schizophrenia are at least in part due to inherited factors, contrasted with the alternative of purely environmental causes (e.g. medication effects or lifestyle). While not the primary purpose of our study

  18. DEPUTY: analysing architectural structures and checking style

    International Nuclear Information System (INIS)

    Gorshkov, D.; Kochelev, S.; Kotegov, S.; Pavlov, I.; Pravilnikov, V.; Wellisch, J.P.

    2001-01-01

    The DepUty (dependencies utility) can be classified as a project and process management tool. The main goal of DepUty is to assist, by means of source code analysis and graphical representation using UML, in understanding dependencies of sub-systems and packages in CMS Object Oriented software, to understand architectural structure, and to schedule code release in modularised integration. It also allows a newcomer to more easily understand the global structure of CMS software, and to avoid circular dependencies up-front or re-factor the code, in case it was already too close to the edge of non-maintainability. The authors will discuss the various views DepUty provides to analyse package dependencies, and illustrate both the metrics and style checking facilities it provides

  19. Response surface use in safety analyses

    International Nuclear Information System (INIS)

    Prosek, A.

    1999-01-01

    When thousands of complex computer code runs related to nuclear safety are needed for statistical analysis, a response surface is used to replace the computer code. The main purpose of the study was to develop and demonstrate a tool called the optimal statistical estimator (OSE), intended for response surface generation of complex and non-linear phenomena. The performance of the optimal statistical estimator was tested against the results of 59 different RELAP5/MOD3.2 code calculations of the small-break loss-of-coolant accident in a two-loop pressurized water reactor. The results showed that the OSE adequately predicted the response surface for the peak cladding temperature. Some good characteristics of the OSE, such as monotonicity between two neighbouring points and independence of the number of output parameters, suggest that the OSE can be used for response surface generation of any safety or system parameter in thermal-hydraulic safety analyses. (author)
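
    The OSE itself is not described here in enough detail to reproduce, but the general idea of replacing expensive code runs with a fitted response surface can be sketched with an ordinary least-squares quadratic surrogate. Everything below is an assumed toy example: the function standing in for the thermal-hydraulic code, the 59 sample points and the input ranges are all made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for expensive RELAP5-style runs: a made-up smooth function of two
# input parameters (e.g. break size and initial power), plus a little noise.
def expensive_code(x1, x2):
    return 900.0 + 80.0 * x1 - 50.0 * x2 + 30.0 * x1 * x2 + rng.normal(0, 2, x1.shape)

# A modest number of "code calculations" at sampled input points.
x = rng.uniform(0.0, 1.0, size=(59, 2))
y = expensive_code(x[:, 0], x[:, 1])

# Quadratic response surface: features [1, x1, x2, x1^2, x2^2, x1*x2].
def features(x):
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

coef, *_ = np.linalg.lstsq(features(x), y, rcond=None)

# The fitted surface can now be evaluated many thousands of times for
# statistical analysis at negligible cost compared with re-running the code.
x_new = rng.uniform(0.0, 1.0, size=(10_000, 2))
y_surrogate = features(x_new) @ coef
print(y_surrogate.mean(), np.percentile(y_surrogate, 95))
```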

  20. Spatial Analyses of Harappan Urban Settlements

    Directory of Open Access Journals (Sweden)

    Hirofumi Teramura

    2006-12-01

    Full Text Available The Harappan Civilization occupies a unique place among the early civilizations of the world with its well planned urban settlements, advanced handicraft and technology, religious and trade activities. Using a Geographical Information System (GIS), this study presents spatial analyses that locate urban settlements on a digital elevation model (DEM) according to the three phases of early, mature and late. Understanding the relationship between the spatial distribution of Harappan sites and the change in some factors, such as topographic features, river passages or sea level changes, will lead to an understanding of the dynamism of this civilization. It will also afford a glimpse of the factors behind the formation, development, and decline of the Harappan Civilization.

  1. The plant design analyser and its applications

    International Nuclear Information System (INIS)

    Whitmarsh-Everiss, M.J.

    1992-01-01

    Consideration is given to the history of computational methods for the non-linear dynamic analysis of plant behaviour. This is traced from analogue to hybrid computers. When these were phased out, simulation languages were used in batch mode and the interactive computational capabilities were lost. These have subsequently been recovered using mainframe computing architecture in the context of small models using the Prototype Plant Design Analyser. Given the development of parallel processing architectures, the restriction on model size can be lifted. This capability and the use of advanced workstations and graphics software have enabled an advanced interactive design environment to be developed. This system is generic and can be used, with suitable graphics development, to study the dynamics and control behaviour of any plant or system for minimum cost. Examples of past and possible future uses are identified. (author)

  2. Abundance analyses of thirty cool carbon stars

    International Nuclear Information System (INIS)

    Utsumi, Kazuhiko

    1985-01-01

    The results were previously obtained by use of the absolute gf-values and the cosmic abundance as a standard. These gf-values were found to contain large systematic errors, and as a result, the solar photospheric abundances were revised. Our previous results, therefore, must be revised by using new gf-values, and abundance analyses are extended for as many carbon stars as possible. In conclusion, in normal cool carbon stars heavy metals are overabundant by factors of 10 - 100 and rare-earth elements are overabundant by a factor of about 10; in J-type cool carbon stars, the 12C/13C ratio is smaller, the C2 and CN bands and the Li 6708 line are stronger than in normal cool carbon stars, and the abundances of s-process elements with respect to Fe are nearly normal. (Mori, K.)

  3. Analysing Medieval Urban Space; a methodology

    Directory of Open Access Journals (Sweden)

    Marlous L. Craane MA

    2007-08-01

    Full Text Available This article has been written in reaction to recent developments in medieval history and archaeology, to study not only the buildings in a town but also the spaces that hold them together. It discusses a more objective and interdisciplinary approach for analysing urban morphology and use of space. It proposes a 'new' methodology by combining town plan analysis and space syntax. This methodology was trialled on the city of Utrecht in the Netherlands. By comparing the results of this 'new' methodology with the results of previous, more conventional, research, this article shows that space syntax can be applied successfully to medieval urban contexts. It does this by demonstrating a strong correlation between medieval economic spaces and the most integrated spaces, just as is found in the study of modern urban environments. It thus provides a strong basis for the use of this technique in future research of medieval urban environments.

  4. Reliability and safety analyses under fuzziness

    International Nuclear Information System (INIS)

    Onisawa, T.; Kacprzyk, J.

    1995-01-01

    Fuzzy theory, for example possibility theory, is compatible with probability theory. What has been shown so far is that probability theory need not be replaced by fuzzy theory, but rather that the former works much better in applications if it is combined with the latter. In fact, it is said that there are two essential uncertainties in the field of reliability and safety analyses: one is probabilistic uncertainty, which is more relevant for mechanical systems and the natural environment; the other is fuzziness (imprecision) caused by the presence of human beings in systems. Classical probability theory alone is therefore not sufficient to deal with uncertainties in humanistic systems. In such a context this collection of works marks a milestone in the debate between probability theory and fuzzy theory. This volume covers fault analysis, lifetime analysis, reliability, quality control, safety analysis and risk analysis. (orig./DG). 106 figs

  5. Precise Chemical Analyses of Planetary Surfaces

    Science.gov (United States)

    Kring, David; Schweitzer, Jeffrey; Meyer, Charles; Trombka, Jacob; Freund, Friedemann; Economou, Thanasis; Yen, Albert; Kim, Soon Sam; Treiman, Allan H.; Blake, David

    1996-01-01

    We identify the chemical elements and element ratios that should be analyzed to address many of the issues identified by the Committee on Planetary and Lunar Exploration (COMPLEX). We determined that most of these issues require two sensitive instruments to analyze the necessary complement of elements. In addition, it is useful in many cases to use one instrument to analyze the outermost planetary surface (e.g. to determine weathering effects), while a second is used to analyze a subsurface volume of material (e.g., to determine the composition of unaltered planetary surface material). This dual approach to chemical analyses will also facilitate the calibration of orbital and/or Earth-based spectral observations of the planetary body. We determined that in many cases the scientific issues defined by COMPLEX can only be fully addressed with combined packages of instruments that would supplement the chemical data with mineralogic or visual information.

  6. Seismic analyses of structures. 1st draft

    International Nuclear Information System (INIS)

    David, M.

    1995-01-01

    The dynamic analysis presented in this paper refers to the seismic analysis of the main building of Paks NPP. The aim of the analysis was to determine the floor response spectra as a response to seismic input. This analysis was performed with a 3-dimensional calculation model, and the floor response spectra were determined for a number of levels from the floor response time histories; no other adjustments were applied. The following results of the seismic analysis are presented: 3-dimensional finite element model; basic assumptions of dynamic analyses; table of frequencies and included factors; modal masses for all modes; floor response spectra in all the selected nodes with figures of indicated nodes and important nodes of free vibration

  7. Analysing Terrorism from a Systems Thinking Perspective

    Directory of Open Access Journals (Sweden)

    Lukas Schoenenberger

    2014-02-01

    Full Text Available Given the complexity of terrorism, solutions based on single factors are destined to fail. Systems thinking offers various tools for helping researchers and policy makers comprehend terrorism in its entirety. We have developed a semi-quantitative systems thinking approach for characterising relationships between variables critical to terrorism and their impact on the system as a whole. For a better understanding of the mechanisms underlying terrorism, we present a 16-variable model characterising the critical components of terrorism and perform a series of highly focused analyses. We show how to determine which variables are best suited for government intervention, describing in detail their effects on the key variable—the political influence of a terrorist network. We also offer insights into how to elicit variables that destabilise and ultimately break down these networks. Because we clarify our novel approach with fictional data, the primary importance of this paper lies in the new framework for reasoning that it provides.

  8. Seismic analyses of structures. 1st draft

    Energy Technology Data Exchange (ETDEWEB)

    David, M [David Consulting, Engineering and Design Office (Czech Republic)

    1995-07-01

    The dynamic analysis presented in this paper refers to the seismic analysis of the main building of Paks NPP. The aim of the analysis was to determine the floor response spectra as a response to seismic input. This analysis was performed with a 3-dimensional calculation model, and the floor response spectra were determined for a number of levels from the floor response time histories; no other adjustments were applied. The following results of the seismic analysis are presented: 3-dimensional finite element model; basic assumptions of dynamic analyses; table of frequencies and included factors; modal masses for all modes; floor response spectra in all the selected nodes with figures of indicated nodes and important nodes of free vibration.

  9. Project analysis and integration economic analyses summary

    Science.gov (United States)

    Macomber, H. L.

    1986-01-01

    An economic-analysis summary was presented for the manufacture of crystalline-silicon modules involving silicon ingot/sheet growth, slicing, cell manufacture, and module assembly. Economic analyses provided: useful quantitative input for complex decision-making in the Flat-plate Solar Array (FSA) Project; yardsticks for design and performance for industry; and a demonstration of how to evaluate and understand the worth of research and development both to JPL and to other government agencies and programs. It was concluded that future research and development funds for photovoltaics must be provided by the Federal Government because the solar industry today does not reap enough profit from its present-day sales of photovoltaic equipment.

  10. Level 2 probabilistic event analyses and quantification

    International Nuclear Information System (INIS)

    Boneham, P.

    2003-01-01

    In this paper an example of quantification of a severe accident phenomenological event is given. The analysis performed to assess the probability that the debris released from the reactor vessel was in a coolable configuration in the lower drywell is presented. The assessment of the type of core/concrete attack that would occur is also analysed. The evaluation of ex-vessel debris coolability by an event in the Simplified Boiling Water Reactor (SBWR) Containment Event Tree (CET), and a detailed Decomposition Event Tree (DET) developed to aid in the quantification of this CET event, are considered. The headings in the DET, selected to represent plant physical states (e.g., reactor vessel pressure at the time of vessel failure) and the uncertainties associated with the occurrence of critical physical phenomena (e.g., debris configuration in the lower drywell) considered important to assessing whether the debris was coolable or not coolable ex-vessel, are also discussed

  11. Externalizing Behaviour for Analysing System Models

    DEFF Research Database (Denmark)

    Ivanova, Marieta Georgieva; Probst, Christian W.; Hansen, René Rydhof

    2013-01-01

    System models have recently been introduced to model organisations and evaluate their vulnerability to threats and especially insider threats. Especially for the latter these models are very suitable, since insiders can be assumed to have more knowledge about the attacked organisation than outside attackers. Therefore, many attacks are considerably easier to be performed for insiders than for outsiders. However, current models do not support explicit specification of different behaviours. Instead, behaviour is deeply embedded in the analyses supported by the models, meaning that it is a complex, if not impossible, task to change behaviours. Especially when considering social engineering or the human factor in general, the ability to use different kinds of behaviours is essential. In this work we present an approach to make the behaviour a separate component in system models, and explore how to integrate...

  12. ATLAS helicity analyses in beauty hadron decays

    CERN Document Server

    Smizanska, M

    2000-01-01

    The ATLAS detector will allow a precise spatial reconstruction of the kinematics of B hadron decays. In combination with the efficient lepton identification applied already at trigger level, ATLAS is expected to provide large samples of exclusive decay channels cleanly separable from background. These data sets will allow spin-dependent analyses leading to the determination of production and decay parameters, which are not accessible if the helicity amplitudes are not separated. Measurement feasibility studies for the decays B_s^0 → J/ψ φ and Λ_b^0 → Λ J/ψ, presented in this document, show the experimental precisions that can be achieved in the determination of B_s^0 and Λ_b^0 characteristics. (19 refs).

  13. Thermal hydraulic reactor safety analyses and experiments

    International Nuclear Information System (INIS)

    Holmstroem, H.; Eerikaeinen, L.; Kervinen, T.; Kilpi, K.; Mattila, L.; Miettinen, J.; Yrjoelae, V.

    1989-04-01

    The report introduces the results of the thermal hydraulic reactor safety research performed in the Nuclear Engineering Laboratory of the Technical Research Centre of Finland (VTT) during the years 1972-1987. Practical applications, i.e. analyses for the safety authorities and power companies, are also presented. The emphasis is on describing the state-of-the-art know-how. The report describes VTT's most important computer codes, both those of foreign origin and those developed at VTT, and their assessment work, VTT's own experimental research, as well as international experimental projects and other forms of cooperation VTT has participated in. Appendix 8 contains a comprehensive list of the most important publications and technical reports produced. They present the content and results of the research in detail. (orig.)

  14. Digital analyses of cartometric Fruska Gora guidelines

    Directory of Open Access Journals (Sweden)

    Živković Dragica

    2013-01-01

    Full Text Available Modern geomorphological and topographic research has been using quantitative statistical and cartographic methods to analyse relief features and their mutual relationships on the basis of good-quality numeric parameters. Topographic features are important for a range of natural processes and human activities. Important morphological characteristics include the slope angle, hypsometry, terrain exposition and so on. Small and often unnoticed relief slopes can strongly affect land configuration, hypsometry and topographic exposition. Exposition modifies light and heat and the phenomena connected with them: soil and air temperature, soil disintegration, the length of the vegetation period, the complexity of photosynthesis, the productivity of agricultural crops, the height of the snow line, etc. [Project of the Ministry of Science of the Republic of Serbia, no. 176008 and no. III44006]

  15. Attitude stability analyses for small artificial satellites

    International Nuclear Information System (INIS)

    Silva, W R; Zanardi, M C; Formiga, J K S; Cabette, R E S; Stuchi, T J

    2013-01-01

    The objective of this paper is to analyze the stability of the rotational motion of a symmetrical spacecraft in a circular orbit. The equilibrium points and regions of stability are established when components of the gravity gradient torque acting on the spacecraft are included in the equations of rotational motion, which are described in terms of the Andoyer variables. The nonlinear stability of the equilibrium points of the rotational motion is analysed here with the Kovalev-Savchenko theorem, which makes it possible to verify whether they remain stable under the influence of the higher-order terms of the normal Hamiltonian. In this paper, numerical simulations are made for a small hypothetical artificial satellite. Several stable equilibrium points were determined and regions around these points have been established by varying the orbital inclination and the spacecraft's principal moments of inertia. The present analysis can contribute directly to the maintenance of the spacecraft's attitude

  16. Cointegration Approach to Analysing Inflation in Croatia

    Directory of Open Access Journals (Sweden)

    Lena Malešević-Perović

    2009-06-01

    Full Text Available The aim of this paper is to analyse the determinants of inflation in Croatia in the period 1994:6-2006:6. We use a cointegration approach and find that increases in wages positively influence inflation in the long-run. Furthermore, in the period from June 1994 onward, the depreciation of the currency also contributed to inflation. Money does not explain Croatian inflation. This irrelevance of the money supply is consistent with its endogeneity to exchange rate targeting, whereby the money supply is determined by developments in the foreign exchange market. The value of inflation in the previous period is also found to be significant, thus indicating some inflation inertia.
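
    A minimal sketch of one common cointegration check, the Engle-Granger test as implemented in statsmodels, applied to synthetic series rather than the Croatian data (the paper's own analysis covers 1994:6-2006:6 and several variables):

```python
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(42)

# Synthetic example: a common stochastic trend shared by "wages" and "prices",
# so the two series are cointegrated by construction.
n = 240                                    # 20 years of monthly data (assumed)
trend = np.cumsum(rng.normal(0, 1, n))     # common random-walk component
wages = trend + rng.normal(0, 0.5, n)
prices = 0.8 * trend + rng.normal(0, 0.5, n)

t_stat, p_value, _ = coint(prices, wages)
print(f"Engle-Granger t-statistic: {t_stat:.2f}, p-value: {p_value:.3f}")
# A small p-value indicates a long-run (cointegrating) relationship.
```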

  17. Comprehensive immunoproteogenomic analyses of malignant pleural mesothelioma.

    Science.gov (United States)

    Lee, Hyun-Sung; Jang, Hee-Jin; Choi, Jong Min; Zhang, Jun; de Rosen, Veronica Lenge; Wheeler, Thomas M; Lee, Ju-Seog; Tu, Thuydung; Jindra, Peter T; Kerman, Ronald H; Jung, Sung Yun; Kheradmand, Farrah; Sugarbaker, David J; Burt, Bryan M

    2018-04-05

    We generated a comprehensive atlas of the immunologic cellular networks within human malignant pleural mesothelioma (MPM) using mass cytometry. Data-driven analyses of these high-resolution single-cell data identified 2 distinct immunologic subtypes of MPM with vastly different cellular composition, activation states, and immunologic function; mass spectrometry demonstrated differential abundance of MHC-I and -II neopeptides directly identified between these subtypes. The clinical relevance of this immunologic subtyping was investigated with a discriminatory molecular signature derived through comparison of the proteomes and transcriptomes of these 2 immunologic MPM subtypes. This molecular signature, representative of a favorable intratumoral cell network, was independently associated with improved survival in MPM and predicted response to immune checkpoint inhibitors in patients with MPM and melanoma. These data additionally suggest a potentially novel mechanism of response to checkpoint blockade: requirement for high measured abundance of neopeptides in the presence of high expression of MHC proteins specific for these neopeptides.

  18. Deterministic analyses of severe accident issues

    International Nuclear Information System (INIS)

    Dua, S.S.; Moody, F.J.; Muralidharan, R.; Claassen, L.B.

    2004-01-01

    Severe accidents in light water reactors involve complex physical phenomena. In the past there has been a heavy reliance on simple assumptions regarding physical phenomena alongside probability methods to evaluate risks associated with severe accidents. Recently GE has developed realistic methodologies that permit deterministic evaluations of severe accident progression and of some of the associated phenomena in the case of Boiling Water Reactors (BWRs). These deterministic analyses indicate that with appropriate system modifications and operator actions, core damage can be prevented in most cases. Furthermore, in cases where core-melt is postulated, containment failure can either be prevented or significantly delayed to allow sufficient time for recovery actions to mitigate severe accidents

  19. Policy Implementation: Implications for Evaluation

    Science.gov (United States)

    DeGroff, Amy; Cargo, Margaret

    2009-01-01

    Policy implementation reflects a complex change process where government decisions are transformed into programs, procedures, regulations, or practices aimed at social betterment. Three factors affecting contemporary implementation processes are explored: networked governance, sociopolitical context and the democratic turn, and new public…

  20. COMPARATIVE ANALYSIS AND IMPLEMENTATION OF ...

    African Journals Online (AJOL)

    TransRoute: a web-based vehicle route planning application is proposed in this paper. This application leverages existing input-output (I/O) efficient implementations of shortest path algorithms (SPAs) to implement the proposed system that will fundamentally address the problems experienced in moving people, goods and ...
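
    The record does not say which shortest-path algorithms are used, so the following is only a generic illustration: Dijkstra's algorithm over an in-memory adjacency list, i.e. the basic routine that I/O-efficient route planners adapt for disk-resident road networks. The toy network and travel times are invented.

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source over a dict-of-dicts adjacency list."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                        # stale queue entry
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Toy road network with travel times in minutes (made-up example).
roads = {
    "depot": {"market": 4.0, "school": 2.0},
    "school": {"market": 1.0, "clinic": 7.0},
    "market": {"clinic": 3.0},
    "clinic": {},
}
print(dijkstra(roads, "depot"))             # {'depot': 0.0, 'school': 2.0, 'market': 3.0, 'clinic': 6.0}
```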

  1. Developing frameworks for protocol implementation

    NARCIS (Netherlands)

    de Barros Barbosa, C.; de barros Barbosa, C.; Ferreira Pires, Luis

    1999-01-01

    This paper presents a method to develop frameworks for protocol implementation. Frameworks are software structures developed for a specific application domain, which can be reused in the implementation of various different concrete systems in this domain. The use of frameworks supports a protocol

  2. Implementation Guide: Leading School Change

    Science.gov (United States)

    Whitaker, Todd

    2010-01-01

    This two-part "Implementation Guide" will help to deepen your understanding and sharpen your ability to implement each of the strategies discussed in "Leading School Change: Nine Strategies to Bring Everybody on Board" (ED509821). Part One offers discussion questions and activities which focus on each of the nine strategies. They can be completed…

  3. An Implementation of Bigraph Matching

    DEFF Research Database (Denmark)

    Glenstrup, Arne John; Damgaard, Troels Christoffer; Birkedal, Lars

    We describe a provably sound and complete matching algorithm for bigraphical reactive systems. The algorithm has been implemented in our BPL Tool, a first implementation of bigraphical reactive systems. We describe the tool and present a concrete example of how it can be used to simulate a model...

  4. Global alignment algorithms implementations | Fatumo ...

    African Journals Online (AJOL)

    In this paper, we implemented the two routes for sequence comparison, that is, the dotplot and the Needleman-Wunsch algorithm for global sequence alignment. Our algorithms were implemented in the Python programming language and were tested on a Linux platform (1.60 GHz, 512 MB of RAM, SUSE 9.2 and 10.1 versions).
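
    The paper's code is not reproduced here, so the following is a generic, minimal Needleman-Wunsch global alignment in Python with an illustrative match/mismatch/gap scoring scheme (not necessarily the scores used by the authors):

```python
def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
    """Global alignment score and one optimal alignment for two sequences."""
    n, m = len(a), len(b)
    # Score matrix, with the first row/column initialised to gap penalties.
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)

    # Traceback from the bottom-right corner to recover one optimal alignment.
    ai, bi, i, j = [], [], n, m
    while i > 0 or j > 0:
        if i > 0 and j > 0 and score[i][j] == score[i - 1][j - 1] + (
            match if a[i - 1] == b[j - 1] else mismatch
        ):
            ai.append(a[i - 1]); bi.append(b[j - 1]); i -= 1; j -= 1
        elif i > 0 and score[i][j] == score[i - 1][j] + gap:
            ai.append(a[i - 1]); bi.append("-"); i -= 1
        else:
            ai.append("-"); bi.append(b[j - 1]); j -= 1
    return score[n][m], "".join(reversed(ai)), "".join(reversed(bi))

print(needleman_wunsch("GATTACA", "GCATGCU"))
```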

  5. Risques naturels en montagne et analyse spatiale

    Directory of Open Access Journals (Sweden)

    Yannick Manche

    1999-06-01

    Full Text Available The concept of risk rests on two notions: the hazard, which represents the physical phenomenon through its magnitude and its return period; and the vulnerability, which represents the set of assets and people that may be affected by a natural phenomenon. Risk is then defined as the intersection of these two notions. This theoretical view makes it possible to model hazards and vulnerability independently. This work is essentially concerned with taking vulnerability into account in the management of natural risks. Its evaluation necessarily involves a certain amount of spatial analysis that takes into account human occupation and the different scales at which space is used. But spatial evaluation, whether of assets and people or of indirect effects, runs into numerous problems. The importance of the occupation of space has to be estimated. Moreover, processing the data implies constant changes of scale in order to pass from point elements to surfaces, something that geographical information systems do not handle perfectly. Risk management imposes strong planning constraints; taking vulnerability into account makes it possible to better understand and manage the spatial constraints that natural risks imply. Keywords: hazard, spatial analysis, natural risks, GIS, vulnerability

  6. Isotropy analyses of the Planck convergence map

    Science.gov (United States)

    Marques, G. A.; Novaes, C. P.; Bernui, A.; Ferreira, I. S.

    2018-01-01

    The presence of matter in the path of relic photons causes distortions in the angular pattern of the cosmic microwave background (CMB) temperature fluctuations, modifying their properties in a slight but measurable way. Recently, the Planck Collaboration released the estimated convergence map, an integrated measure of the large-scale matter distribution that produced the weak gravitational lensing (WL) phenomenon observed in Planck CMB data. We perform exhaustive analyses of this convergence map, calculating the variance in small and large regions of the sky, but excluding the area masked due to Galactic contamination, and compare them with the features expected in the set of simulated convergence maps, also released by the Planck Collaboration. Our goal is to search for sky directions or regions where the WL imprints anomalous signatures in the variance estimator, revealed through a χ² analysis at a statistically significant level. In the local analysis of the Planck convergence map, we identified eight patches of the sky in disagreement, at more than 2σ, with what is observed in the average of the simulations. In contrast, in the large-regions analysis we found no statistically significant discrepancies, but, interestingly, the regions with the highest χ² values are surrounding the ecliptic poles. Thus, our results show a good agreement with the features expected by the Λ cold dark matter concordance model, as given by the simulations. Yet, the outlier regions found here could suggest that the data still contain residual contamination, like noise, due to over- or underestimation of systematic effects in the simulation data set.
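
    The variance-based comparison against the simulation ensemble can be sketched as follows; random arrays stand in for the convergence map patches and the released simulations, and the patch layout and thresholds are illustrative assumptions rather than the collaboration's actual pipeline.

      import numpy as np

      rng = np.random.default_rng(0)

      # Stand-ins for the data: one "convergence map" split into patches,
      # plus an ensemble of simulated maps with the same patch layout.
      n_patches, n_sims, pix_per_patch = 192, 300, 256
      sims = rng.normal(0.0, 1.0, size=(n_sims, n_patches, pix_per_patch))
      data = rng.normal(0.0, 1.0, size=(n_patches, pix_per_patch))

      var_data = data.var(axis=1)          # variance per patch (data)
      var_sims = sims.var(axis=2)          # variance per patch (each simulation)
      mu, sigma = var_sims.mean(axis=0), var_sims.std(axis=0)

      # Per-patch deviation of the data variance from the simulation ensemble,
      # and patches discrepant at more than 2 sigma.
      z = (var_data - mu) / sigma
      outliers = np.flatnonzero(np.abs(z) > 2.0)
      chi2 = np.sum(z ** 2)                # global chi-square statistic
      print(len(outliers), chi2)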

  7. The radiation analyses of ITER lower ports

    International Nuclear Information System (INIS)

    Petrizzi, L.; Brolatti, G.; Martin, A.; Loughlin, M.; Moro, F.; Villari, R.

    2010-01-01

    The ITER Vacuum Vessel has upper, equatorial, and lower ports used for equipment installation, diagnostics, heating and current drive systems, cryo-vacuum pumping, and access inside the vessel for maintenance. At the level of the divertor, the nine lower ports for remote handling, cryo-vacuum pumping and diagnostics are inclined downwards and toroidally located every 40°. The cryopump port additionally has a branch to accommodate a second cryopump. The ports, as openings in the Vacuum Vessel, permit radiation streaming out of the vessel, which affects the heating of components in the outer regions of the machine inside and outside the ports. Safety concerns are also raised with respect to the dose after shutdown at the cryostat behind the ports: in such zones the radiation dose level must be kept below the regulatory limit to allow personnel access for maintenance purposes. Neutronic analyses have been required to qualify the ITER project with respect to the lower ports. A 3-D model was used to take into account full details of the ports and the lower machine surroundings. MCNP5 version 1.40 has been used with the FENDL 2.1 nuclear data library. The ITER 40° model distributed by the ITER Organization was developed in the lower part to include the relevant details. The results of a first analysis, focused on the cryopump system only, were recently published. In this paper more complete data on the cryopump port and analyses for the remote handling port and the diagnostic rack are presented; the results of both analyses give a complete map of the radiation loads in the outer divertor ports. Nuclear heating, dpa, tritium production, and dose rates after shutdown are provided and the implications for the design are discussed.

  8. Architecture Level Safety Analyses for Safety-Critical Systems

    Directory of Open Access Journals (Sweden)

    K. S. Kushal

    2017-01-01

    Full Text Available The dependency of complex embedded Safety-Critical Systems across the Avionics and Aerospace domains on their underlying software and hardware components has gradually increased with progression in time. Such application domain systems are developed based on a complex integrated architecture, which is modular in nature. Engineering practices, assured by system safety standards, are very much necessary to manage failure, faulty, and unsafe operational conditions. System safety analyses involve the analysis of the complex software architecture of the system, a major aspect that can lead to fatal consequences in the behaviour of Safety-Critical Systems, and provide high reliability and dependability factors during their development. In this paper, we propose an architecture fault modeling and safety analyses approach that will aid in identifying and eliminating design flaws. The formal foundations of the SAE Architecture Analysis & Design Language (AADL) augmented with the Error Model Annex (EMV) are discussed. The fault propagation, failure behaviour, and the composite behaviour of the design flaws/failures are considered for architecture safety analysis. The proposed approach is illustrated and validated by implementing the Speed Control Unit of a Power-Boat Autopilot (PBA) system. The Error Model Annex (EMV) is guided by the pattern of consideration and inclusion of probable failure scenarios and propagation of fault conditions in the Speed Control Unit of the Power-Boat Autopilot (PBA). This helps in validating the system architecture through the detection of the error event in the model and its impact in the operational environment. This also provides an insight into the certification impact that these exceptional conditions pose at various criticality levels and design assurance levels, and its implications for verifying and validating the designs.

  9. Analysing the hidden curriculum: use of a cultural web.

    Science.gov (United States)

    Mossop, Liz; Dennick, Reg; Hammond, Richard; Robbé, Iain

    2013-02-01

    Major influences on learning about medical professionalism come from the hidden curriculum. These influences can contribute positively or negatively towards the professional enculturation of clinical students. The fact that there is no validated method for identifying the components of the hidden curriculum poses problems for educators considering professionalism. The aim of this study was to analyse whether a cultural web, adapted from a business context, might assist in the identification of elements of the hidden curriculum at a UK veterinary school. A qualitative approach was used. Seven focus groups consisting of three staff groups and four student groups were organised. Questioning was framed using the cultural web, which is a model used by business owners to assess their environment and consider how it affects their employees and customers. The focus group discussions were recorded, transcribed and analysed thematically using a combination of a priori and emergent themes. The cultural web identified elements of the hidden curriculum for both students and staff. These included: core assumptions; routines; rituals; control systems; organisational factors; power structures, and symbols. Discussions occurred about how and where these issues may affect students' professional identity development. The cultural web framework functioned well to help participants identify elements of the hidden curriculum. These aspects aligned broadly with previously described factors such as role models and institutional slang. The influence of these issues on a student's development of a professional identity requires discussion amongst faculty staff, and could be used to develop learning opportunities for students. The framework is promising for the analysis of the hidden curriculum and could be developed as an instrument for implementation in other clinical teaching environments. © Blackwell Publishing Ltd 2013.

  10. Psychometric analyses to improve the Dutch ICF Activity Inventory.

    Science.gov (United States)

    Bruijning, Janna E; van Rens, Ger; Knol, Dirk; van Nispen, Ruth

    2013-08-01

    In the past, rehabilitation centers for the visually impaired used unstructured or semistructured methods to assess rehabilitation needs of their patients. Recently, an extensive instrument, the Dutch ICF Activity Inventory (D-AI), was developed to systematically investigate rehabilitation needs of visually impaired adults and to evaluate rehabilitation outcomes. The purpose of this study was to investigate the underlying factor structure and other psychometric properties to shorten and improve the D-AI. The D-AI was administered to 241 visually impaired persons who recently enrolled in a multidisciplinary rehabilitation center. The D-AI uses graded scores to assess the importance and difficulty of 65 rehabilitation goals. For high-priority goals (e.g., daily meal preparation), the difficulty of underlying tasks (e.g., read recipes, cut vegetables) was assessed. To reduce underlying task items (>950), descriptive statistics were investigated and factor analyses were performed for several goals. The internal consistency reliability and test-retest reliability of the D-AI were investigated by calculating Cronbach α and Cohen (weighted) κ. Finally, consensus-based discussions were used to shorten and improve the D-AI. Except for one goal, factor analysis model parameters were at least reasonable. Internal consistency reliability was satisfactory (range, 0.74 to 0.93). In total, 60% of the 65 goal importance items and 84.4% of the goal difficulty items showed moderate to almost perfect κ values (≥0.40). After consensus-based discussions, a new D-AI was produced, containing 48 goals and less than 500 tasks. The analyses were an important step in the validation process of the D-AI and to develop a more feasible assessment tool to investigate rehabilitation needs of visually impaired persons in a systematic way. The D-AI is currently implemented in all Dutch rehabilitation centers serving all visually impaired adults with various rehabilitation needs.
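
    Test-retest agreement for graded items of the kind described here is typically quantified with Cohen's (weighted) κ. A minimal sketch, assuming hypothetical test-retest ratings on a 0-4 importance scale and using scikit-learn's implementation:

      import numpy as np
      from sklearn.metrics import cohen_kappa_score

      rng = np.random.default_rng(7)

      # Toy test-retest data: importance of one goal rated 0-4 by 60 clients twice.
      test = rng.integers(0, 5, size=60)
      retest = np.clip(test + rng.integers(-1, 2, size=60), 0, 4)    # mostly consistent ratings

      kappa = cohen_kappa_score(test, retest)                        # unweighted agreement
      w_kappa = cohen_kappa_score(test, retest, weights="quadratic") # weighted for ordinal scores
      print(f"kappa = {kappa:.2f}, weighted kappa = {w_kappa:.2f}")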

  11. Analyses of GIMMS NDVI Time Series in Kogi State, Nigeria

    Science.gov (United States)

    Palka, Jessica; Wessollek, Christine; Karrasch, Pierre

    2017-10-01

    The value of remote sensing data is particularly evident where areal monitoring is needed to provide information on the development of the earth's surface. The use of temporally high-resolution time series data allows short-term changes to be detected. In Kogi State in Nigeria different vegetation types can be found. As the majority of the population in this region lives in rural communities that practice crop farming, the existing vegetation is slowly being altered. The expansion of agricultural land causes loss of natural vegetation, especially in the regions close to the rivers which are suitable for crop production. With regard to these facts, two questions can be dealt with, covering different aspects of the development of vegetation in Kogi State: the determination and evaluation of the general development of the vegetation in the study area (trend estimation), and analyses of the short-term behavior of vegetation conditions, which can provide information about seasonal effects in vegetation development. For this purpose, the GIMMS-NDVI data set, provided by NOAA, supplies information on the normalized difference vegetation index (NDVI) at a geometric resolution of approx. 8 km. The temporal resolution of 15 days allows the analyses described above. For the presented analysis, data for the period 1981-2012 (31 years) were used. The implemented workflow mainly applies methods of time series analysis. The results show that, in addition to the classical seasonal development, artefacts of different vegetation periods (several NDVI maxima) can be found in the data. The trend component of the time series shows a consistently positive development in the entire study area over the full investigation period of 31 years. However, the results also show that this development has not been continuous and a simple linear modeling of the NDVI increase is only possible to a limited extent. For this reason, the trend modeling was extended by procedures for detecting structural breaks in
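
    A minimal sketch of the kind of time-series workflow described (seasonal decomposition plus a simple linear trend) is given below; the synthetic 15-day NDVI series, the additive model and the 24-steps-per-year period are assumptions for illustration, and the structural-break detection mentioned in the abstract is not shown.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.seasonal import seasonal_decompose

      # Synthetic 15-day NDVI series, 1981-2012 (roughly 24 observations per year).
      idx = pd.date_range("1981-01-01", "2012-12-31", freq="15D")
      t = np.arange(len(idx))
      ndvi = (0.45 + 0.00002 * t                         # weak long-term trend
              + 0.15 * np.sin(2 * np.pi * t / 24)        # seasonal cycle
              + np.random.default_rng(1).normal(0, 0.02, len(idx)))
      series = pd.Series(ndvi, index=idx)

      # Classical decomposition into trend, seasonal and residual components.
      result = seasonal_decompose(series, model="additive", period=24)

      # Simple linear model of the long-term development (slope per 15-day step).
      trend = result.trend.dropna()
      slope, intercept = np.polyfit(np.arange(len(trend)), trend.values, deg=1)
      print(f"NDVI change per 15-day step: {slope:.2e}")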

  12. Drivers of Adoption and Implementation of Internet-Based Marketing Channels

    DEFF Research Database (Denmark)

    Nielsen, Jørn Flohr; Mols, Niels Peter; Høst, Viggo

    2007-01-01

    This chapter analyses factors influencing manufacturers' adoption and implementation of Internet-based marketing channels, using models based on marketing channel and organisational innovation theory. Survey data from 1163 Danish, Finnish, and Swedish manufacturers form the empirical basis for te...

  13. Establishing a framework to implement 4D XCAT Phantom for 4D radiotherapy research

    Directory of Open Access Journals (Sweden)

    Raj K Panta

    2012-01-01

    Conclusions: An integrated computer program has been developed to generate, review, analyse, process, and export the 4D XCAT images. A framework has been established to implement the 4D XCAT phantom for 4D RT research.

  14. How GPs implement clinical guidelines in everyday clinical practice

    DEFF Research Database (Denmark)

    Videbæk Le, Jette; Hansen, Helle P; Riisgaard, Helle

    2015-01-01

    BACKGROUND: Clinical guidelines are considered to be essential for improving quality and safety of health care. However, interventions to promote implementation of guidelines have demonstrated only partial effectiveness and the reasons for this apparent failure are not yet fully understood....... OBJECTIVE: To investigate how GPs implement clinical guidelines in everyday clinical practice and how implementation approaches differ between practices. METHODS: Individual semi-structured open-ended interviews with seven GPs who were purposefully sampled with regard to gender, age and practice form....... Interviews were recorded, transcribed verbatim and then analysed using systematic text condensation. RESULTS: Analysis of the interviews revealed three different approaches to the implementation of guidelines in clinical practice. In some practices the GPs prioritized time and resources on collective...

  15. GREENWAY IN ITALY: EXAMPLES OF PROJECTS AND IMPLEMENTATION

    Directory of Open Access Journals (Sweden)

    Pasquale Dal Sasso

    2012-06-01

    Full Text Available The study analyses the greenway projects implemented in Italy, summarising their characteristics in a table that contains basic information concerning the geographical location, year of implementation, type of layout and size, state of implementation, the institution that proposed the implementation, the official name, the territorial features, the socio-economic and cultural aims, references to bibliographic and web resources, and their inclusion in plans and projects. The analysis has made it possible to verify the compliance of individual contributions with the definitions attributed to greenways by national and international associations. It has also been possible to verify the use of greenways as physical support for spatial planning and for the promotion of the economic and productive development of rural areas.

  16. Mobilizing the Courage to Implement Sustainable Design Solutions

    DEFF Research Database (Denmark)

    Nielsen, Susanne Balslev; Hoffmann, Birgitte; Quitzau, Maj-Britt

    2009-01-01

    of design managers and others to develop socio-technical networks and storylines to integrate sustainability in the design and building processes. Implementation of sustainable design solutions takes more than courage; it requires key competences in catalysing network changes......., the work focuses on examples of successful implementation in an attempt to understand the competences required. Danish frontrunner projects are described and analysed: one case concerns the implementation of low-energy houses and another describes innovative planning processes in the water sector in order...... networks and creative work forms constitute the outset for change. The work is inspired by the actor-network theory, emphasizing the momentum of prevailing practices, and the need to (re)develop networks to support implementation of sustainable design solutions. Conclusions point to the importance...

  17. Lidar data assimilation for improved analyses of volcanic aerosol events

    Science.gov (United States)

    Lange, Anne Caroline; Elbern, Hendrik

    2014-05-01

    data in a variational data assimilation algorithm. The implemented method is tested by the assimilation of CALIPSO attenuated backscatter data that were taken during the eruption of the Eyjafjallajökull volcano in April 2010. It turned out that the implemented module is fully capable of integrating unexpected aerosol events automatically into reasonable analyses. The estimates of the aerosol mass concentrations showed promising properties for the application of observations taken by lidar systems with both higher and lower sophistication than CALIOP.
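
    The variational step can be illustrated with a toy cost function that balances a background aerosol profile against a single column observation; the observation operator, error covariances and values below are invented for illustration and are unrelated to the actual CALIPSO/CALIOP processing.

      import numpy as np
      from scipy.optimize import minimize

      # Toy variational analysis: adjust a 3-layer aerosol profile x so that it
      # fits both a background x_b and a single column observation y = H x.
      x_b = np.array([5.0, 3.0, 1.0])                      # background aerosol mass (ug/m3)
      B_inv = np.linalg.inv(np.diag([1.0, 1.0, 1.0]))      # inverse background error covariance
      H = np.array([[0.2, 0.5, 0.3]])                      # linear observation operator
      y = np.array([4.0])                                  # observed column value
      R_inv = np.linalg.inv(np.diag([0.25]))               # inverse observation error covariance

      def cost(x):
          db = x - x_b
          dy = H @ x - y
          return float(db @ B_inv @ db + dy @ R_inv @ dy)

      analysis = minimize(cost, x_b, method="BFGS").x
      print(analysis)          # analysis profile pulled towards the observation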

  18. Database-Driven Analyses of Astronomical Spectra

    Science.gov (United States)

    Cami, Jan

    2012-03-01

    Spectroscopy is one of the most powerful tools to study the physical properties and chemical composition of very diverse astrophysical environments. In principle, each nuclide has a unique set of spectral features; thus, establishing the presence of a specific material at astronomical distances requires no more than finding a laboratory spectrum of the right material that perfectly matches the astronomical observations. Once the presence of a substance is established, a careful analysis of the observational characteristics (wavelengths or frequencies, intensities, and line profiles) allows one to determine many physical parameters of the environment in which the substance resides, such as temperature, density, velocity, and so on. Because of this great diagnostic potential, ground-based and space-borne astronomical observatories often include instruments to carry out spectroscopic analyses of various celestial objects and events. Of particular interest is molecular spectroscopy at infrared wavelengths. From the spectroscopic point of view, molecules differ from atoms in their ability to vibrate and rotate, and quantum physics inevitably causes those motions to be quantized. The energies required to excite vibrations or rotations are such that vibrational transitions generally occur at infrared wavelengths, whereas pure rotational transitions typically occur at sub-mm wavelengths. Molecular vibration and rotation are coupled though, and thus at infrared wavelengths, one commonly observes a multitude of ro-vibrational transitions (see Figure 13.1). At lower spectral resolution, all transitions blend into one broad ro-vibrational molecular band whose precise position depends on the isotopic composition of the molecule. Molecular spectroscopy thus allows us to see a difference of one neutron in an atomic nucleus that is located at astronomical distances! Since the detection of the first interstellar molecules (the CH [21] and CN [14] radicals), more than 150 species have been detected in space, ranging in size from diatomic

  19. Tools for Trigger Aware Analyses in ATLAS

    CERN Document Server

    Krasznahorkay, A; The ATLAS collaboration; Stelzer, J

    2010-01-01

    In order to search for rare processes, all four LHC experiments have to use advanced triggering methods for selecting and recording the events of interest. At the expected nominal LHC operating conditions only about 0.0005% of the collision events can be kept for physics analysis in ATLAS. Therefore the understanding and evaluation of the trigger performance is one of the most crucial parts of any physics analysis. ATLAS’s first level trigger is composed of custom-built hardware, while the second and third levels are implemented using regular PCs running reconstruction and selection algorithms. Because of this split, accessing the results of the trigger execution for the two stages is different. The complexity of the software trigger presents further difficulties in accessing the trigger data. To make the job of the physicists easier when evaluating the trigger performance, multiple general-use tools are provided by the ATLAS Trigger Analysis Tools group. The TrigDecisionTool, a general tool, is provided to...

  20. Analysing News for Stock Market Prediction

    Science.gov (United States)

    Ramalingam, V. V.; Pandian, A.; Dwivedi, shivam; Bhatt, Jigar P.

    2018-04-01

    Stock market means the aggregation of all sellers and buyers of stocks representing their ownership claims on a business. To be completely sure about the investment in these stocks, proper knowledge about them as well as their pricing, both present and future, is very essential. A large amount of data is collected and parsed to obtain this essential information regarding the fluctuations in the stock market. This data can be any news or public opinions in general. Recently, many methods have been used, especially big unstructured data methods, to predict stock market values. We introduce another method focused on deriving the best statistical learning model for predicting the future values. The data set used is very large unstructured data collected from an online social platform, commonly known as Quindl. The data from this platform is then linked to a CSV file and cleaned to obtain the essential information for stock market prediction. The method consists of carrying out NLP (Natural Language Processing) of the data to make it easier for the system to understand, and then finding and identifying the correlation between this data and the stock market fluctuations. The model is implemented using the Python programming language throughout the entire project to obtain flexibility and convenience of the system.
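
    A deliberately simple stand-in for the described pipeline is sketched below: a crude keyword polarity score over headlines, correlated with daily returns. The word lists, headlines and prices are toy placeholders, not the authors' data or model.

      import numpy as np

      POS = {"gain", "growth", "beat", "record", "strong"}
      NEG = {"loss", "drop", "miss", "weak", "lawsuit"}

      def polarity(headline: str) -> float:
          """Very crude bag-of-words sentiment: (#positive - #negative) / #tokens."""
          tokens = headline.lower().split()
          score = sum(t in POS for t in tokens) - sum(t in NEG for t in tokens)
          return score / max(len(tokens), 1)

      headlines = [
          "company posts record growth and strong earnings",
          "shares drop after earnings miss",
          "analysts see further gain next quarter",
          "firm faces lawsuit, outlook weak",
      ]
      closes = np.array([100.0, 97.5, 99.0, 96.0, 97.0])   # toy closing prices

      sentiment = np.array([polarity(h) for h in headlines])
      returns = np.diff(closes) / closes[:-1]              # one return per headline day
      corr = np.corrcoef(sentiment, returns)[0, 1]
      print(f"sentiment/return correlation: {corr:.2f}")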

  1. High performance liquid chromatography in pharmaceutical analyses

    Directory of Open Access Journals (Sweden)

    Branko Nikolin

    2004-05-01

    Full Text Available In pre-marketing testing of drugs and in their control over the last ten years, high performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a complementary method to gas chromatography; however, today it has nearly completely replaced gas chromatography in pharmaceutical analysis. The application of the liquid mobile phase, with the possibility of changing mobile phase polarity during chromatography and all other modifications of the mobile phase depending upon the characteristics of the substance being tested, is a great advantage in the separation process in comparison to other methods. The greater choice of stationary phases is the next factor enabling good separation. The separation line is connected to specific and sensitive detector systems (spectrofluorimeter, diode detector, electrochemical detector) and other hyphenated systems such as HPLC-MS and HPLC-NMR; these are the basic elements on which the wide and effective application of the HPLC method is based. The purpose of high performance liquid chromatography (HPLC) analysis of any drug is to confirm the identity of the drug and provide quantitative results, and also to monitor the progress of the therapy of a disease.1 The measurement presented in Fig. 1 is a chromatogram obtained for the plasma of depressed patients 12 h before oral administration of dexamethasone. It may also be used to further our understanding of the normal and disease processes in the human body through biomedical and therapeutic research during investigations before drug registration. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but also one of the most common uses of high performance liquid chromatography. Blood, plasma or

  2. High performance liquid chromatography in pharmaceutical analyses.

    Science.gov (United States)

    Nikolin, Branko; Imamović, Belma; Medanhodzić-Vuk, Saira; Sober, Miroslav

    2004-05-01

    In pre-marketing testing of drugs and in their control over the last ten years, high performance liquid chromatography has replaced numerous spectroscopic methods and gas chromatography in quantitative and qualitative analysis. In the first period of HPLC application it was thought that it would become a complementary method to gas chromatography; however, today it has nearly completely replaced gas chromatography in pharmaceutical analysis. The application of the liquid mobile phase, with the possibility of changing mobile phase polarity during chromatography and all other modifications of the mobile phase depending upon the characteristics of the substance being tested, is a great advantage in the separation process in comparison to other methods. The greater choice of stationary phases is the next factor enabling good separation. The separation line is connected to specific and sensitive detector systems (spectrofluorimeter, diode detector, electrochemical detector) and other hyphenated systems such as HPLC-MS and HPLC-NMR; these are the basic elements on which the wide and effective application of the HPLC method is based. The purpose of high performance liquid chromatography (HPLC) analysis of any drug is to confirm the identity of the drug and provide quantitative results, and also to monitor the progress of the therapy of a disease.1) The measurement presented in Fig. 1 is a chromatogram obtained for the plasma of depressed patients 12 h before oral administration of dexamethasone. It may also be used to further our understanding of the normal and disease processes in the human body through biomedical and therapeutic research during investigations before drug registration. The analysis of drugs and metabolites in biological fluids, particularly plasma, serum or urine, is one of the most demanding but also one of the most common uses of high performance liquid chromatography. Blood, plasma or serum contains numerous endogenous

  3. Assessing the organizational context for EBP implementation: the development and validity testing of the Implementation Climate Scale (ICS).

    Science.gov (United States)

    Ehrhart, Mark G; Aarons, Gregory A; Farahnak, Lauren R

    2014-10-23

    Although the importance of the organizational environment for implementing evidence-based practices (EBP) has been widely recognized, there are limited options for measuring implementation climate in public sector health settings. The goal of this research was to develop and test a measure of EBP implementation climate that would both capture a broad range of issues important for effective EBP implementation and be of practical use to researchers and managers seeking to understand and improve the implementation of EBPs. Participants were 630 clinicians working in 128 work groups in 32 US-based mental health agencies. Items to measure climate for EBP implementation were developed based on past literature on implementation climate and other strategic climates and in consultation with experts on the implementation of EBPs in mental health settings. The sample was randomly split at the work group level of analysis; half of the sample was used for exploratory factor analysis (EFA), and the other half was used for confirmatory factor analysis (CFA). The entire sample was utilized for additional analyses assessing the reliability, support for level of aggregation, and construct-based evidence of validity. The EFA resulted in a final factor structure of six dimensions for the Implementation Climate Scale (ICS): 1) focus on EBP, 2) educational support for EBP, 3) recognition for EBP, 4) rewards for EBP, 5) selection for EBP, and 6) selection for openness. This structure was supported in the other half of the sample using CFA. Additional analyses supported the reliability and construct-based evidence of validity for the ICS, as well as the aggregation of the measure to the work group level. The ICS is a very brief (18 item) and pragmatic measure of a strategic climate for EBP implementation. It captures six dimensions of the organizational context that indicate to employees the extent to which their organization prioritizes and values the successful implementation of EBPs
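
    Two of the reported steps, internal-consistency reliability and the random split at the work-group level, can be sketched as follows on simulated item responses; the data, the 0-4 response scale and the group assignment are assumptions for illustration.

      import numpy as np

      rng = np.random.default_rng(42)

      def cronbach_alpha(items: np.ndarray) -> float:
          """Cronbach's alpha for a respondents x items matrix of scores."""
          k = items.shape[1]
          item_var = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_var / total_var)

      # Simulated responses: 630 clinicians, 18 items, nested in 128 work groups.
      latent = rng.normal(size=(630, 1))
      scores = np.clip(np.round(2 + latent + rng.normal(scale=0.8, size=(630, 18))), 0, 4)
      groups = rng.integers(0, 128, size=630)

      print(f"alpha = {cronbach_alpha(scores):.2f}")

      # Random split at the work-group level: one half for EFA, the other for CFA.
      shuffled = rng.permutation(128)
      efa_groups, cfa_groups = set(shuffled[:64]), set(shuffled[64:])
      efa_sample = scores[np.isin(groups, list(efa_groups))]
      cfa_sample = scores[np.isin(groups, list(cfa_groups))]
      print(efa_sample.shape, cfa_sample.shape)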

  4. Uncertainty Analyses for Back Projection Methods

    Science.gov (United States)

    Zeng, H.; Wei, S.; Wu, W.

    2017-12-01

    So far few comprehensive error analyses for back projection methods have been conducted, although it is evident that high frequency seismic waves can be easily affected by earthquake depth, focal mechanisms and the Earth's 3D structures. Here we perform 1D and 3D synthetic tests for two back projection methods, MUltiple SIgnal Classification (MUSIC) (Meng et al., 2011) and Compressive Sensing (CS) (Yao et al., 2011). We generate synthetics for both point sources and finite rupture sources with different depths, focal mechanisms, as well as 1D and 3D structures in the source region. The 3D synthetics are generated through a hybrid scheme of the Direct Solution Method and the Spectral Element Method. Then we back project the synthetic data using MUSIC and CS. The synthetic tests show that the depth phases can be back projected as artificial sources both in space and time. For instance, for a source depth of 10 km, back projection gives a strong signal 8 km away from the true source. Such bias increases with depth, e.g., the error in horizontal location could be larger than 20 km for a depth of 40 km. If the array is located around the nodal direction of direct P-waves, the teleseismic P-waves are dominated by the depth phases. Therefore, back projections are actually imaging the reflection points of depth phases more than the rupture front. Besides depth phases, the strong and long-lasting coda waves due to 3D effects near the trench can lead to additional complexities, which are also tested here. The strength contrast of different frequency contents in the rupture models also produces some variations in the back projection results. In the synthetic tests, MUSIC and CS derive consistent results. While MUSIC is more computationally efficient, CS works better for sparse arrays. In summary, our analyses indicate that the impact of the various factors mentioned above should be taken into consideration when interpreting back projection images, before we can use them to infer the earthquake rupture physics.

  5. Scanning electron microscopy and micro-analyses

    International Nuclear Information System (INIS)

    Brisset, F.; Repoux, L.; Ruste, J.; Grillon, F.; Robaut, F.

    2008-01-01

    Scanning electron microscopy (SEM) and the related micro-analyses are involved in a great variety of domains, from academic environments to industrial ones. The overall theoretical bases, the main technical characteristics, and some complementary information about practical usage and maintenance are developed in this book. High-vacuum and controlled-vacuum electron microscopes are thoroughly presented, as well as the latest generation of EDS (energy dispersive spectrometer) and WDS (wavelength dispersive spectrometer) micro-analysers. Besides these main topics, other analysis or observation techniques are approached, such as EBSD (electron backscatter diffraction), 3-D imaging, FIB (focussed ion beams), Monte-Carlo simulations, in-situ tests, etc. This book, in French, is the only one that treats this subject in such an exhaustive way. It is the fully updated version of a previous edition of 1979. It gathers the lectures given in 2006 at the summer school of Saint Martin d'Heres (France). Content: 1 - electron-matter interactions; 2 - characteristic X-radiation, Bremsstrahlung; 3 - electron guns in SEM; 4 - elements of electronic optics; 5 - vacuum techniques; 6 - detectors used in SEM; 7 - image formation and optimization in SEM; 7a - SEM practical instructions for use; 8 - controlled pressure microscopy; 8a - applications; 9 - energy selection X-spectrometers (energy dispersive spectrometers - EDS); 9a - EDS analysis; 9b - X-EDS mapping; 10 - technological aspects of WDS; 11 - processing of EDS and WDS spectra; 12 - X-microanalysis quantifying methods; 12a - quantitative WDS microanalysis of very light elements; 13 - statistics: precision and detection limits in microanalysis; 14 - analysis of stratified samples; 15 - crystallography applied to EBSD; 16 - EBSD: history, principle and applications; 16a - EBSD analysis; 17 - Monte Carlo simulation; 18 - insulating samples in SEM and X-ray microanalysis; 18a - insulating

  6. Mixed Methods for Implementation Research: Application to Evidence-Based Practice Implementation and Staff Turnover in Community Based Organizations Providing Child Welfare Services

    Science.gov (United States)

    Aarons, Gregory A.; Fettes, Danielle L.; Sommerfeld, David H.; Palinkas, Lawrence

    2013-01-01

    Many public sector services systems and provider organizations are in some phase of learning about or implementing evidence-based interventions. Child welfare service systems represent a context where implementation spans system, management, and organizational concerns. Research utilizing mixed methods that combine qualitative and quantitative design, data collection, and analytic approaches are particularly well-suited to understanding both the process and outcomes of dissemination and implementation efforts in child welfare systems. This paper describes the process of using mixed methods in implementation research and provides an applied example of an examination of factors impacting staff retention during an evidence-based intervention implementation in a statewide child welfare system. We integrate qualitative data with previously published quantitative analyses of job autonomy and staff turnover during this statewide implementation project in order to illustrate the utility of mixed method approaches in providing a more comprehensive understanding of opportunities and challenges in implementation research. PMID:22146861

  7. Mixed methods for implementation research: application to evidence-based practice implementation and staff turnover in community-based organizations providing child welfare services.

    Science.gov (United States)

    Aarons, Gregory A; Fettes, Danielle L; Sommerfeld, David H; Palinkas, Lawrence A

    2012-02-01

    Many public sector service systems and provider organizations are in some phase of learning about or implementing evidence-based interventions. Child welfare service systems represent a context where implementation spans system, management, and organizational concerns. Research utilizing mixed methods that combine qualitative and quantitative design, data collection, and analytic approaches are particularly well suited to understanding both the process and outcomes of dissemination and implementation efforts in child welfare systems. This article describes the process of using mixed methods in implementation research and provides an applied example of an examination of factors impacting staff retention during an evidence-based intervention implementation in a statewide child welfare system. The authors integrate qualitative data with previously published quantitative analyses of job autonomy and staff turnover during this statewide implementation project in order to illustrate the utility of mixed method approaches in providing a more comprehensive understanding of opportunities and challenges in implementation research.

  8. Scleral topography analysed by optical coherence tomography.

    Science.gov (United States)

    Bandlitz, Stefan; Bäumer, Joachim; Conrad, Uwe; Wolffsohn, James

    2017-08-01

    A detailed evaluation of the corneo-scleral profile (CSP) is of particular relevance in soft and scleral lens fitting. The aim of this study was to use optical coherence tomography (OCT) to analyse the profile of the limbal sclera and to evaluate the relationship between central corneal radii, corneal eccentricity and scleral radii. Using OCT (Optos OCT/SLO; Dunfermline, Scotland, UK) the limbal scleral radii (SR) of 30 subjects (11M, 19F; mean age 23.8±2.0SD years) were measured in eight meridians 45° apart. Central corneal radii (CR) and corneal eccentricity (CE) were evaluated using the Oculus Keratograph 4 (Oculus, Wetzlar, Germany). Differences between SR in the meridians and the associations between SR and corneal topography were assessed. Median SR measured along 45° (58.0 mm; interquartile range, 46.8-84.8 mm) was significantly (ptopography and may provide additional data useful in fitting soft and scleral contact lenses. Copyright © 2017 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.

  9. Bayesian analyses of seasonal runoff forecasts

    Science.gov (United States)

    Krzysztofowicz, R.; Reese, S.

    1991-12-01

    Forecasts of seasonal snowmelt runoff volume provide indispensable information for rational decision making by water project operators, irrigation district managers, and farmers in the western United States. Bayesian statistical models and communication frames have been researched in order to enhance the forecast information disseminated to the users, and to characterize forecast skill from the decision maker's point of view. Four products are presented: (i) a Bayesian Processor of Forecasts, which provides a statistical filter for calibrating the forecasts, and a procedure for estimating the posterior probability distribution of the seasonal runoff; (ii) the Bayesian Correlation Score, a new measure of forecast skill, which is related monotonically to the ex ante economic value of forecasts for decision making; (iii) a statistical predictor of monthly cumulative runoffs within the snowmelt season, conditional on the total seasonal runoff forecast; and (iv) a framing of the forecast message that conveys the uncertainty associated with the forecast estimates to the users. All analyses are illustrated with numerical examples of forecasts for six gauging stations from the period 1971-1988.
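
    The core idea of a Bayesian Processor of Forecasts, combining a prior (climatological) distribution of seasonal runoff with the forecast to obtain a posterior distribution, can be sketched with a normal-normal conjugate update; all numbers below are invented for illustration.

      import numpy as np

      # Prior: climatological seasonal runoff volume (normal, in thousand acre-feet).
      mu0, sigma0 = 500.0, 120.0
      # Forecast treated as a noisy observation of the true runoff.
      forecast, sigma_f = 620.0, 80.0

      # Normal-normal conjugate update -> posterior distribution of seasonal runoff.
      w = sigma0**2 / (sigma0**2 + sigma_f**2)
      mu_post = mu0 + w * (forecast - mu0)
      sigma_post = np.sqrt(sigma0**2 * sigma_f**2 / (sigma0**2 + sigma_f**2))
      print(f"posterior runoff: {mu_post:.0f} +/- {sigma_post:.0f}")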

  10. Analyses of demand response in Denmark

    International Nuclear Information System (INIS)

    Moeller Andersen, F.; Grenaa Jensen, S.; Larsen, Helge V.; Meibom, P.; Ravn, H.; Skytte, K.; Togeby, M.

    2006-10-01

    Due to characteristics of the power system, costs of producing electricity vary considerably over short time intervals. Yet, many consumers do not experience corresponding variations in the price they pay for consuming electricity. The topic of this report is: are consumers willing and able to respond to short-term variations in electricity prices, and if so, what is the social benefit of consumers doing so? Taking Denmark and the Nord Pool market as a case, the report focuses on what is known as short-term consumer flexibility or demand response in the electricity market. With focus on market efficiency, efficient allocation of resources and security of supply, the report describes demand response from a micro-economic perspective and provides empirical observations and case studies. The report aims at evaluating benefits from demand response. However, only elements contributing to an overall value are presented. In addition, the analyses are limited to benefits for society, and costs of obtaining demand response are not considered. (au)

  11. WIND SPEED AND ENERGY POTENTIAL ANALYSES

    Directory of Open Access Journals (Sweden)

    A. TOKGÖZLÜ

    2013-01-01

    Full Text Available This paper provides a case study on the application of wavelet techniques to analyze wind speed and energy (renewable and environmentally friendly energy). Solar and wind are the main sources of energy that give farmers the potential for transferring the kinetic energy captured by a wind mill to pumping water, drying crops, heating greenhouses, rural electrification or cooking. Larger wind turbines (over 1 MW) can pump enough water for small-scale irrigation. This study tried to initiate a data gathering process for wavelet analyses of different scale effects and their role in wind speed and direction variations. The wind data gathering system is mounted at latitude 37°50' N, longitude 30°33' E, at a height of 1200 m above mean sea level, on a hill near the Süleyman Demirel University campus. Ten-minute average values of wind speed and direction at two levels (10 m and 30 m above ground level) were recorded by a data logger between July 2001 and February 2002. Wind speed values ranged between 0 m/s and 54 m/s. The annual mean speed value is 4.5 m/s at 10 m above ground level. Prevalent wind
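
    A minimal sketch of a wavelet decomposition of such a 10-minute wind speed record is shown below using PyWavelets; the synthetic series, the db4 wavelet and the five decomposition levels are assumptions for illustration, not the study's actual settings.

      import numpy as np
      import pywt

      rng = np.random.default_rng(3)

      # Synthetic 10-minute wind speed record: daily cycle plus random gusts.
      n = 6 * 24 * 30                                          # one month of 10-min samples
      t = np.arange(n)
      wind = (4.5 + 2.0 * np.sin(2 * np.pi * t / (6 * 24))     # daily cycle around 4.5 m/s
              + rng.normal(0, 0.8, n))

      # Multilevel discrete wavelet transform separates scales of variability.
      coeffs = pywt.wavedec(wind, "db4", level=5)
      # coeffs = [A5, D5, D4, D3, D2, D1]: approximation plus details, coarse to fine.
      for level, d in zip(range(5, 0, -1), coeffs[1:]):
          print(f"detail D{level}: energy = {np.sum(d**2):.1f}")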

  12. PRECLOSURE CONSEQUENCE ANALYSES FOR LICENSE APPLICATION

    Energy Technology Data Exchange (ETDEWEB)

    S. Tsai

    2005-01-12

    Radiological consequence analyses are performed for potential releases from normal operations in surface and subsurface facilities and from Category 1 and Category 2 event sequences during the preclosure period. Surface releases from normal repository operations are primarily from radionuclides released from opening a transportation cask during dry transfer operations of spent nuclear fuel (SNF) in Dry Transfer Facility 1 (DTF 1), Dry Transfer Facility 2 (DTF 2), the Canister Handling facility (CHF), or the Fuel Handling Facility (FHF). Subsurface releases from normal repository operations are from resuspension of waste package surface contamination and neutron activation of ventilated air and silica dust from host rock in the emplacement drifts. The purpose of this calculation is to demonstrate that the preclosure performance objectives, specified in 10 CFR 63.111(a) and 10 CFR 63.111(b), have been met for the proposed design and operations in the geologic repository operations area. Preclosure performance objectives are discussed in Section 6.2.3 and are summarized in Tables 1 and 2.

  13. Soil deflation analyses from wind erosion events

    Directory of Open Access Journals (Sweden)

    Lenka Lackóová

    2015-09-01

    Full Text Available There are various methods to assess soil erodibility for wind erosion. This paper focuses on aggregate analysis by a laser particle sizer ANALYSETTE 22 (FRITSCH GmbH), used to determine the size distribution of soil particles detached by wind (deflated particles). Ten soil samples, trapped along the same length of the erosion surface (150–155 m) but at different wind speeds, were analysed. The soil was sampled from a flat, smooth area without vegetation cover or soil crust, not affected by the impact of windbreaks or other barriers, from a depth of at most 2.5 cm. Prior to analysis the samples were prepared according to the relevant specifications. An experiment was also conducted using a device that enables characterisation of the vertical movement of the deflated material. The trapped samples showed no differences in particle size and the proportions of size fractions at different hourly average wind speeds. It was observed that most of the particles travelling in saltation mode (size 50–500 μm; 58–70% of the samples) moved vertically up to 26 cm above the soil surface. At greater heights, particles moving in suspension mode (floating in the air; size < 100 μm) accounted for up to 90% of the samples. This result suggests that the boundary between the two modes of the vertical movement of deflated soil particles lies at about 25 cm above the soil surface.

  14. Genomic analyses of modern dog breeds.

    Science.gov (United States)

    Parker, Heidi G

    2012-02-01

    A rose may be a rose by any other name, but when you call a dog a poodle it becomes a very different animal than if you call it a bulldog. Both the poodle and the bulldog are examples of dog breeds of which there are >400 recognized worldwide. Breed creation has played a significant role in shaping the modern dog from the length of his leg to the cadence of his bark. The selection and line-breeding required to maintain a breed has also reshaped the genome of the dog, resulting in a unique genetic pattern for each breed. The breed-based population structure combined with extensive morphologic variation and shared human environments have made the dog a popular model for mapping both simple and complex traits and diseases. In order to obtain the most benefit from the dog as a genetic system, it is necessary to understand the effect structured breeding has had on the genome of the species. That is best achieved by looking at genomic analyses of the breeds, their histories, and their relationships to each other.

  15. Achieving reasonable conservatism in nuclear safety analyses

    International Nuclear Information System (INIS)

    Jamali, Kamiar

    2015-01-01

    In the absence of methods that explicitly account for uncertainties, seeking reasonable conservatism in nuclear safety analyses can quickly lead to extreme conservatism. The rate of divergence to extreme conservatism is often beyond the expert analysts’ intuitive feeling, but can be demonstrated mathematically. Too much conservatism in addressing the safety of nuclear facilities is not beneficial to society. Using certain properties of lognormal distributions for representation of input parameter uncertainties, example calculations for the risk and consequence of a fictitious facility accident scenario are presented. Results show that there are large differences between the calculated 95th percentiles and the extreme bounding values derived from using all input variables at their upper-bound estimates. Showing the relationship of the mean values to the key parameters of the output distributions, the paper concludes that the mean is the ideal candidate for representation of the value of an uncertain parameter. The mean value is proposed as the metric that is consistent with the concept of reasonable conservatism in nuclear safety analysis, because its value increases towards higher percentiles of the underlying positively skewed distribution with increasing levels of uncertainty. Insensitivity of the results to the actual underlying distributions is briefly demonstrated. - Highlights: • Multiple conservative assumptions can quickly diverge into extreme conservatism. • Mathematics and attractive properties provide basis for wide use of lognormal distribution. • Mean values are ideal candidates for representation of parameter uncertainties. • Mean values are proposed as reasonably conservative estimates of parameter uncertainties
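
    The divergence described here can be reproduced with a small Monte Carlo experiment: for a consequence modelled as a product of lognormal inputs, the product of the inputs' individual upper bounds grows far beyond the 95th percentile of the output, while the mean stays lower still. The parameters below are illustrative only.

      import numpy as np

      rng = np.random.default_rng(0)

      # Consequence modelled as the product of three uncertain lognormal factors
      # (e.g. source term, release fraction, dose conversion); parameters illustrative.
      mus = np.array([0.0, 0.0, 0.0])          # medians = exp(mu) = 1
      sigmas = np.array([0.8, 0.8, 0.8])

      samples = np.prod(rng.lognormal(mus, sigmas, size=(100_000, 3)), axis=1)

      mean = samples.mean()
      p95 = np.percentile(samples, 95)
      # "Extreme conservatism": every input taken at its own 95th percentile upper bound.
      bounding = np.prod(np.exp(mus + 1.645 * sigmas))

      print(f"mean = {mean:.2f}, 95th percentile = {p95:.2f}, bounding product = {bounding:.2f}")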

  16. CFD analyses of coolant channel flowfields

    Science.gov (United States)

    Yagley, Jennifer A.; Feng, Jinzhang; Merkle, Charles L.

    1993-01-01

    The flowfield characteristics in rocket engine coolant channels are analyzed by means of a numerical model. The channels are characterized by large length to diameter ratios, high Reynolds numbers, and asymmetrical heating. At representative flow conditions, the channel length is approximately twice the hydraulic entrance length so that fully developed conditions would be reached for a constant property fluid. For the supercritical hydrogen that is used as the coolant, the strong property variations create significant secondary flows in the cross-plane which have a major influence on the flow and the resulting heat transfer. Comparison of constant and variable property solutions show substantial differences. In addition, the property variations prevent fully developed flow. The density variation accelerates the fluid in the channels increasing the pressure drop without an accompanying increase in heat flux. Analyses of the inlet configuration suggest that side entry from a manifold can affect the development of the velocity profile because of vortices generated as the flow enters the channel. Current work is focused on studying the effects of channel bifurcation on the flow field and the heat transfer characteristics.

  17. Fast and accurate methods for phylogenomic analyses

    Directory of Open Access Journals (Sweden)

    Warnow Tandy

    2011-10-01

    Full Text Available Abstract Background Species phylogenies are not estimated directly, but rather through phylogenetic analyses of different gene datasets. However, true gene trees can differ from the true species tree (and hence from one another) due to biological processes such as horizontal gene transfer, incomplete lineage sorting, and gene duplication and loss, so that no single gene tree is a reliable estimate of the species tree. Several methods have been developed to estimate species trees from estimated gene trees, differing according to the specific algorithmic technique used and the biological model used to explain differences between species and gene trees. Relatively little is known about the relative performance of these methods. Results We report on a study evaluating several different methods for estimating species trees from sequence datasets, simulating sequence evolution under a complex model including indels (insertions and deletions), substitutions, and incomplete lineage sorting. The most important finding of our study is that some fast and simple methods are nearly as accurate as the most accurate methods, which employ sophisticated statistical methods and are computationally quite intensive. We also observe that methods that explicitly consider errors in the estimated gene trees produce more accurate trees than methods that assume the estimated gene trees are correct. Conclusions Our study shows that highly accurate estimations of species trees are achievable, even when gene trees differ from each other and from the species tree, and that these estimations can be obtained using fairly simple and computationally tractable methods.

  18. Mediation Analyses in the Real World

    DEFF Research Database (Denmark)

    Lange, Theis; Starkopf, Liis

    2016-01-01

    The paper by Nguyen et al.1 published in this issue of Epidemiology presents a comparison of the recently suggested inverse odds ratio approach for addressing mediation and a more conventional Baron and Kenny-inspired method. Interestingly, the comparison is not done through a discussion of restr...... it simultaneously ensures that the comparison is based on properties which matter in actual applications, and makes the comparison accessible for a broader audience. In a wider context, the choice to stay close to real-life problems mirrors a general trend within the literature on mediation analysis, namely to put...... applications using the inverse odds ratio approach, as it simply has not had enough time to move from theoretical concept to published applied paper, we do expect to be able to judge the willingness of authors and journals to employ the causal inference-based approach to mediation analyses. Our hope...

  19. Activation analyses for different fusion structural alloys

    International Nuclear Information System (INIS)

    Attaya, H.; Smith, D.

    1991-01-01

    The leading candidate structural materials, viz., the vanadium alloys, the nickel- or manganese-stabilized austenitic steels, and the ferritic steels, are analysed in terms of their induced activation in the TPSS fusion power reactor. The TPSS reactor has 1950 MW fusion power and inboard and outboard average neutron wall loadings of 3.75 and 5.35 MW/m², respectively. The results show that, after one year of continuous operation, the vanadium alloys have the least radioactivity at reactor shutdown. The maximum difference between the induced radioactivity in the vanadium alloys and in the other iron-based alloys occurs at about 10 years after reactor shutdown. At this time, the total reactor radioactivity, using the vanadium alloys, is about two orders of magnitude less than the total reactor radioactivity utilizing any other alloy. The difference is even larger in the first wall: the FW-vanadium activation is 3 orders of magnitude less than the other alloys' FW activation. 2 refs., 7 figs

  20. Statistical analyses of extreme food habits

    International Nuclear Information System (INIS)

    Breuninger, M.; Neuhaeuser-Berthold, M.

    2000-01-01

    This report is a summary of the results of the project ''Statistical analyses of extreme food habits'', which was commissioned by the National Office for Radiation Protection as a contribution to the amendment of the ''General Administrative Regulation to paragraph 45 of the Decree on Radiation Protection: determination of the radiation exposition by emission of radioactive substances from facilities of nuclear technology''. Its aim is to show whether the calculation of the radiation ingested by 95% of the population via food intake, as planned in a provisional draft, overestimates the true exposure. If such an overestimation exists, its magnitude should be determined. It was possible to prove the existence of this overestimation, but its magnitude could only roughly be estimated. To identify its real extent, it is necessary to include the specific activities of the nuclides, which were not available for this investigation. In addition, the report shows how the amounts of food consumption of different groups of foods influence each other and which connections between these amounts should be taken into account in order to estimate the radiation exposition as precisely as possible. (orig.)

  1. Evaluation of the Olympus AU-510 analyser.

    Science.gov (United States)

    Farré, C; Velasco, J; Ramón, F

    1991-01-01

    The selective multitest Olympus AU-510 analyser was evaluated according to the recommendations of the Comision de Instrumentacion de la Sociedad Española de Quimica Clinica and the European Committee for Clinical Laboratory Standards. The evaluation was carried out in two stages: an examination of the analytical units and then an evaluation in routine work conditions. The operational characteristics of the system were also studied. THE FIRST STAGE INCLUDED A PHOTOMETRIC STUDY: depending on the absorbance, the inaccuracy varies from +0.5% to -0.6% at 405 nm and from -5.6% to 10.6% at 340 nm; the imprecision ranges between -0.22% and 0.56% at 405 nm and between 0.09% and 2.74% at 340 nm. Linearity was acceptable, apart from a very low absorbance for NADH at 340 nm; and the imprecision of the serum sample pipetter was satisfactory. TWELVE SERUM ANALYTES WERE STUDIED UNDER ROUTINE CONDITIONS: glucose, urea, urate, cholesterol, triglycerides, total bilirubin, creatinine, phosphate, iron, aspartate aminotransferase, alanine aminotransferase and gamma-glutamyl transferase. The within-run imprecision (CV%) ranged from 0.67% for phosphate to 2.89% for iron and the between-run imprecision from 0.97% for total bilirubin to 7.06% for iron. There was no carryover in a study of the serum sample pipetter. Carry-over studies with the reagent and sample pipetters show some cross-contamination in the iron assay.

  2. PRECLOSURE CONSEQUENCE ANALYSES FOR LICENSE APPLICATION

    International Nuclear Information System (INIS)

    S. Tsai

    2005-01-01

    Radiological consequence analyses are performed for potential releases from normal operations in surface and subsurface facilities and from Category 1 and Category 2 event sequences during the preclosure period. Surface releases from normal repository operations are primarily from radionuclides released from opening a transportation cask during dry transfer operations of spent nuclear fuel (SNF) in Dry Transfer Facility 1 (DTF 1), Dry Transfer Facility 2 (DTF 2), the Canister Handling facility (CHF), or the Fuel Handling Facility (FHF). Subsurface releases from normal repository operations are from resuspension of waste package surface contamination and neutron activation of ventilated air and silica dust from host rock in the emplacement drifts. The purpose of this calculation is to demonstrate that the preclosure performance objectives, specified in 10 CFR 63.111(a) and 10 CFR 63.111(b), have been met for the proposed design and operations in the geologic repository operations area. Preclosure performance objectives are discussed in Section 6.2.3 and are summarized in Tables 1 and 2

  3. Genomic analyses of the CAM plant pineapple.

    Science.gov (United States)

    Zhang, Jisen; Liu, Juan; Ming, Ray

    2014-07-01

    The innovation of crassulacean acid metabolism (CAM) photosynthesis in arid and/or low CO2 conditions is a remarkable case of adaptation in flowering plants. As the most important crop that utilizes CAM photosynthesis, the genetic and genomic resources of pineapple have been developed over many years. Genetic diversity studies using various types of DNA markers led to the reclassification of the two genera Ananas and Pseudananas and nine species into one genus Ananas and two species, A. comosus and A. macrodontes, with five botanical varieties in A. comosus. Five genetic maps have been constructed using F1 or F2 populations, and high-density genetic maps generated by genotype sequencing are essential resources for sequencing and assembling the pineapple genome and for marker-assisted selection. There are abundant expressed sequence tag resources but limited genomic sequences in pineapple. Genes involved in the CAM pathway have been analysed in several CAM plants, but only a few of them are from pineapple. A reference genome of pineapple is being generated and will accelerate genetic and genomic research in this major CAM crop. This reference genome provides the foundation for studying the origin and regulatory mechanism of CAM photosynthesis, and the opportunity to evaluate the classification of Ananas species and botanical cultivars. © The Author 2014. Published by Oxford University Press on behalf of the Society for Experimental Biology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  4. Social Media Analyses for Social Measurement

    Science.gov (United States)

    Schober, Michael F.; Pasek, Josh; Guggenheim, Lauren; Lampe, Cliff; Conrad, Frederick G.

    2016-01-01

    Demonstrations that analyses of social media content can align with measurement from sample surveys have raised the question of whether survey research can be supplemented or even replaced with less costly and burdensome data mining of already-existing or “found” social media content. But just how trustworthy such measurement can be—say, to replace official statistics—is unknown. Survey researchers and data scientists approach key questions from starting assumptions and analytic traditions that differ on, for example, the need for representative samples drawn from frames that fully cover the population. New conversations between these scholarly communities are needed to understand the potential points of alignment and non-alignment. Across these approaches, there are major differences in (a) how participants (survey respondents and social media posters) understand the activity they are engaged in; (b) the nature of the data produced by survey responses and social media posts, and the inferences that are legitimate given the data; and (c) practical and ethical considerations surrounding the use of the data. Estimates are likely to align to differing degrees depending on the research topic and the populations under consideration, the particular features of the surveys and social media sites involved, and the analytic techniques for extracting opinions and experiences from social media. Traditional population coverage may not be required for social media content to effectively predict social phenomena to the extent that social media content distills or summarizes broader conversations that are also measured by surveys. PMID:27257310

  5. Reliability Analyses of Groundwater Pollutant Transport

    Energy Technology Data Exchange (ETDEWEB)

    Dimakis, Panagiotis

    1997-12-31

    This thesis develops a probabilistic finite element model for the analysis of groundwater pollution problems. Two computer codes were developed: (1) one using the finite element technique to solve the two-dimensional steady-state equations of groundwater flow and pollution transport, and (2) a first-order reliability method code that can perform a probabilistic analysis of any given analytical or numerical equation. The two codes were connected into one model, PAGAP (Probability Analysis of Groundwater And Pollution). PAGAP can be used to obtain (1) the probability that the concentration at a given point at a given time will exceed a specified value, (2) the probability that the maximum concentration at a given point will exceed a specified value and (3) the probability that the residence time at a given point will exceed a specified period. PAGAP can be used as a tool for assessment purposes and risk analyses, for instance to assess the efficiency of a proposed remediation technique or to study the effects of parameter distributions for a given problem (sensitivity study). The model has been applied to study the largest self-sustained, precipitation-controlled aquifer in northern Europe, which underlies Oslo's new major airport. 92 refs., 187 figs., 26 tabs.
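
    PAGAP couples a finite element flow and transport solver with a first-order reliability method; the sketch below illustrates only the underlying exceedance-probability idea, using a plain Monte Carlo loop over a simple one-dimensional analytical transport solution. The parameter distributions, distances and the concentration limit are hypothetical, and the code is not a reimplementation of PAGAP.

        import math
        import random

        def concentration(x, t, v, D, c0=1.0):
            """Simplified 1-D advection-dispersion solution for a continuous source."""
            return 0.5 * c0 * math.erfc((x - v * t) / (2.0 * math.sqrt(D * t)))

        def prob_exceedance(x, t, c_limit, n=100_000, seed=1):
            """Monte Carlo estimate of P[C(x, t) > c_limit] for uncertain v and D."""
            rng = random.Random(seed)
            hits = 0
            for _ in range(n):
                v = rng.lognormvariate(math.log(0.5), 0.3)   # hypothetical velocity [m/d]
                D = rng.lognormvariate(math.log(0.1), 0.5)   # hypothetical dispersion [m^2/d]
                if concentration(x, t, v, D) > c_limit:
                    hits += 1
            return hits / n

        print(prob_exceedance(x=100.0, t=150.0, c_limit=0.2))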

  6. System for analysing sickness absenteeism in Poland.

    Science.gov (United States)

    Indulski, J A; Szubert, Z

    1997-01-01

    The National System of Sickness Absenteeism Statistics has been functioning in Poland since 1977 as part of the national health statistics. The system is based on a 15-percent random sample of copies of certificates of temporary incapacity for work issued by all health care units and authorised private medical practitioners. A certificate of temporary incapacity for work is received by every insured employee who is compelled to stop working due to sickness, accident, or the necessity to care for a sick member of his/her family. The certificate is required on the first day of sickness. Analyses of disease- and accident-related sickness absenteeism carried out each year in Poland within the statistical system lead to the following main conclusions: 1. Diseases of the musculoskeletal and peripheral nervous systems, which when combined account for one third of total sickness absenteeism, are a major health problem of the working population in Poland. During the past five years, incapacity for work caused by these diseases in males increased 2.5 times. 2. Circulatory diseases, in particular arterial hypertension and ischaemic heart disease (41% and 27% of sickness days, respectively), constitute a major health problem among males of working age, especially in the 40-and-older age group. Absenteeism due to these diseases has increased in males more than twofold.

  7. Comparative analyses of bidirectional promoters in vertebrates

    Directory of Open Access Journals (Sweden)

    Taylor James

    2008-05-01

    Full Text Available Background: Orthologous genes with deep phylogenetic histories are likely to retain similar regulatory features. In this report we utilize orthology assignments for pairs of genes co-regulated by bidirectional promoters to map the ancestral history of the promoter regions. Results: Our mapping of bidirectional promoters from humans to fish shows that many such promoters emerged after the divergence of chickens and fish. Furthermore, annotations of promoters in deep phylogenies enable detection of missing data or assembly problems present in higher vertebrates. The functional importance of bidirectional promoters is indicated by selective pressure to maintain the arrangement of genes regulated by the promoter over long evolutionary time spans. Characteristics unique to bidirectional promoters are further elucidated using a technique for unsupervised classification, known as ESPERR. Conclusion: Results of these analyses will aid in our understanding of the evolution of bidirectional promoters, including whether the regulation of two genes evolved as a consequence of their proximity or if function dictated their co-regulation.

  8. Thermomagnetic Analyses to Test Concrete Stability

    Science.gov (United States)

    Geiss, C. E.; Gourley, J. R.

    2017-12-01

    Over the past decades pyrrhotite-containing aggregate has been used in concrete to build basements and foundations in central Connecticut. The sulphur in the pyrrhotite reacts to form several secondary minerals, and the associated changes in volume lead to a loss of structural integrity. As a result, hundreds of homes have been rendered worthless because remediation costs often exceed their value, and the value of many other homes constructed during the same period is in question because concrete provenance and potential future structural issues are unknown. While minor abundances of pyrrhotite are difficult to detect or quantify by traditional means, the mineral is easily identified through its magnetic properties. All concrete samples from affected homes show a clear increase in magnetic susceptibility above 220°C due to the γ-transition of Fe9S10 [1] and a clearly defined Curie temperature near 320°C for Fe7S8. X-ray analyses confirm the presence of pyrrhotite and ettringite in these samples. Synthetic mixtures of commercially available concrete and pyrrhotite show that the method is semiquantitative but needs to be calibrated for specific pyrrhotite mineralogies. 1. Schwarz, E.J., Magnetic properties of pyrrhotite and their use in applied geology and geophysics. Geological Survey of Canada, Ottawa, ON, Canada, 1975.

  9. Social Media Analyses for Social Measurement.

    Science.gov (United States)

    Schober, Michael F; Pasek, Josh; Guggenheim, Lauren; Lampe, Cliff; Conrad, Frederick G

    2016-01-01

    Demonstrations that analyses of social media content can align with measurement from sample surveys have raised the question of whether survey research can be supplemented or even replaced with less costly and burdensome data mining of already-existing or "found" social media content. But just how trustworthy such measurement can be-say, to replace official statistics-is unknown. Survey researchers and data scientists approach key questions from starting assumptions and analytic traditions that differ on, for example, the need for representative samples drawn from frames that fully cover the population. New conversations between these scholarly communities are needed to understand the potential points of alignment and non-alignment. Across these approaches, there are major differences in (a) how participants (survey respondents and social media posters) understand the activity they are engaged in; (b) the nature of the data produced by survey responses and social media posts, and the inferences that are legitimate given the data; and (c) practical and ethical considerations surrounding the use of the data. Estimates are likely to align to differing degrees depending on the research topic and the populations under consideration, the particular features of the surveys and social media sites involved, and the analytic techniques for extracting opinions and experiences from social media. Traditional population coverage may not be required for social media content to effectively predict social phenomena to the extent that social media content distills or summarizes broader conversations that are also measured by surveys.

  10. Validating experimental and theoretical Langmuir probe analyses

    Science.gov (United States)

    Pilling, L. S.; Carnegie, D. A.

    2007-08-01

    Analysis of Langmuir probe characteristics contains a paradox in that it is unknown a priori which theory is applicable before it is applied. Theories are often assumed to be correct when certain criteria are met, although meeting those criteria may not validate the approach used. We have analysed Langmuir probe data from cylindrical double and single probes acquired from a dc discharge plasma over a wide variety of conditions. This discharge contains a dual-temperature distribution, and hence fitting a theoretically generated curve is impractical. To determine the densities, an examination of the current theories was necessary. For conditions where the probe radius is of the same order of magnitude as the Debye length, the gradient expected for orbital-motion-limited (OML) theory is approximately the same as the radial-motion gradients. An analysis of the 'gradients' from radial-motion theory was able to resolve the differences from the OML gradient value of two. The method was also able to determine whether radial or OML theories applied without knowledge of the electron temperature, or separation of the ion and electron contributions. Only the value of the space potential is necessary to determine the applicable theory.

  11. How to implement ITIL successfully?

    OpenAIRE

    Wang, Jingwen; Khosravi Sereshki, Hami

    2010-01-01

    The purpose of this thesis is to reveal how the Information Technology Infrastructure Library (ITIL) should be implemented in an organization in an efficient and effective way to achieve the goal of reducing wastage, cutting costs and increasing customers' satisfaction. There are many books dealing with ITIL, but these publications do not prescribe how to adopt, adapt or implement the guidelines as part of a service management strategy; it would therefore seem useful to explore implementation strateg...

  12. Implementing New Ways of Working

    DEFF Research Database (Denmark)

    Granlien, Maren Sander; Hertzum, Morten

    2009-01-01

    Successful deployment of information technology (IT) involves implementation of new ways of working. Under-recognition of this organizational element of implementation entails considerable risk of not attaining the benefits that motivated deployment, yet knowledge of how to work systematically...... were devised and performed as part of the study, significantly lowered the number of records that violated the procedure. This positive effect was, however, not achieved until multiple interventions had been employed, and there is some indication that the effect may be wearing off after...... the interventions have ended. We discuss the implications of these results for efforts to work systematically with the organizational implementation of IT systems....

  13. Building integrated business environments: analysing open-source ESB

    Science.gov (United States)

    Martínez-Carreras, M. A.; García Jimenez, F. J.; Gómez Skarmeta, A. F.

    2015-05-01

    Integration and interoperability are two concepts that have gained significant prominence in the business field, providing tools which enable enterprise application integration (EAI). In this sense, enterprise service bus (ESB) has played a crucial role as the underpinning technology for creating integrated environments in which companies may connect all their legacy-applications. However, the potential of these technologies remains unknown and some important features are not used to develop suitable business environments. The aim of this paper is to describe and detail the elements for building the next generation of integrated business environments (IBE) and to analyse the features of ESBs as the core of this infrastructure. For this purpose, we evaluate how well-known open-source ESB products fulfil these needs. Moreover, we introduce a scenario in which the collaborative system 'Alfresco' is integrated in the business infrastructure. Finally, we provide a comparison of the different open-source ESBs available for IBE requirements. According to this study, Fuse ESB provides the best results, considering features such as support for a wide variety of standards and specifications, documentation and implementation, security, advanced business trends, ease of integration and performance.

  14. Spark and HPC for High Energy Physics Data Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Sehrish, Saba; Kowalkowski, Jim; Paterno, Marc

    2017-05-01

    A full High Energy Physics (HEP) data analysis is divided into multiple data reduction phases. Processing within these phases is extremely time-consuming; therefore, intermediate results are stored in files held in mass storage systems and referenced as part of large datasets. This processing model limits what can be done with interactive data analytics. Growth in the size and complexity of experimental datasets, along with emerging big data tools, is beginning to change the traditional ways of doing data analyses. Use of big data tools for HEP analysis looks promising, mainly because extremely large HEP datasets can be represented and held in memory across a system, and accessed interactively by encoding an analysis using high-level programming abstractions. The mainstream tools, however, are not designed for scientific computing or for exploiting the available HPC platform features. We use an example from the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) in Geneva, Switzerland. The LHC is the highest energy particle collider in the world. Our use case focuses on searching for new types of elementary particles explaining Dark Matter in the universe. We use HDF5 as our input data format, and Spark to implement the use case. We show the benefits and limitations of using Spark with HDF5 on Edison at NERSC.
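
    The abstract does not include code, but the pattern it describes, reading event data from HDF5 and expressing a selection with high-level Spark operations, can be sketched as follows. The file name, dataset paths and selection cuts are hypothetical placeholders rather than the CMS analysis itself, and the HDF5 file is read on the driver purely to keep the sketch short.

        import h5py
        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("hep-hdf5-sketch").getOrCreate()

        # Read event-level columns from a (hypothetical) HDF5 file on the driver.
        with h5py.File("events.h5", "r") as f:
            met = f["events/met"][:]      # missing transverse energy per event
            njets = f["events/njets"][:]  # jet multiplicity per event

        # Distribute the events and express the selection as high-level operations.
        events = spark.sparkContext.parallelize(list(zip(met.tolist(), njets.tolist())))
        selected = events.filter(lambda e: e[0] > 200.0 and e[1] >= 2)  # hypothetical cuts

        print("selected events:", selected.count())
        spark.stop()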

  15. Bench top and portable mineral analysers, borehole core analysers and in situ borehole logging

    International Nuclear Information System (INIS)

    Howarth, W.J.; Watt, J.S.

    1982-01-01

    Bench top and portable mineral analysers are usually based on balanced filter techniques using scintillation detectors or on low resolution proportional detectors. The application of radioisotope x-ray techniques to in situ borehole logging is increasing and is particularly suited to logging for tin and elements of higher atomic number.

  16. Trigeneration Possibilities of Implementation at CERN

    CERN Document Server

    Kühnl-Kinel, J

    1999-01-01

    Optimum distribution of energy supply systems can result in large savings in industrial facilities and production devices. Identifying the configuration of existing equipment and its loading, in order to minimize total energy consumption and at the same time satisfy given load demands, has very high payback potential. This paper presents the principle of trigeneration, a technology that offers a highly efficient way of converting primary fuel (gas, oil) into useful energy in the form of electricity, heat and chilled water simultaneously. It explains the different factors that must be considered for such systems to be economically feasible. Some examples of industrial trigeneration systems are analysed and discussed to illustrate the application. The possibility of implementing trigeneration at CERN is also discussed, taking into account the existing cogeneration system, the power supply structure, secondary energy demands, as well as future developments in our energy policy.

  17. Implementing database system for LHCb publications page

    CERN Document Server

    Abdullayev, Fakhriddin

    2017-01-01

    LHCb is one of the main detectors at the Large Hadron Collider, where physicists and scientists work together on high-precision measurements of matter-antimatter asymmetries and searches for rare and forbidden decays, with the aim of discovering new and unexpected forces. The work consists not only of analyzing data collected from experiments but also of publishing the results of those analyses. The LHCb publications are gathered on the LHCb publications page to maximize their availability to both LHCb members and the high energy community. In this project a new database system was implemented for the LHCb publications page. This will help to improve access to research papers for scientists and provide better integration with the current CERN library website and other systems.

  18. Neural Networks: Implementations and Applications

    OpenAIRE

    Vonk, E.; Veelenturf, L.P.J.; Jain, L.C.

    1996-01-01

    Artificial neural networks, also called neural networks, have been used successfully in many fields including engineering, science and business. This paper presents the implementation of several neural network simulators and their applications in character recognition and other engineering areas

  19. FAA Loran early implementation project

    Science.gov (United States)

    1990-03-01

    The Early Implementation Project (EIP), established by FAA Administrator Admiral Donald C. Engen, was the initial step in the process of Loran integration into the National Airspace System (NAS). The EIP was designed to give the FAA and the Loran ...

  20. Document Management Projects: implementation guide

    Directory of Open Access Journals (Sweden)

    Beatriz Bagoin Guimarães

    2016-12-01

    Full Text Available Records Management System implementation is a complex process that needs to be executed by a multidisciplinary team and involves components of apparently unrelated areas such as archival science, computer engineering, law, project management and human resource management. All of them are crucial and complementary to guarantee a full and functional implementation of a system and a perfect fusion with the connected processes and procedures. The purpose of this work is to provide organizations with a basic guide to Records Management Project implementation, beginning with the steps prior to acquiring the system, followed by the main project activities and concluding with the post-implementation procedures of continuous improvement and system maintenance.

  1. NMCI: History, Implementation, and Change

    National Research Council Canada - National Science Library

    Taylor, Gregory S

    2006-01-01

    ...). The hope was to have the new intranet fully operational in just two years, but the program encountered so many difficulties that, almost six years later, the initial implementation process is still underway...

  2. Nurses' experiences of guideline implementation

    DEFF Research Database (Denmark)

    Alanen, Seija; Välimäki, Marita; Kaila, Minna

    2009-01-01

    AIMS: The aim of the study was to address the following questions: What kind of experiences do primary care nurses have of guideline implementation? What do nurses think are the most important factors affecting the adoption of guidelines? BACKGROUND: The implementation of clinical guidelines seems...... to be dependent on multiple context-specific factors. This study sets out to explore the experiences of primary care nurses concerning guideline implementation. DESIGN: Qualitative interview. METHODS: Data were generated by four focus group interviews involving nurses working in out-patient services in primary...... to nurses, (iii) factors related to the anticipated consequences and (iv) factors related to the patient group. Nurses' awareness and acceptance of guidelines and the anticipated positive consequences facilitate the implementation of guidelines. Organisational support, especially the adapting of guidelines...

  3. Parallel implementation of geometric transformations

    Energy Technology Data Exchange (ETDEWEB)

    Clarke, K A; Ip, H H.S.

    1982-10-01

    An implementation of digitized picture rotation and magnification based on Weiman's algorithm is presented. On a programmable array machine, routines performing small transformations can be coded efficiently. The method illustrates the interpolative nature of the algorithm. 6 references.

  4. Implementing an Employee Assistance Program.

    Science.gov (United States)

    Gam, John; And Others

    1983-01-01

    Describes in detail the implementation of an employee assistance program in a textile plant. Reviews the historical development, referral process, and termination guidelines of the program and contains descriptive statistics for six periods of the program's operation. (Author/JAC)

  5. Coral Reef Protection Implementation Plan

    National Research Council Canada - National Science Library

    Lobel, Lisa

    2000-01-01

    This document identifies policies and actions to implement the Department of Defense's responsibilities under Executive Order 13089 on Coral Reef Protection, and is a requirement of the interim Task...

  6. Transient Seepage for Levee Engineering Analyses

    Science.gov (United States)

    Tracy, F. T.

    2017-12-01

    Historically, steady-state seepage analyses have been a key tool for designing levees by practicing engineers. However, with the advances in computer modeling, transient seepage analysis has become a potentially viable tool. A complication is that the levees usually have partially saturated flow, and this is significantly more complicated in transient flow. This poster illustrates four elements of our research in partially saturated flow relating to the use of transient seepage for levee design: (1) a comparison of results from SEEP2D, SEEP/W, and SLIDE for a generic levee cross section common to the southeastern United States; (2) the results of a sensitivity study of varying saturated hydraulic conductivity, the volumetric water content function (as represented by van Genuchten), and volumetric compressibility; (3) a comparison of when soils do and do not exhibit hysteresis, and (4) a description of proper and improper use of transient seepage in levee design. The variables considered for the sensitivity and hysteresis studies are pore pressure beneath the confining layer at the toe, the flow rate through the levee system, and a levee saturation coefficient varying between 0 and 1. Getting results for SEEP2D, SEEP/W, and SLIDE to match proved more difficult than expected. After some effort, the results matched reasonably well. Differences in results were caused by various factors, including bugs, different finite element meshes, different numerical formulations of the system of nonlinear equations to be solved, and differences in convergence criteria. Varying volumetric compressibility affected the above test variables the most. The levee saturation coefficient was most affected by the use of hysteresis. The improper use of pore pressures from a transient finite element seepage solution imported into a slope stability computation was found to be the most grievous mistake in using transient seepage in the design of levees.
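
    The volumetric water content function varied in the sensitivity study above is the van Genuchten retention model. A minimal sketch of that curve is given below; the parameter values are hypothetical for a silty levee fill and are not those used in the poster.

        def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
            """Volumetric water content for suction head h (h > 0), after van Genuchten (1980)."""
            if h <= 0.0:                                 # saturated zone: no suction
                return theta_s
            m = 1.0 - 1.0 / n
            se = (1.0 + (alpha * h) ** n) ** (-m)        # effective saturation
            return theta_r + (theta_s - theta_r) * se

        # Hypothetical parameters for a silty levee fill
        for suction in (0.0, 0.5, 1.0, 2.0, 5.0):        # metres of suction head
            theta = van_genuchten_theta(suction, theta_r=0.05, theta_s=0.42, alpha=1.5, n=1.6)
            print(f"h = {suction:4.1f} m  ->  theta = {theta:.3f}")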

  7. Summary of the analyses for recovery factors

    Science.gov (United States)

    Verma, Mahendra K.

    2017-07-17

    Introduction: In order to determine the hydrocarbon potential of oil reservoirs within the U.S. sedimentary basins for which the carbon dioxide enhanced oil recovery (CO2-EOR) process has been considered suitable, the CO2 Prophet model was chosen by the U.S. Geological Survey (USGS) to be the primary source for estimating recovery-factor values for individual reservoirs. The choice was made because of the model’s reliability and the ease with which it can be used to assess a large number of reservoirs. The other two approaches—the empirical decline curve analysis (DCA) method and a review of published literature on CO2-EOR projects—were deployed to verify the results of the CO2 Prophet model. This chapter discusses the results from CO2 Prophet (chapter B, by Emil D. Attanasi, this report) and compares them with results from decline curve analysis (chapter C, by Hossein Jahediesfanjani) and those reported in the literature for selected reservoirs with adequate data for analyses (chapter D, by Ricardo A. Olea). To estimate the technically recoverable hydrocarbon potential for oil reservoirs where CO2-EOR has been applied, two of the three approaches—CO2 Prophet modeling and DCA—do not include analysis of economic factors, while the third approach—review of published literature—implicitly includes economics. For selected reservoirs, DCA has provided estimates of the technically recoverable hydrocarbon volumes, which, in combination with calculated amounts of original oil in place (OOIP), helped establish incremental CO2-EOR recovery factors for individual reservoirs. The review of published technical papers and reports has provided substantial information on recovery factors for 70 CO2-EOR projects that are either commercially profitable or classified as pilot tests. When comparing the results, it is important to bear in mind the differences and limitations of these three approaches.

  8. The ABC (Analysing Biomolecular Contacts)-database

    Directory of Open Access Journals (Sweden)

    Walter Peter

    2007-03-01

    Full Text Available As protein-protein interactions are one of the basic mechanisms in most cellular processes, it is desirable to understand the molecular details of protein-protein contacts and ultimately be able to predict which proteins interact. Interface areas on a protein surface that are involved in protein interactions exhibit certain characteristics. Therefore, several attempts have been made to distinguish protein interactions from each other and to categorize them. One way of classifying them is into transient and permanent interactions. Previously, two of the authors analysed several properties of transient complexes, such as the amino acid and secondary structure element composition and pairing preferences. Certainly, interfaces can be characterized by many more possible attributes, and this is a subject of intense ongoing research. Although several freely available online databases exist that illuminate various aspects of protein-protein interactions, we decided to construct a new database collecting all desired interface features and allowing for facile selection of subsets of complexes. MySQL is used as the database server and the program logic is written in Java. Furthermore, several class extensions and tools were included, such as JMOL to visualize the interfaces and JFreeChart for the representation of diagrams and statistics. The contact data is automatically generated from standard PDB files by a tcl/tk script running through the molecular visualization package VMD. Currently the database contains 536 interfaces extracted from 479 PDB files and it can be queried by various types of parameters. Here, we describe the database design and demonstrate its usefulness with a number of selected features.

  9. Trend analyses with river sediment rating curves

    Science.gov (United States)

    Warrick, Jonathan A.

    2015-01-01

    Sediment rating curves, which are fitted relationships between river discharge (Q) and suspended-sediment concentration (C), are commonly used to assess patterns and trends in river water quality. In many of these studies it is assumed that rating curves have a power-law form (i.e., C = aQ^b, where a and b are fitted parameters). Two fundamental questions about the utility of these techniques are assessed in this paper: (i) How well do the parameters, a and b, characterize trends in the data? (ii) Are trends in rating curves diagnostic of changes to river water or sediment discharge? As noted in previous research, the offset parameter, a, is not an independent variable for most rivers, but rather strongly dependent on b and Q. Here it is shown that a is a poor metric for trends in the vertical offset of a rating curve, and a new parameter, â, as determined by the discharge-normalized power function [C = â (Q/QGM)^b], where QGM is the geometric mean of the Q values sampled, provides a better characterization of trends. However, these techniques must be applied carefully, because curvature in the relationship between log(Q) and log(C), which exists for many rivers, can produce false trends in â and b. Also, it is shown that trends in â and b are not uniquely diagnostic of river water or sediment supply conditions. For example, an increase in â can be caused by an increase in sediment supply, a decrease in water supply, or a combination of these conditions. Large changes in water and sediment supplies can occur without any change in the parameters, â and b. Thus, trend analyses using sediment rating curves must include additional assessments of the time-dependent rates and trends of river water, sediment concentrations, and sediment discharge.
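
    A minimal sketch of fitting the discharge-normalized power function C = â (Q/QGM)^b by ordinary least squares in log space is shown below; the paired discharge and concentration values are hypothetical, not data from the paper.

        import numpy as np

        # Hypothetical paired observations of discharge Q (m^3/s) and concentration C (mg/L)
        Q = np.array([12., 35., 60., 110., 240., 410., 780., 1500.])
        C = np.array([20., 45., 70., 130., 260., 380., 700., 1200.])

        Q_gm = np.exp(np.mean(np.log(Q)))          # geometric mean of the sampled discharges

        # log C = log(a_hat) + b * log(Q / Q_gm); fit a degree-1 polynomial in log space
        b, log_a_hat = np.polyfit(np.log(Q / Q_gm), np.log(C), 1)
        a_hat = np.exp(log_a_hat)

        print(f"Q_gm = {Q_gm:.1f}, a_hat = {a_hat:.1f}, b = {b:.2f}")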

  10. BN-600 hybrid core benchmark analyses

    International Nuclear Information System (INIS)

    Kim, Y.I.; Stanculescu, A.; Finck, P.; Hill, R.N.; Grimm, K.N.

    2003-01-01

    Benchmark analyses for the hybrid BN-600 reactor, which contains three uranium enrichment zones and one plutonium zone in the core, have been performed within the framework of an IAEA-sponsored Coordinated Research Project. The results for several relevant reactivity parameters obtained by the participants with their own state-of-the-art basic data and codes were compared in terms of calculational uncertainty, and their effects on the ULOF transient behavior of the hybrid BN-600 core were evaluated. The comparison of the diffusion and transport results obtained for the homogeneous representation generally shows good agreement for most parameters between the RZ and HEX-Z models. The burnup effect and the heterogeneity effect on most reactivity parameters also show good agreement for the HEX-Z diffusion and transport theory results. A large difference noticed for the sodium and steel density coefficients is mainly due to differences in the spatial coefficient predictions for non-fuelled regions. The burnup reactivity loss was evaluated to be 0.025 (4.3 $) within ∼ 5.0% standard deviation. The heterogeneity effect on most reactivity coefficients was estimated to be small. The heterogeneity treatment reduced the control rod worth by 2.3%. The heterogeneity effect on the k-eff and control rod worth appeared to differ strongly depending on the heterogeneity treatment method. A substantial spread noticed for several reactivity coefficients did not have a significant impact on the transient behavior prediction. This result is attributable to compensating effects between several reactivity effects and the specific design of the partially MOX fuelled hybrid core. (author)

  11. Clinical guideline implementation strategies for common mental health disorders.

    Science.gov (United States)

    Moreno, Eliana María; Moriana, Juan Antonio

    2016-01-01

    There has been a considerable proliferation of clinical guidelines recently, but their practical application is low, and organisations do not always implement their own guidelines. The aim of this study is to analyse and describe key elements of the strategies and resources designed by the National Institute for Health and Care Excellence for the implementation of guidelines for common mental health disorders in adults, which are among the most prevalent worldwide. A systematic review was performed following the PRISMA model. Resources, tools and implementation materials were included and categorised considering type, objectives, target and scope. A total of 212 elements were analysed, of which 33.5% and 24.5% are related to the implementation of the generalized anxiety and depression guidelines, respectively. Applied tools designed to estimate costs and assess the feasibility of setting up at the local level are the most frequent type of resource. The study highlights the wide variety of available materials, classified into 3 main strategies: tools targeting the professionals (30.6%), structural (26.4%), and organizational (24%). Developing guidelines is not enough; it is also necessary to promote their implementation in order to encourage their application. The resources and strategies described in this study may be potentially applicable to other contexts, and helpful to public health managers and professionals in the design of programmes and in the process of informed decision making to help increase access to efficient treatments. Copyright © 2015. Published by Elsevier España.

  12. Implementing Open Innovation: The Case of Natura, IBM and Siemens

    Directory of Open Access Journals (Sweden)

    Cely Ades

    2013-05-01

    Full Text Available This paper analyses three case firms whose innovation management processes have been consolidated. The companies Natura, IBM (Brazilian Subsidiary) and Siemens (ChemTech/Brazil) were studied with the purpose of analysing the implementation of OI, particularly in terms of: (a) its alignment with existing corporate strategy; (b) its requirements such as culture, skill and motivation; (c) the strategy and the implementation process; (d) the results achieved; and (e) the present barriers and enablers. The research is qualitative in nature and employs a descriptive approach. The main results of this study, obtained using a method called ‘Collective Subject Speech’, show that the implementation of OI, both structured and non-structured, is mainly challenged by cultural issues. It has been observed that the implementation of the OI process is at an embryonic stage in all case firms and that this occurs along with investments in closed innovation, meaning that OI results cannot be explored at this stage of the implementation, as there is a long way to go to consolidate these practices in the case firms studied.

  13. Australia; Basel II Implementation Assessment

    OpenAIRE

    International Monetary Fund

    2010-01-01

    The key findings of Australia’s BASEL II implementation assessment are presented. The Australian Prudential Regulation Authority (APRA) allocated sufficient resources, including highly skilled staff, prior to the Basel II start date, and the outcome has been a robust and high-quality implementation that has built upon and substantially strengthened the risk-management capabilities of major banks. The quality of leadership and commitment by all involved has been instrumental in the success of ...

  14. Document Management Projects: implementation guide

    OpenAIRE

    Beatriz Bagoin Guimarães

    2016-01-01

    Records Management System implementation is a complex process that needs to be executed by a multidisciplinary team and involves components of apparently non-related areas such as archival science, computer engineering, law, project management and human resource management. All of them are crucial and complementary to guarantee a full and functional implementation of a system and a perfect fusion with the connected processes and procedures. The purpose of this work is to provide organizations...

  15. Decommissioning funding: ethics, implementation, uncertainties

    International Nuclear Information System (INIS)

    2006-01-01

    This status report on Decommissioning Funding: Ethics, Implementation, Uncertainties also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). The report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems. (authors)

  16. Auditing Marketing Strategy Implementation Success

    OpenAIRE

    Herhausen, Dennis; Egger, Thomas; Oral, Cansu

    2014-01-01

    What makes a marketing strategy implementation successful and how can managers measure this success? To answer these questions, we developed a two-step audit approach. First, managers should measure the implementation success regarding effectiveness, efficiency, performance outcomes, and strategic embeddedness. Second, they should explore the reasons that have led to success or failure by regarding managerial, leadership, and environmental traps. Doing so will also provide corrective action p...

  17. Spacelab user implementation assessment study. (Software requirements analysis). Volume 2: Technical report

    Science.gov (United States)

    1976-01-01

    The engineering analyses and evaluation studies conducted for the Software Requirements Analysis are discussed. Included are the development of the study data base, synthesis of implementation approaches for software required by both mandatory onboard computer services and command/control functions, and identification and implementation of software for ground processing activities.

  18. Implementing an Open Source Learning Management System: A Critical Analysis of Change Strategies

    Science.gov (United States)

    Uys, Philip M.

    2010-01-01

    This paper analyses the change and innovation strategies that Charles Sturt University (CSU) used from 2007 to 2009 during the implementation and mainstreaming of an open source learning management system (LMS), Sakai, named locally as "CSU Interact". CSU was in January 2008 the first Australian University to implement an open source…

  19. Implementing a Flipped Classroom Approach in a University Numerical Methods Mathematics Course

    Science.gov (United States)

    Johnston, Barbara M.

    2017-01-01

    This paper describes and analyses the implementation of a "flipped classroom" approach, in an undergraduate mathematics course on numerical methods. The approach replaced all the lecture contents by instructor-made videos and was implemented in the consecutive years 2014 and 2015. The sequential case study presented here begins with an…

  20. Promoting Action on Research Implementation in Health Services framework applied to TeamSTEPPS implementation in small rural hospitals.

    Science.gov (United States)

    Ward, Marcia M; Baloh, Jure; Zhu, Xi; Stewart, Greg L

    A particularly useful model for examining implementation of quality improvement interventions in health care settings is the PARIHS (Promoting Action on Research Implementation in Health Services) framework developed by Kitson and colleagues. The PARIHS framework proposes three elements (evidence, context, and facilitation) that are related to successful implementation. An evidence-based program focused on quality enhancement in health care, termed TeamSTEPPS (Team Strategies and Tools to Enhance Performance and Patient Safety), has been widely promoted by the Agency for Healthcare Research and Quality, but research is needed to better understand its implementation. We apply the PARIHS framework in studying TeamSTEPPS implementation to identify elements that are most closely related to successful implementation. Quarterly interviews were conducted over a 9-month period in 13 small rural hospitals that implemented TeamSTEPPS. Interview quotes that were related to each of the PARIHS elements were identified using directed content analysis. Transcripts were also scored quantitatively, and bivariate regression analysis was employed to explore relationships between PARIHS elements and successful implementation related to planning activities. The current findings provide support for the PARIHS framework and identified two of the three PARIHS elements (context and facilitation) as important contributors to successful implementation. This study applies the PARIHS framework to TeamSTEPPS, a widely used quality initiative focused on improving health care quality and patient safety. By focusing on small rural hospitals that undertook this quality improvement activity of their own accord, our findings represent effectiveness research in an understudied segment of the health care delivery system. By identifying context and facilitation as the most important contributors to successful implementation, these analyses provide a focus for efficient and effective sustainment of Team

  1. Gen IV Materials Handbook Implementation Plan

    International Nuclear Information System (INIS)

    Rittenhouse, P.; Ren, W.

    2005-01-01

    A Gen IV Materials Handbook is being developed to provide an authoritative single source of highly qualified structural materials information and materials properties data for use in design and analyses of all Generation IV Reactor Systems. The Handbook will be responsive to the needs expressed by all of the principal government, national laboratory, and private company stakeholders of Gen IV Reactor Systems. The Gen IV Materials Handbook Implementation Plan provided here addresses the purpose, rationale, attributes, and benefits of the Handbook and will detail its content, format, quality assurance, applicability, and access. Structural materials, both metallic and ceramic, for all Gen IV reactor types currently supported by the Department of Energy (DOE) will be included in the Gen IV Materials Handbook. However, initial emphasis will be on materials for the Very High Temperature Reactor (VHTR). Descriptive information (e.g., chemical composition and applicable technical specifications and codes) will be provided for each material along with an extensive presentation of mechanical and physical property data including consideration of temperature, irradiation, environment, etc. effects on properties. Access to the Gen IV Materials Handbook will be internet-based with appropriate levels of control. Information and data in the Handbook will be configured to allow search by material classes, specific materials, specific information or property class, specific property, data parameters, and individual data points identified with materials parameters, test conditions, and data source. Details on all of these as well as proposed applicability and consideration of data quality classes are provided in the Implementation Plan. Website development for the Handbook is divided into six phases including (1) detailed product analysis and specification, (2) simulation and design, (3) implementation and testing, (4) product release, (5) project/product evaluation, and (6) product

  2. Analysing the Effects of Flood-Resilience Technologies in Urban Areas Using a Synthetic Model Approach

    Directory of Open Access Journals (Sweden)

    Reinhard Schinke

    2016-11-01

    Full Text Available Flood protection systems with their spatial effects play an important role in managing and reducing flood risks. The planning and decision process as well as the technical implementation are well organized and often exercised. However, building-related flood-resilience technologies (FReT) are often neglected due to the absence of suitable approaches to analyse and to integrate such measures in large-scale flood damage mitigation concepts. Against this backdrop, a synthetic model approach was extended by a few complementary methodological steps in order to calculate flood damage to buildings considering the effects of building-related FReT and to analyse the area-related reduction of flood risks with geo-information systems (GIS) at high spatial resolution. The approach includes a civil-engineering-based investigation of characteristic building properties and construction types, including a selection and combination of appropriate FReT, as a basis for deriving synthetic depth-damage functions. Depending on the actual exposure and the implementation level of FReT, the functions can be used and allocated in spatial damage and risk analyses. The application of the extended approach is shown in a case study in Valencia (Spain). In this way, the overall research findings improve the integration of FReT in flood risk management. They also provide useful information for advising individuals at risk, supporting the selection and implementation of FReT.
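
    The synthetic depth-damage functions described above map inundation depth to a damage ratio for a building with or without FReT. The sketch below illustrates that idea with two purely hypothetical curves evaluated by linear interpolation; the depths, damage ratios and building value are invented for illustration only.

        import numpy as np

        depths = np.array([0.0, 0.5, 1.0, 2.0, 3.0])             # water depth above floor [m]
        ratio_plain = np.array([0.00, 0.15, 0.30, 0.55, 0.70])   # hypothetical curve, no FReT
        ratio_fret = np.array([0.00, 0.05, 0.15, 0.40, 0.60])    # hypothetical curve, with FReT

        def damage(depth, building_value, curve):
            """Direct damage for one building via linear interpolation of a depth-damage curve."""
            return building_value * np.interp(depth, depths, curve)

        value = 250_000  # hypothetical replacement value in EUR
        for d in (0.4, 1.2, 2.5):
            print(f"depth {d:.1f} m: {damage(d, value, ratio_plain):9.0f} EUR without FReT, "
                  f"{damage(d, value, ratio_fret):9.0f} EUR with FReT")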

  3. Implementing Speed and Separation Monitoring in Collaborative Robot Workcells

    Science.gov (United States)

    Marvel, Jeremy A.; Norcross, Rick

    2016-01-01

    We provide an overview of and guidance for the Speed and Separation Monitoring methodology as presented in the International Organization for Standardization's technical specification 15066 on collaborative robot safety. Such functionality is provided by external, intelligent observer systems integrated into a robotic workcell. The SSM minimum protective distance function equation is discussed in detail, with consideration for the input values, implementation specifications, and performance expectations. We provide analytical analyses and test results of the current equation, discuss considerations for implementing SSM in human-occupied environments, and provide directions for technological advancements toward standardization. PMID:27885312
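
    The protective separation distance equation referred to above combines human and robot motion during the sensing and stopping intervals with position uncertainties. A simplified, constant-speed form of that relation (not the full integral formulation in the technical specification) is sketched below with hypothetical input values.

        def protective_separation(v_h, v_r, t_r, t_s, s_stop, c_intr, z_d, z_r):
            """
            Simplified constant-speed form of the separation distance:
              S_p = v_h*(T_r + T_s) + v_r*T_r + S_s + C + Z_d + Z_r
            i.e. human and robot travel during the reaction and stopping intervals,
            plus robot stopping distance, intrusion distance and position uncertainties.
            """
            return v_h * (t_r + t_s) + v_r * t_r + s_stop + c_intr + z_d + z_r

        # Hypothetical values: 1.6 m/s walking speed, 0.1 s system reaction time,
        # 0.3 s robot stopping time, 0.5 m/s robot speed, 0.15 m stopping distance.
        s_p = protective_separation(v_h=1.6, v_r=0.5, t_r=0.1, t_s=0.3,
                                    s_stop=0.15, c_intr=0.10, z_d=0.05, z_r=0.02)
        print(f"minimum protective distance: {s_p:.2f} m")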

  4. Implementing Parallel Google Map-Reduce in Eden

    DEFF Research Database (Denmark)

    Berthold, Jost; Dieterle, Mischa; Loogen, Rita

    2009-01-01

    Recent publications have emphasised map-reduce as a general programming model (labelled Google map-reduce), and described existing high-performance implementations for large data sets. We present two parallel implementations for this Google map-reduce skeleton, one following earlier work, and one...... of the Google map-reduce skeleton in usage and performance, and deliver runtime analyses for example applications. Although very flexible, the Google map-reduce skeleton is often too general, and typical examples reveal a better runtime behaviour using alternative skeletons....
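
    The Eden implementations themselves are written in parallel Haskell and are not reproduced in the abstract. The sketch below only illustrates the sequential semantics of the map-reduce skeleton (map, group by key, reduce) with a word-count toy example; the function names are ours, not those of the Eden library.

        from collections import defaultdict

        def map_reduce(records, map_fn, reduce_fn):
            """Sequential semantics of the map-reduce skeleton: map, group by key, reduce."""
            groups = defaultdict(list)
            for record in records:
                for key, value in map_fn(record):      # map phase emits (key, value) pairs
                    groups[key].append(value)
            return {key: reduce_fn(key, values) for key, values in groups.items()}  # reduce phase

        # Toy instance: word count
        lines = ["map reduce in eden", "google map reduce", "eden skeletons"]
        counts = map_reduce(
            lines,
            map_fn=lambda line: [(word, 1) for word in line.split()],
            reduce_fn=lambda word, ones: sum(ones),
        )
        print(counts)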

  5. On MPR-OSPF Specification and Implementation in Quagga/GTNetS

    OpenAIRE

    Cordero , Juan Antonio

    2008-01-01

    This document analyses the current MPR-OSPF specification and compares it with the version implemented for the Quagga/Zebra routing suite, adapted for the GTNetS network simulator. It presents the relationship between the Quagga/Zebra core and the GTNetS simulation framework, describes the inner architecture of the MPR-OSPF extension within the general OSPF implementation of Quagga, and identifies the main protocol elements in the implemented code.

  6. Marketing Mix Implementation in Small Medium Enterprises: a Study of Galeristorey Online Business

    OpenAIRE

    Sari, Rora Puspita

    2017-01-01

    The purpose of this paper is to evaluate the implementation of the marketing mix in an online business company, examining whether the online business adopts solely the traditional marketing mix model or also includes internet factors, since the business platform itself is on social media. Descriptive research and content analysis using interviews and observation were used to analyse the marketing mix implementation in the Galeristorey online business. Evidence suggested that Galeristorey implemented few elem...

  7. The Evaluation of Bivariate Mixed Models in Meta-analyses of Diagnostic Accuracy Studies with SAS, Stata and R.

    Science.gov (United States)

    Vogelgesang, Felicitas; Schlattmann, Peter; Dewey, Marc

    2018-05-01

    Meta-analyses require a thoroughly planned procedure to obtain unbiased overall estimates. From a statistical point of view, not only model selection but also model implementation in the software affects the results. The present simulation study investigates the accuracy of different implementations of general and generalized bivariate mixed models in SAS (using proc mixed, proc glimmix and proc nlmixed), Stata (using gllamm, xtmelogit and midas) and R (using reitsma from package mada and glmer from package lme4). Both models incorporate the relationship between sensitivity and specificity - the two outcomes of interest in meta-analyses of diagnostic accuracy studies - utilizing random effects. Model performance is compared in nine meta-analytic scenarios reflecting the combination of three sizes for meta-analyses (89, 30 and 10 studies) with three pairs of sensitivity/specificity values (97%/87%; 85%/75%; 90%/93%). The evaluation of accuracy in terms of bias, standard error and mean squared error reveals that all implementations of the generalized bivariate model calculate sensitivity and specificity estimates with deviations of less than two percentage points. proc mixed, which together with reitsma implements the general bivariate mixed model proposed by Reitsma, rather shows convergence problems. The random effect parameters are in general underestimated. This study shows that flexibility and simplicity of model specification, together with convergence robustness, should influence implementation recommendations, as the accuracy in terms of bias was acceptable in all implementations using the generalized approach. Schattauer GmbH.

  8. Implementing Safeguards-by-Design

    International Nuclear Information System (INIS)

    Bjornard, Trond; Bean, Robert; Durst, Phillip Casey; Hockert, John; Morgan, James

    2010-01-01

    Excerpt Safeguards-by-Design (SBD) is an approach to the design and construction of nuclear facilities whereby safeguards are designed-in from the very beginning. It is a systematic and structured approach for fully integrating international and national safeguards (MC and A), physical security, and other proliferation barriers into the design and construction process for nuclear facilities. SBD is primarily a project management or project coordination challenge, and this report focuses on that aspect of SBD. The present report continues the work begun in 2008 and focuses specifically on the design process, or project management and coordination - the planning, definition, organization, coordination, scheduling and interaction of activities of the safeguards experts and stakeholders as they participate in the design and construction of a nuclear facility. It delineates the steps in a nuclear facility design and construction project, in order to provide the project context within which the safeguards design activities take place, describes the involvement of safeguards experts in the design process, the nature of their analyses, interactions and decisions, as well as describing the documents created and how they are used. Designing and constructing a nuclear facility is an extremely complex undertaking. The stakeholders in an actual project are many - owner, operator, State regulators, nuclear facility primary contractor, subcontractors (e.g. instrument suppliers), architect engineers, project management team, safeguards, safety and security experts, in addition to the IAEA and its team. The purpose of the present report is to provide a common basis for discussions amongst stakeholders to collaboratively develop a SBD approach that will be both practically useful and mutually beneficial. The principal conclusions from the present study are: (1) In the short term, the successful implementation of SBD is principally a project management problem. (2) Life-cycle cost

  9. Vibro-spring particle size distribution analyser

    International Nuclear Information System (INIS)

    Patel, Ketan Shantilal

    2002-01-01

    This thesis describes the design and development of an automated pre-production particle size distribution analyser for particles in the 20 - 2000 μm size range. This work is a follow-up to the vibro-spring particle sizer reported by Shaeri. In its most basic form, the instrument comprises a horizontally held closed coil helical spring that is partly filled with the test powder and sinusoidally vibrated in the transverse direction. Particle size distribution data are obtained by stretching the spring to known lengths and measuring the mass of the powder discharged from the spring's coils. The size of the particles, on the other hand, is determined from the spring 'intercoil' distance. The instrument developed by Shaeri had limited use due to its inability to measure sample mass directly. For the device reported here, modifications are made to the original configuration to establish means of direct sample mass measurement. The feasibility of techniques for measuring the mass of powder retained within the spring is investigated in detail. Initially, the measurement of mass is executed in-situ from the vibration characteristics based on the spring's first harmonic resonant frequency. This method is often erratic and unreliable due to particle-particle and particle-spring-wall interactions and spring bending. A much more successful alternative is found in a more complicated arrangement in which the spring forms part of a stiff cantilever system pivoted along its main axis. Here, the sample mass is determined in the 'static mode' by monitoring the cantilever beam's deflection following the termination of vibration. The system performance has been optimised through variations of the mechanical design of the key components and the operating procedure, as well as by taking into account the effect of changes in the ambient temperature on the system's response. The thesis also describes the design and development of the ancillary mechanisms. These include the pneumatic

  10. Kuosheng Mark III containment analyses using GOTHIC

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Ansheng, E-mail: samuellin1999@iner.gov.tw; Chen, Yen-Shu; Yuann, Yng-Ruey

    2013-10-15

    Highlights: • The Kuosheng Mark III containment model is established using GOTHIC. • Containment pressure and temperature responses due to LOCA are presented. • The calculated results are all below the design values and compared with the FSAR results. • The calculated results can serve as an analysis reference for an SPU project in the future. -- Abstract: Kuosheng nuclear power plant in Taiwan is a twin-unit BWR/6 plant, and both units utilize the Mark III containment. Currently, the plant is performing a stretch power uprate (SPU) project to increase the core thermal power to 103.7% OLTP (original licensed thermal power). However, the containment response in the Kuosheng Final Safety Analysis Report (FSAR) was completed more than twenty-five years ago. The purpose of this study is to establish a Kuosheng Mark III containment model using the containment program GOTHIC. The containment pressure and temperature responses under the design-basis accidents, which are the main steam line break (MSLB) and the recirculation line break (RCLB) accidents, are investigated. Short-term and long-term analyses are presented in this study. The short-term analysis is to calculate the drywell peak pressure and temperature, which occur in the early stage of the LOCAs. The long-term analysis is to calculate the peak pressure and temperature of the reactor building space. In the short-term analysis, the calculated peak drywell-to-wetwell differential pressure is 140.6 kPa for the MSLB, which is below the design value of 189.6 kPa. The calculated peak drywell temperature is 158 °C, which is still below the design value of 165.6 °C. In addition, in the long-term analysis, the calculated peak containment pressure is 47 kPa G, which is below the design value of 103.4 kPa G. The calculated peak containment temperature is 74.7 °C, which is lower than the design value of 93.3 °C. Therefore, the Kuosheng Mark III containment can maintain its integrity after

  11. YALINA Booster subcritical assembly modeling and analyses

    International Nuclear Information System (INIS)

    Talamo, A.; Gohar, Y.; Aliberti, G.; Cao, Y.; Zhong, Z.; Kiyavitskaya, H.; Bournos, V.; Fokov, Y.; Routkovskaya, C.; Sadovich, S.

    2010-01-01

    Full text: Accurate simulation models of the YALINA Booster assembly of the Joint Institute for Power and Nuclear Research (JIPNR)-Sosny, Belarus have been developed by Argonne National Laboratory (ANL) of the USA. YALINA-Booster has coupled zones operating with fast and thermal neutron spectra, which requires special attention in the modelling process. Three different uranium enrichments of 90%, 36% or 21% were used in the fast zone and 10% uranium enrichment was used in the thermal zone. Two of the most advanced Monte Carlo computer programs have been utilized for the ANL analyses: MCNP of the Los Alamos National Laboratory and MONK of British Nuclear Fuels Limited and SERCO Assurance. The developed geometrical models for both computer programs represent all the details of the YALINA Booster facility as described in the technical specifications defined in the International Atomic Energy Agency (IAEA) report, without any geometrical approximation or material homogenization. Material impurities and the measured material densities have been used in the models. The obtained results for the neutron multiplication factors calculated in criticality mode (keff) and in source mode (ksrc) with an external neutron source from the two Monte Carlo programs are very similar. Different external neutron sources have been investigated, including californium, deuterium-deuterium (D-D), and deuterium-tritium (D-T) neutron sources. The spatial neutron flux profiles and the neutron spectra in the experimental channels were calculated. In addition, the kinetic parameters were defined, including the effective delayed neutron fraction, the prompt neutron lifetime, and the neutron generation time. A new calculation methodology has been developed at ANL to simulate the pulsed neutron source experiments. In this methodology, the MCNP code is used to simulate the detector response from a single pulse of the external neutron source and a C code is used to superimpose the pulse until the
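
    The pulse-superposition methodology is only outlined above; as a hedged illustration of the idea (not the ANL C code itself), the following sketch shifts a single-pulse detector response by the pulse repetition period and sums the contributions to approximate the response to a periodic pulsed source.

```python
import numpy as np

def superimpose_pulses(single_pulse_response, period_bins, n_pulses):
    """Build the detector response to a periodic pulsed source by shifting the
    single-pulse response by the pulse period and summing the contributions.

    single_pulse_response -- 1-D array, detector counts vs time for ONE pulse
    period_bins           -- pulse repetition period expressed in time bins
    n_pulses              -- number of pulses to superimpose
    """
    length = len(single_pulse_response) + period_bins * (n_pulses - 1)
    total = np.zeros(length)
    for k in range(n_pulses):
        start = k * period_bins
        total[start:start + len(single_pulse_response)] += single_pulse_response
    return total

# Illustrative single-pulse response: a decaying tail following the prompt spike
t = np.arange(200)
pulse = np.exp(-t / 30.0)
response = superimpose_pulses(pulse, period_bins=50, n_pulses=20)
```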

  12. PERFORMANCE ANALYSIS OF PROJECTS IMPLEMENTATION IN HEIS

    Directory of Open Access Journals (Sweden)

    Sergiu-Vlad PETCU

    2015-04-01

    Full Text Available According to the European Commission, Romania is still facing a significant mismatch between the skills of graduates of tertiary education and the market needs. This paper is highly relevant for the future implementation of HEIs, since they have a strong economic role and they can significantly influence long-term national social and economic development. Romania did not manage to absorb the community funds allocated for the 2007-2013 programming period to their full potential. Accordingly, the proposed thesis intends to analyse how Romanian HEIs were able to manage community resources attracted by grants and how the implemented projects achieved their set objectives. According to the data collected so far, the results will show that the performance of the projects is directly dependent on the proportion of fully dedicated staff in the organizational design of the project. The originality of the undertaken study is that it starts from a realistic approach, according to which the rules of the game should be adapted depending on the players (in our case, HEIs.

  13. Implementing The Safeguards-By-Design Process

    International Nuclear Information System (INIS)

    Whitaker, J. Michael; McGinnis, Brent; Laughter, Mark D.; Morgan, Jim; Bjornard, Trond; Bean, Robert; Durst, Phillip; Hockert, John; DeMuth, Scott; Lockwood, Dunbar

    2010-01-01

    The Safeguards-by-Design (SBD) approach incorporates safeguards into the design and construction of nuclear facilities at the very beginning of the design process. It is a systematic and structured approach for fully integrating international and national safeguards for material control and accountability (MC and A), physical protection, and other proliferation barriers into the design and construction process for nuclear facilities. Implementing SBD is primarily a project management or project coordination challenge. This paper focuses specifically on the design process: the planning, definition, organization, coordination, scheduling and interaction of the safeguards experts and stakeholders as they participate in the design and construction of a nuclear facility. It delineates the steps in a nuclear facility design and construction project in order to provide the project context within which the safeguards design activities take place, describes the involvement of the safeguards experts in the design process and the nature of their analyses, interactions and decisions, and describes the documents created and how they are used. This report highlights the project context of safeguards activities and identifies what the safeguards community (nuclear facility operator, designer/builder, state regulator, SSAC and IAEA) must accomplish in order to implement SBD within the project.

  14. First Super-Earth Atmosphere Analysed

    Science.gov (United States)

    2010-12-01

    The atmosphere around a super-Earth exoplanet has been analysed for the first time by an international team of astronomers using ESO's Very Large Telescope. The planet, which is known as GJ 1214b, was studied as it passed in front of its parent star and some of the starlight passed through the planet's atmosphere. We now know that the atmosphere is either mostly water in the form of steam or is dominated by thick clouds or hazes. The results will appear in the 2 December 2010 issue of the journal Nature. The planet GJ 1214b was confirmed in 2009 using the HARPS instrument on ESO's 3.6-metre telescope in Chile (eso0950) [1]. Initial findings suggested that this planet had an atmosphere, which has now been confirmed and studied in detail by an international team of astronomers, led by Jacob Bean (Harvard-Smithsonian Center for Astrophysics), using the FORS instrument on ESO's Very Large Telescope. "This is the first super-Earth to have its atmosphere analysed. We've reached a real milestone on the road toward characterising these worlds," said Bean. GJ 1214b has a radius of about 2.6 times that of the Earth and is about 6.5 times as massive, putting it squarely into the class of exoplanets known as super-Earths. Its host star lies about 40 light-years from Earth in the constellation of Ophiuchus (the Serpent Bearer). It is a faint star [2], but it is also small, which means that the size of the planet is large compared to the stellar disc, making it relatively easy to study [3]. The planet travels across the disc of its parent star once every 38 hours as it orbits at a distance of only two million kilometres: about seventy times closer than the Earth orbits the Sun. To study the atmosphere, the team observed the light coming from the star as the planet passed in front of it [4]. During these transits, some of the starlight passes through the planet's atmosphere and, depending on the chemical composition and weather on the planet, specific wavelengths of light are

  15. Implementation and rejection of industrial steam system energy efficiency measures

    International Nuclear Information System (INIS)

    Therkelsen, Peter; McKane, Aimee

    2013-01-01

    Steam systems consume approximately one third of the energy applied at US industrial facilities. To reduce energy consumption, steam system energy assessments have been conducted on a wide range of industry types over the course of 5 years through the Energy Savings Assessment (ESA) program administered by the US Department of Energy (US DOE). ESA energy assessments result in energy efficiency measure recommendations, each assigned potential energy savings, potential energy cost savings, and potential implementation cost values. Savings and cost metrics that measure the impact recommended measures would have at facilities, expressed as percentages of facility baseline energy use and energy cost, are developed from ESA data and used in the analyses. The developed savings and cost metrics are examined along with implementation and rejection rates of recommended steam system energy efficiency measures. Based on the analyses, implementation of steam system energy efficiency measures is driven primarily by cost metrics: payback period and measure implementation cost as a percentage of facility baseline energy cost (implementation cost percentage). Stated reasons for rejecting recommended measures are primarily based upon economic concerns. Additionally, implementation rates of measures are functions not only of savings and cost metrics, but of time as well. - Highlights: ► We examine uptake/rejection of industrial steam system energy efficiency measures. ► We examine metrics that correspond to uptake/rejection of recommended measures. ► We examine barriers hindering steam system energy efficiency measure implementation. ► Uptake/rejection of steam measures is linked to potential cost metrics. ► Uptake of measures, and uptake of more costly measures, increases with time
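
    The two cost metrics highlighted above (simple payback period, and implementation cost as a percentage of the facility's baseline energy cost) can be illustrated with a minimal sketch; the dollar figures in the example are invented, not taken from the ESA data.

```python
def payback_period_years(implementation_cost, annual_energy_cost_savings):
    """Simple payback period: cost of the measure divided by yearly savings."""
    return implementation_cost / annual_energy_cost_savings

def implementation_cost_percentage(implementation_cost, facility_baseline_energy_cost):
    """Measure implementation cost expressed as a percentage of the facility's
    baseline annual energy cost."""
    return 100.0 * implementation_cost / facility_baseline_energy_cost

# Example: a $50,000 measure saving $20,000/yr at a plant spending $2M/yr on energy
print(payback_period_years(50_000, 20_000))               # 2.5 years
print(implementation_cost_percentage(50_000, 2_000_000))  # 2.5 %
```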

  16. Measuring Collaboration and Communication to Increase Implementation of Evidence-Based Practices: The Cultural Exchange Inventory

    Science.gov (United States)

    Palinkas, Lawrence A.; Garcia, Antonio; Aarons, Gregory; Finno-Velasquez, Megan; Fuentes, Dahlia; Holloway, Ian; Chamberlain, Patricia

    2018-01-01

    The Cultural Exchange Inventory (CEI) is a 15-item instrument designed to measure the process (7 items) and outcomes (8 items) of exchanges of knowledge, attitudes and practices between members of different organisations collaborating in implementing evidence-based practice. We conducted principal axis factor analyses and parallel analyses of data…

  17. National Inspection Program of Conventional Industries: implement, results and evaluation- 1981 to 1984

    International Nuclear Information System (INIS)

    Gloria, M.B.; Silva, F.C.A. da; Leocadio, J.C.; Valenca, J.R.M.; Farias, C.

    1986-01-01

    The methodology adopted by the Institute of Radiation Protection and Dosimetry to implement the National Inspection Program of Conventional Industries is presented. This methodology has proven efficient because many technical and administrative problems concerning radiation protection could be identified, analysed and solved gradually. Many gammagraphy workplaces are analysed with respect to radiation safety, geographic location and socio-economic aspects. (Author) [pt

  18. Studies and analyses of the management of scientific research and development, including implementation and application at NASA centers

    Science.gov (United States)

    Rubenstein, A. H.

    1975-01-01

    Summary results obtained through the Program of Research on the Management of Research and Development (POMRAD) are presented. The nature of the overall program and the specific projects undertaken are described. Statistical data are also given concerning the papers, publications, people, and major program areas associated with the program. The actual list of papers, names of doctoral and master's theses, and other details of the program are included as appendices.

  19. Combining Geoelectrical Measurements and CO 2 Analyses to Monitor the Enhanced Bioremediation of Hydrocarbon-Contaminated Soils: A Field Implementation

    OpenAIRE

    Noel , Cécile; Gourry , Jean-Christophe; Deparis , Jacques; Blessing , Michaela; Ignatiadis , Ioannis; Guimbaud , Christophe

    2016-01-01

    International audience; Hydrocarbon-contaminated aquifers can be successfully remediated through enhanced biodegradation. However, in situ monitoring of the treatment by piezometers is expensive and invasive and might be insufficient as the information provided is restricted to vertical profiles at discrete locations. An alternative method was tested in order to improve the robustness of the monitoring. Geophysical methods, electrical resistivity (ER) and induced polarization (IP), were combi...

  20. Systems reliability analyses and risk analyses for the licencing procedure under atomic law

    International Nuclear Information System (INIS)

    Berning, A.; Spindler, H.

    1983-01-01

    For the licencing procedure under atomic law in accordance with Article 7 AtG, the nuclear power plant as a whole needs to be assessed, and the reliability of systems and plant components that are essential to safety is to be determined with probabilistic methods. This requirement follows from the safety criteria for nuclear power plants issued by the Federal Ministry of the Interior (BMI). Systems reliability studies and risk analyses used in licencing procedures under atomic law are identified. The emphasis is on licencing decisions, mainly for PWR-type reactors. Reactor Safety Commission (RSK) guidelines, examples of reasoning in legal proceedings and arguments put forth by objectors are also dealt with. Correlations between reliability analyses made by experts and licencing decisions are shown by means of examples. (orig./HP) [de

  1. Implementation of integrated management system

    International Nuclear Information System (INIS)

    Gaspar Junior, Joao Carlos A.; Fonseca, Victor Zidan da

    2007-01-01

    Today there exist quality, environmental, and occupational health and safety management system standards such as ISO 9001, ISO 14001 and OHSAS 18001, and further standards may yet be created. These standards can be implemented and certified; they provide a record system, quality assurance, document control, operational control, definition of responsibilities, training, emergency preparedness and response, monitoring, internal audits, corrective action, continual improvement, prevention of pollution, written procedures, cost reduction, impact assessment, risk assessment, and compliance with standards, decrees and legal requirements at the municipal, state, federal and local levels. When these procedures and systems are applied in isolation, they produce multiple parallel management systems and bureaucracy. An Integrated Management System reduces bureaucracy, the excess and duplication of documents, document storage and conflicting documents, and makes the implementation of other standards easier in the future. The Integrated Management System (IMS) will be implemented in 2007. INB created a management group for the implementation; this group decides on planning, work, policy and communication. Legal requirements were surveyed, and internal audits, pre-audits and audits were carried out. INB is partially in accordance with the ISO 14001 and OHSAS 18001 standards, but very soon it will be totally in accordance with these norms. Many studies and works were contracted to deal with legal requirements. This work intends to show the implementation process of ISO 14001, OHSAS 18001 and the Integrated Management System at INB. (author)

  2. Pollution prevention program implementation plan

    International Nuclear Information System (INIS)

    Engel, J.A.

    1996-09-01

    The Pollution Prevention Program Implementation Plan (the Plan) describes the Pacific Northwest National Laboratory's (PNNL) Pollution Prevention (P2) Program. The Plan also shows how the P2 Program at PNNL will be in support of and in compliance with the Hanford Site Waste Minimization and Pollution Prevention (WMin/P2) Awareness Program Plan and the Hanford Site Guide for Preparing and Maintaining Generator Group Pollution Prevention Program Documentation. In addition, this plan describes how PNNL will demonstrate compliance with various legal and policy requirements for P2. This plan documents the strategy for implementing the PNNL P2 Program. The scope of the P2 Program includes implementing and helping to implement P2 activities at PNNL. These activities will be implemented according to the Environmental Protection Agency's (EPA) hierarchy of source reduction, recycling, treatment, and disposal. The PNNL P2 Program covers all wastes generated at the Laboratory. These include hazardous waste, low-level radioactive waste, radioactive mixed waste, radioactive liquid waste system waste, polychlorinated biphenyl waste, transuranic waste, and sanitary waste generated by activities at PNNL. Materials, resource, and energy conservation are also within the scope of the PNNL P2 Program

  3. Advanced compiler design and implementation

    CERN Document Server

    Muchnick, Steven S

    1997-01-01

    From the Foreword by Susan L. Graham: This book takes on the challenges of contemporary languages and architectures, and prepares the reader for the new compiling problems that will inevitably arise in the future. The definitive book on advanced compiler design This comprehensive, up-to-date work examines advanced issues in the design and implementation of compilers for modern processors. Written for professionals and graduate students, the book guides readers in designing and implementing efficient structures for highly optimizing compilers for real-world languages. Covering advanced issues in fundamental areas of compiler design, this book discusses a wide array of possible code optimizations, determining the relative importance of optimizations, and selecting the most effective methods of implementation. * Lays the foundation for understanding the major issues of advanced compiler design * Treats optimization in-depth * Uses four case studies of commercial compiling suites to illustrate different approache...

  4. Roadmap for Peridynamic Software Implementation

    Energy Technology Data Exchange (ETDEWEB)

    Littlewood, David John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    The application of peridynamics for engineering analysis requires an efficient and robust software implementation. Key elements include processing of the discretization, the proximity search for identification of pairwise interactions, evaluation of the constitutive model, application of a bond-damage law, and contact modeling. Additional requirements may arise from the choice of time integration scheme, for example estimation of the maximum stable time step for explicit schemes, and construction of the tangent stiffness matrix for many implicit approaches. This report summarizes progress to date on the software implementation of the peridynamic theory of solid mechanics. Discussion is focused on parallel implementation of the meshfree discretization scheme of Silling and Askari [33] in three dimensions, although much of the discussion applies to computational peridynamics in general.
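
    As a hedged sketch of the proximity-search element mentioned above (not the Sandia implementation itself), the following snippet identifies pairwise bonds as all pairs of material points that lie within a horizon radius, using a k-d tree; the lattice, spacing and horizon value are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

def find_bonds(points, horizon):
    """Proximity search for peridynamic pairwise interactions: every pair of
    material points closer than the horizon radius defines a bond.

    points  -- (N, 3) array of material point coordinates
    horizon -- interaction radius (delta)
    Returns a sorted list of (i, j) index pairs with i < j.
    """
    tree = cKDTree(points)
    return sorted(tree.query_pairs(r=horizon))

# Illustrative 3x3x3 lattice of points with unit spacing and a horizon of 1.5
grid = np.array([[x, y, z] for x in range(3) for y in range(3) for z in range(3)],
                dtype=float)
bonds = find_bonds(grid, horizon=1.5)
print(len(bonds), "bonds identified")
```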

  5. Practical aspects of joint implementation

    International Nuclear Information System (INIS)

    Michaelowa, A.

    1995-01-01

    Article 4, 2a of the UN Framework Convention on Climate Change states the possibility of joint policies of different countries to achieve national greenhouse gas reduction commitments (Joint Implementation). The cost of reducing greenhouse gas emissions can be reduced drastically if industrialized countries shift abatement activities to developing countries as marginal cost of reduction is much higher in the former countries. In this way economic efficiency of abatement measures can be raised to the point where marginal cost is equal all over the world. At the Conference of the Parties in Berlin in March 1995, criteria for Joint Implementation are to be established. The paper discusses possible forms of Joint Implementation and develops criteria

  6. MDSplus objects-Python implementation

    Energy Technology Data Exchange (ETDEWEB)

    Fredian, T., E-mail: twf@psfc.mit.ed [Massachusetts Institute of Technology, Plasma Science and Fusion Center, NW17-268, 175 Albany Street, Cambridge, MA 02139 (United States); Stillerman, J. [Massachusetts Institute of Technology, Plasma Science and Fusion Center, NW17-268, 175 Albany Street, Cambridge, MA 02139 (United States); Manduchi, G. [Consorzio RFX, Euratom-ENEA Association, Corso Stati Uniti 4, Padova 35127 (Italy)

    2010-07-15

    MDSplus is a data acquisition and analysis software package used widely throughout the international fusion research community. During the past year, an important set of enhancements were designed under the project name of 'MDSobjects' which would provide a common, powerful application programming interface (API) to MDSplus in programming languages with object-oriented capabilities. This paper will discuss the Python language implementation of this API and some of the capabilities that this implementation provides for data storage and retrieval using the MDSplus system. We have implemented a new MDSplus Python module which exposes the MDSplus objects features to the language. The internal MDSplus programming language, TDI, has also been enhanced to be able to invoke Python commands from the TDI language. Now that Python is aware of the complex data structures in MDSplus such as Signals, the language becomes a very good candidate for applications ranging from data acquisition device support to analysis and visualization.
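
    A minimal usage sketch of the kind of access the MDSobjects Python API enables is given below; the tree name, shot number and node path are placeholders, and the exact method names should be verified against the MDSplus documentation.

```python
# Illustrative use of the MDSplus Python module described above.
# Tree name, shot number and node path are hypothetical placeholders.
from MDSplus import Tree

tree = Tree('my_experiment', 12345)           # open an existing tree/shot
node = tree.getNode('\\TOP.DIAGNOSTICS:TE')   # navigate to a signal node
signal = node.getData()                       # MDSplus data object (e.g. a Signal)
values = signal.data()                        # numeric array of signal samples
times = signal.dim_of().data()                # associated time base
```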

  7. MDSplus objects-Python implementation

    International Nuclear Information System (INIS)

    Fredian, T.; Stillerman, J.; Manduchi, G.

    2010-01-01

    MDSplus is a data acquisition and analysis software package used widely throughout the international fusion research community. During the past year, an important set of enhancements were designed under the project name of 'MDSobjects' which would provide a common, powerful application programming interface (API) to MDSplus in programming languages with object-oriented capabilities. This paper will discuss the Python language implementation of this API and some of the capabilities that this implementation provides for data storage and retrieval using the MDSplus system. We have implemented a new MDSplus Python module which exposes the MDSplus objects features to the language. The internal MDSplus programming language, TDI, has also been enhanced to be able to invoke Python commands from the TDI language. Now that Python is aware of the complex data structures in MDSplus such as Signals, the language becomes a very good candidate for applications ranging from data acquisition device support to analysis and visualization.

  8. Implementing interorganizational cooperation in labour market reintegration: a case study.

    Science.gov (United States)

    Ståhl, Christian

    2012-06-01

    To bring people with complex medical, social and vocational needs back to the labour market, interorganizational cooperation is often needed. Yet, studies of processes and strategies for achieving sustainable interorganizational cooperation are sparse. The aim of this study was to analyse the implementation processes of Swedish legislation on financial coordination, with specific focus on different strategies for and perspectives on implementing interorganizational cooperation. A multiple-case study was used, where two local associations for financial coordination were studied in order to elucidate and compare the development of cooperative work in two settings. The material, collected during a 3-year period, consisted of documents, individual interviews with managers, and focus groups with officials. Two different implementation strategies were identified. In case 1, a linear strategy was used to implement cooperative projects, which led to difficulties in maintaining cooperative work forms due to a fragmented and time-limited implementation process. In case 2, an interactive strategy was used, where managers and politicians were continuously involved in developing a central cooperation team that became a central part of a developing structure for interorganizational cooperation. An interactive cooperation strategy with long-term joint financing was here shown to be successful in overcoming organizational barriers to cooperation. It is suggested that a strategy based on adaptation to local conditions, flexibility and constant evaluation is preferred for developing sustainable interorganizational cooperation when implementing policies or legislation affecting interorganizational relationships.

  9. The layered learning practice model: Lessons learned from implementation.

    Science.gov (United States)

    Pinelli, Nicole R; Eckel, Stephen F; Vu, Maihan B; Weinberger, Morris; Roth, Mary T

    2016-12-15

    Pharmacists' views about the implementation, benefits, and attributes of a layered learning practice model (LLPM) were examined. Eligible and willing attending pharmacists at the same institution that had implemented an LLPM completed an individual, 90-minute, face-to-face interview using a structured interview guide developed by the interdisciplinary study team. Interviews were digitally recorded and transcribed verbatim without personal identifiers. Three researchers independently reviewed preliminary findings to reach consensus on emerging themes. In cases where thematic coding diverged, the researchers discussed their analyses until consensus was reached. Of 25 eligible attending pharmacists, 24 (96%) agreed to participate. The sample was drawn from both acute and ambulatory care practice settings and all clinical specialty areas. Attending pharmacists described several experiences implementing the LLPM and perceived benefits of the model. Attending pharmacists identified seven key attributes for hospital and health-system pharmacy departments that are needed to design and implement effective LLPMs: shared leadership, a systematic approach, good communication, flexibility for attending pharmacists, adequate resources, commitment, and evaluation. Participants also highlighted several potential challenges and obstacles for organizations to consider before implementing an LLPM. According to attending pharmacists involved in an LLPM, successful implementation of an LLPM required shared leadership, a systematic approach, communication, flexibility, resources, commitment, and a process for evaluation. Copyright © 2016 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  10. What prevents organisations from implementing energy saving measures? Case studies of Norwegian public and commercial companies

    Energy Technology Data Exchange (ETDEWEB)

    Saele, Hanne; Naesje, Paal; Hagen, Oeivind [SINTEF Energy Research, Trondheim (Norway); Nordvik, Haavard [E-CO Tech, Oslo (Norway)

    2005-07-01

    The background for this project is a set of analyses conducted in an industrial area with a capacity problem concerning electricity supply. To cope with the problem the network operator, in cooperation with the Norwegian Research Council, executed a project focusing on how to reduce peak loads and energy consumption. Technical and economic analyses of energy efficiency actions were offered to 40 companies, and 20 of these decided to implement the proposed actions. Two years later, 7 of these 20 companies had not implemented the suggested actions or had delayed starting them. These cases were analysed based on personal interviews. The goal was to study the reasons for not implementing the actions or for the delay. Most analyses of this kind examine successful implementations. Here, however, the research issue is why organisations choose not to implement solutions that make sense both economically and technically. Results suggest that information overload, bad timing, and the lack of a personal addressee and of formal responsibility for the report hindered companies from using the report as a basis for decision-making. Different aspects of financial management systems, such as rigid routines that do not provide means for investments and an aversion to less predictable costs, also hindered implementation. Despite these findings, several organisations do have an interest in energy saving and consumption, personnel who take responsibility, and financial incentives for reducing energy costs. Although the study is based on too few cases to draw firm conclusions, there are indications that, by targeting the right organisations, energy efficiency is an interesting alternative to increasing power capacity.

  11. VLSI implementations for image communications

    CERN Document Server

    Pirsch, P

    1993-01-01

    The past few years have seen a rapid growth in image processing and image communication technologies. New video services and multimedia applications are continuously being designed. Essential for all these applications are image and video compression techniques. The purpose of this book is to report on recent advances in VLSI architectures and their implementation for video signal processing applications with emphasis on video coding for bit rate reduction. Efficient VLSI implementation for video signal processing spans a broad range of disciplines involving algorithms, architectures, circuits

  12. Implementing NetScaler VPX

    CERN Document Server

    Sandbu, Marius

    2014-01-01

    An easy-to-follow guide with detailed step-by-step instructions on how to implement the different key components in NetScaler, with real-world examples and sample scenarios. If you are a Citrix or network administrator who needs to implement NetScaler in your virtual environment to gain insight into its functionality, this book is ideal for you. A basic understanding of networking and familiarity with some of the different Citrix products such as XenApp or XenDesktop is a prerequisite.

  13. Implementation of advanced inbound models

    OpenAIRE

    Koskinen, Juha

    2016-01-01

    The present Master’s Thesis was assigned by a company operating in the telecommunications industry. The target of the Master’s Thesis was to understand what the biggest benefits are in implementing advanced inbound models and why it sometimes takes a longer time to finalize the implementation than planned. In addition the thesis aimed at clarifying how the usage of advanced inbound models should be measured and what the key performance indicators are that can verify the information. The g...

  14. Evaluating safety management system implementation

    International Nuclear Information System (INIS)

    Preuss, M.

    2009-01-01

    Canada is committed to not only maintaining, but also improving upon our record of having one of the safest aviation systems in the world. The development, implementation and maintenance of safety management systems is a significant step towards improving safety performance. Canada is considered a world leader in this area and we are fully engaged in implementation. By integrating risk management systems and business practices, the aviation industry stands to gain better safety performance with less regulatory intervention. These are important steps towards improving safety and enhancing the public's confidence in the safety of Canada's aviation system. (author)

  15. Implementation of Summon at KAUST

    KAUST Repository

    Buck, Stephen

    2017-03-09

    Web discovery services have evolved tremendously from the time of their introduction to the present day. Their technology has enabled library users to obtain a variety of materials in different formats and types faster and farther. Coupled with a myriad of features, users are ‘pampered’ with the options to email and download citations, full text articles and book chapters on different devices such as laptops, desktops and mobile devices (among others). In this session, the speaker will highlight KAUST library’s new discovery service (KORAL powered by Summon) launched in June 2016. Topics include the implementation project, its challenges and lessons learnt, and the after-implementation initiatives.

  16. Land Use Control Implementation Plan

    Science.gov (United States)

    Starr, Andrew Scott

    2015-01-01

    This Land Use Control Implementation Plan (LUCIP) has been prepared to inform current and potential future users of Building M7-505 of institutional controls that have been implemented at the site. Although there are no current unacceptable risks to human health or the environment associated with Building M7-505, institutional land use controls (LUCs) are necessary to prohibit the use of groundwater from the site. LUCs are also necessary to prevent access to soil under electrical equipment in the northwest portion of the site. Controls necessary to prevent human exposure will include periodic inspection, condition certification, and agency notification.

  17. Enhancing implementation security of QKD

    Science.gov (United States)

    Tamaki, Kiyoshi

    2017-10-01

    Quantum key distribution (QKD) can achieve information-theoretic security, which is a provable security against any eavesdropping, given that all the devices the sender and the receiver employ operate exactly as the theory of security requires. Unfortunately, however, it is difficult for practical devices to meet all such requirements, and therefore more work has to be done toward guaranteeing information-theoretic security in practice, i.e., implementation security. In this paper, we review our recent efforts to enhance implementation security. We also have a brief look at a flaw in security proofs and present how to fix it.

  18. Implementing ‘Site BIM’

    DEFF Research Database (Denmark)

    Davies, Richard; Harty, Chris

    2013-01-01

    Numerous Building Information Modelling (BIM) tools are well established and potentially beneficial in certain uses. However, issues of adoption and implementation persist, particularly for on-site use of BIM tools in the construction phase. We describe an empirical case-study of the implementation...... of an innovative ‘Site BIM’ system on a major hospital construction project. The main contractor on the project developed BIM-enabled tools to allow site workers using mobile tablet personal computers to access design information and to capture work quality and progress data on-site. Accounts show that ‘Site BIM...

  19. System for implement draft reduction

    DEFF Research Database (Denmark)

    2008-01-01

    Abstract of WO 2008095503 (A1) There is disclosed a system and method of reducing draft forces when working soil with agricultural soil working implements (206, 211) creating draft forces, the soil working implements (208, 206, 211) being operably connectable to a frame (204, 304, 404, 504, 604......, the second part of the width comprising another part of said width (212) than the first part (222), so as e.g. to reduce draft forces compared to working both first and second parts at the same time....

  20. Implementation of Summon at KAUST

    KAUST Repository

    Buck, Stephen; Ramli, Rindra M.

    2017-01-01

    Web discovery services have evolved tremendously from the time of their introduction to the present day. Their technology has enabled library users to obtain a variety of materials in different formats and types faster and farther. Coupled with a myriad of features, users are ‘pampered’ with the options to email and download citations, full text articles and book chapters on different devices such as laptops, desktops and mobile devices (among others). In this session, the speaker will highlight KAUST library’s new discovery service (KORAL powered by Summon) launched in June 2016. Topics include the implementation project, its challenges and lessons learnt, and the after-implementation initiatives.

  1. ISMS Implementation in Nuclear Malaysia

    International Nuclear Information System (INIS)

    Radhiah Jamalludin; Siti Nurbahyah Hamdan; Mohd Dzul Aiman Aslan

    2015-01-01

    Nuclear Malaysia provides important services and functions that depend on its resources, including information. Use of the information assets must be consistent with good professional practices and procedures, legal requirements, regulations and contracts, and with the need to ensure the confidentiality, integrity and availability of all information assets of the Agency. ISO/IEC 27001, the international standard for information security management systems, provides the mandatory requirements to implement, review and continuously improve an Information Security Management System (ISMS). Information security policies and the implementation of an ISMS are important to protect information assets from all threats, internal or external, intentional or unintentional. (author)

  2. CONDITIONS FOR IMPLEMENTING ORGANIZATIONAL CHANGES

    Directory of Open Access Journals (Sweden)

    Renata Winkler

    2015-12-01

    Full Text Available Changes are one of the most typical phenomena experienced by contemporary organizations and are an inherent element of their functioning. The change introduction process is complex and it is often accompanied by a phenomenon of resistance to change on the part of the employees in an organization, which is considered as the main cause of failure in the change implementation process. The purpose of the article is to discuss the basic conditions for implementing changes related both to their adequate defining and overcoming resistance to change.

  3. The joint implementation mechanisms (MOC)

    International Nuclear Information System (INIS)

    2003-01-01

    The aim of the joint implementation mechanisms (MOC) is to support the fight against climate change through the implementation of appropriate activities, technologies and techniques that emit fewer greenhouse gases in southern countries, and through the possibility of reducing greenhouse gas emissions at a lower cost. This guide provides practical assistance for setting up projects: the types of projects concerned, the formalization of the project, the methodology, and the role of carbon credits in project financing. (A.L.B.)

  4. Implementation of a cluster Beowulf

    International Nuclear Information System (INIS)

    Victorino Guzman, Jorge Enrique

    2001-01-01

    Climate models are among the simulation systems that put the greatest stress on computational resources and performance; their high implementation cost makes them difficult to acquire. An alternative that offers good performance at a reasonable cost is the construction of a Beowulf cluster, which emulates the behaviour of a computer with several processors. In the present article we discuss the hardware requirements for the construction of the Beowulf cluster, the software resources for the implementation of the CCM3.6 model, and the performance of the Beowulf cluster of the Meteorology Research Group at the National University of Colombia with different numbers of processors

  5. Microsoft Dynamics GP 2013 implementation

    CERN Document Server

    Yudin, Victoria

    2013-01-01

    A step-by-step guide for planning and carrying out your Microsoft Dynamics GP 2013 implementation. Detailed descriptions and illustrations of setup screens and practical examples and advice are included for the Dynamics GP system and core modules.If you are a new or existing Microsoft Dynamics GP consultant or an end user who wants to implement, install, and set up core modules of Dynamics GP 2013, then this book is for you. A basic understanding of business management systems and either Dynamics GP or a similar application is recommended.

  6. Secondary Data Analyses of Subjective Outcome Evaluation Data Based on Nine Databases

    Directory of Open Access Journals (Sweden)

    Daniel T. L. Shek

    2012-01-01

    Full Text Available The purpose of this study was to evaluate the effectiveness of the Tier 1 Program of the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong by analyzing 1,327 school-based program reports submitted by program implementers. In each report, program implementers were invited to write down five conclusions based on an integration of the subjective outcome evaluation data collected from the program participants and program implementers. Secondary data analyses were carried out by aggregating nine databases, with 14,390 meaningful units extracted from 6,618 conclusions. Results showed that most of the conclusions were positive in nature. The findings generally showed that the workers perceived the program and program implementers to be positive, and they also pointed out that the program could promote the holistic development of the program participants in societal, familial, interpersonal, and personal aspects. However, difficulties encountered during program implementation (2.15%) and recommendations for improvement (16.26%) were also reported. In conjunction with the evaluation findings based on other strategies, the present study suggests that the Tier 1 Program of the Project P.A.T.H.S. is beneficial to the holistic development of the program participants.

  7. Numerical FEM Analyses of primary coolant system at NPP Temelin

    International Nuclear Information System (INIS)

    Junek, L.; Slovacek, M.; Ruzek, L.; Moulis, P.

    2003-01-01

    The main goal of this paper is to report on the beginning and first steps of the implementation of an aging management system at the Temelin NPP. The aging management system is important not only for maintaining the current safety level but also for achieving operational reliability of production unit equipment beyond the lifetime assumed by the original design, typically over 40 years. A method to locate the most prominent degradation regions is described. A global shell model of the primary coolant system including all loops and their components - reactor pressure vessel (RPV), steam generator (SG), main coolant pump (MCP), pressurizer, and the feed water and steam pipeline systems - is presented. The results of stress-strain analysis based on measured service parameters are given. Validation of the results is very important, and the method for comparing the service measurement data with the numerical results is described. The global/local approach is mentioned and discussed. The effects of the complete global system on the individual components under monitoring are transformed into more accurate local spatial models. The local spatial models are used to analyze the gradual lifetime exhaustion of a facility during its service operation. Two local spatial models are presented, viz. the feed water nozzle of the SG and a T-branch of the main coolant piping system. The results of the analysis of the local spatial models are processed by a neural network computing method, which is also described. The actual gradual damage of the material of the components under monitoring can be obtained based on the analyses performed and on the results from the neural network, in combination with knowledge of the real material characteristics. The procedures applied are included in the DIALIFE diagnostic system

  8. Implementing healthier foodservice guidelines in hospital and federal worksite cafeterias: barriers, facilitators and keys to success.

    Science.gov (United States)

    Jilcott Pitts, S B; Graham, J; Mojica, A; Stewart, L; Walter, M; Schille, C; McGinty, J; Pearsall, M; Whitt, O; Mihas, P; Bradley, A; Simon, C

    2016-12-01

    Healthy foodservice guidelines are being implemented in worksites and healthcare facilities to increase access to healthy foods by employees and public populations. However, little is known about the barriers to and facilitators of implementation. The present study aimed to examine barriers to and facilitators of implementation of healthy foodservice guidelines in federal worksite and hospital cafeterias. Using a mixed-methods approach, including a quantitative survey followed by a qualitative, in-depth interview, we examined: (i) barriers to and facilitators of implementation; (ii) behavioural design strategies used to promote healthier foods and beverages; and (iii) how implementation of healthy foodservice guidelines influenced costs and profitability. We used a purposive sample of five hospital and four federal worksite foodservice operators who recently implemented one of two foodservice guidelines: the United States Department of Health and Human Services/General Services Administration Health and Sustainability Guidelines ('Guidelines') in federal worksites or the Partnership for a Healthier America Hospital Healthier Food Initiative ('Initiative') in hospitals. Descriptive statistics were used to analyse quantitative survey data. Qualitative data were analysed using a deductive approach. Implementation facilitators included leadership support, adequate vendor selections and having dietitians assist with implementation. Implementation barriers included inadequate selections from vendors, customer complaints and additional expertise required for menu labelling. Behavioural design strategies used most frequently included icons denoting healthier options, marketing using social media and placement of healthier options in prime locations. Lessons learned can guide subsequent steps for future healthy foodservice guideline implementation in similar settings. © 2016 The British Dietetic Association Ltd.

  9. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    Science.gov (United States)

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
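
    As a rough companion to the correlation procedures listed above (not G*Power's exact routines), the sketch below approximates the power of a two-sided test of a Pearson correlation using the standard Fisher z transformation; the effect size and sample size in the example are illustrative.

```python
import math
from scipy.stats import norm

def correlation_test_power(r_alt, n, alpha=0.05):
    """Approximate power of a two-sided test of H0: rho = 0 against rho = r_alt,
    using the Fisher z transformation (a large-sample approximation, not the
    exact procedure implemented in G*Power)."""
    z_alt = math.atanh(r_alt)           # Fisher z of the alternative correlation
    se = 1.0 / math.sqrt(n - 3)         # standard error of z for sample size n
    z_crit = norm.ppf(1 - alpha / 2)    # two-sided critical value
    # Probability of exceeding either critical bound under the alternative
    return norm.sf(z_crit - z_alt / se) + norm.cdf(-z_crit - z_alt / se)

print(correlation_test_power(r_alt=0.3, n=84))  # roughly 0.80 for this classic case
```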

  10. Forced vibration tests and simulation analyses of a nuclear reactor building. Part 2: simulation analyses

    International Nuclear Information System (INIS)

    Kuno, M.; Nakagawa, S.; Momma, T.; Naito, Y.; Niwa, M.; Motohashi, S.

    1995-01-01

    Forced vibration tests of a BWR-type reactor building, Hamaoka Unit 4, were performed. Valuable data on the dynamic characteristics of the soil-structure interaction system were obtained through the tests. Simulation analyses of the fundamental dynamic characteristics of the soil-structure system were conducted using a basic lumped-mass soil-structure model (lattice model), and strong correlation with the measured data was obtained. Furthermore, detailed simulation models were employed to investigate the effects of simultaneously induced vertical response and of the response of the adjacent turbine building on the lateral response of the reactor building. (author). 4 refs., 11 figs
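
    A hedged sketch of the kind of lumped-mass idealization mentioned above is given below: a two-degree-of-freedom structure-on-soil-spring model whose natural frequencies follow from a generalized eigenvalue problem. All stiffness and mass values are illustrative, not plant data, and the actual lattice model is considerably more detailed.

```python
import numpy as np
from scipy.linalg import eigh

# Two-DOF lumped-mass idealization: mass 1 = building, mass 2 = foundation on
# soil springs. Numerical values are illustrative only.
m_building, m_foundation = 5.0e7, 2.0e7   # kg
k_structure, k_soil = 4.0e10, 1.5e10      # N/m

M = np.diag([m_building, m_foundation])
K = np.array([[ k_structure,            -k_structure],
              [-k_structure, k_structure + k_soil]])

# Generalized eigenvalue problem K x = w^2 M x gives the coupled natural frequencies
eigvals, eigvecs = eigh(K, M)
frequencies_hz = np.sqrt(eigvals) / (2 * np.pi)
print(frequencies_hz)  # soil flexibility lowers the fundamental frequency vs a fixed base
```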

  11. Organizational factors affecting safety implementation in food companies in Thailand.

    Science.gov (United States)

    Chinda, Thanwadee

    2014-01-01

    The Thai food industry employs a massive number of skilled and unskilled workers. This may result in an industry with a high incidence of accidents and high accident rates. To improve safety and reduce the accident figures, this paper investigates factors influencing safety implementation in small, medium, and large food companies in Thailand. Five factors, i.e., management commitment, stakeholders' role, safety information and communication, supportive environment, and risk, are found to be important in helping to improve safety implementation. The statistical analyses also reveal that small, medium, and large food companies hold similar opinions on the risk factor, but bear different perceptions of the other 4 factors. It is also found that to improve safety implementation, the perceptions of safety goals, communication, feedback, safety resources, and supervision should be aligned in small, medium, and large companies.

  12. Lost in Implementation: EU Law Application in Albanian Legal System

    Directory of Open Access Journals (Sweden)

    Hajdini Bojana

    2017-06-01

    Full Text Available Considering the growing importance of research in the area of Europeanization in the candidate countries, the purpose of this paper is to analyse whether, and to what extent, the EU as a legal normative power has influenced Albania to approximate existing and future legislation and to ensure its proper implementation. The paper argues that the Europeanization process is pushing Albania toward greater convergence with the EU acquis by developing a modern legal framework. However, the paper points out that weak implementation has hampered the application of EU law in Albania due to: (a) a weak bureaucracy and an uneven distribution of human capacities; (b) the lack of an established practice of consultation with interest groups on specific draft legislation; and (c) the inability to put in place sound planning mechanisms and to carry out realistic assessments. The paper concludes that effective alignment of the Albanian legal system with EU norms requires cooperation between the different actors involved in the approximation and implementation process.

  13. SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.

    Science.gov (United States)

    Chu, Annie; Cui, Jenny; Dinov, Ivo D

    2009-03-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as t-test in the parametric category; and Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, in the non-parametric category. SOCR Analyses also include several hypothesis test models, such as Contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most
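
    The abstract mentions utilities such as least-squares solutions of general linear models; SOCR itself is Java-based, but the underlying calculation can be sketched in a few lines of Python (the data set here is invented, not from SOCR).

```python
import numpy as np

def linear_model_fit(X, y):
    """Least-squares solution of the general linear model y = X b + e,
    returning coefficient estimates, fitted values and residuals."""
    X = np.column_stack([np.ones(len(y)), X])     # prepend an intercept column
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # least-squares coefficients
    fitted = X @ beta
    residuals = y - fitted
    return beta, fitted, residuals

# Tiny illustrative data set
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([2.1, 4.1, 5.9, 8.2])
beta, fitted, resid = linear_model_fit(X, y)
print(beta)  # approximately [0.05, 2.01]: intercept and slope
```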

  14. Towards Implementation of Green Technology in Sabah Construction Industry

    Science.gov (United States)

    Azland Jainudin, Noor; Jugah, Ivy; Nasrizal Awang Ali, Awang; Tawie, Rudy

    2017-12-01

    The construction industry in Sabah plays a major role in the development of social and economic infrastructure and buildings, generating wealth for the state alongside the tourism sector. With the increasing number of construction projects, particularly in the rapidly developing city of Kota Kinabalu, green technology as a whole is becoming more significant as it helps to develop effective solutions to global environmental issues. The objective of the research is to identify the awareness and implementation of green technology in the construction industry in Kota Kinabalu, Sabah. The research methodology involved distributing questionnaires to contractors, developers, consultants, architects and state government agencies in the Kota Kinabalu area only. The questionnaires were analysed to determine mean values. 100 questionnaires were distributed to respondents, but only the 85 collected were analysed. Based on the findings, 83.5% of organisations were aware of the concept of green technology in construction projects, whereas only 64.7% had implemented it in their organisations. More than 50% of the major players, such as contractors, consultants, developers, architects and state government agencies, were aware of the six green technology concepts considered in their organisations. In conclusion, awareness of the green policy concept in the construction industry is very satisfactory, but the number of organisations involved in green technology in the construction industry needs to be increased.

  15. Investigating the Implementation of Robotics.

    Science.gov (United States)

    1984-02-01

    decisions are going to be made participatively and also what participation means to all involved parties. Successful implementations of new technologies... [Fragment of a table follows in the original record: communication methods (e.g., written communication, 13%, 4.3; workplace meetings) tabulated against the share of workers reporting each method and ratings of how much it increased workers' understanding.]

  16. 340 Facility maintenance implementation plan

    International Nuclear Information System (INIS)

    1995-03-01

    This Maintenance Implementation Plan (MIP) has been developed for maintenance functions associated with the 340 Facility. This plan is developed from the guidelines presented by Department of Energy (DOE) Order 4330.4B, Maintenance Management Program (DOE 1994), Chapter II. The objective of this plan is to provide baseline information for establishing and identifying Westinghouse Hanford Company (WHC) conformance programs and policies applicable to implementation of DOE order 4330.4B guidelines. In addition, this maintenance plan identifies the actions necessary to develop a cost-effective and efficient maintenance program at the 340 Facility. Primary responsibility for the performance and oversight of maintenance activities at the 340 Facility resides with Westinghouse Hanford Company (WHC). Maintenance at the 340 Facility is performed by ICF-Kaiser Hanford (ICF-KH) South Programmatic Services crafts persons. This 340 Facility MIP provides interface requirements and responsibilities as they apply specifically to the 340 Facility. This document provides an implementation schedule which has been developed for items considered to be deficient or in need of improvement. The discussion sections, as applied to implementation at the 340 Facility, have been developed from a review of programs and practices utilizing the graded approach. Biennial review and additional reviews are conducted as significant programmatic and mission changes are made. This document is revised as necessary to maintain compliance with DOE requirements

  17. Manufacturer Usage Description Specification Implementation

    OpenAIRE

    Srinivasan, Kaushik

    2017-01-01

    Manufacturer Usage Description Specification (MUDS) is a framework under RFC development that aims to automate Internet access control rules for IoT devices. These access controls prevent malicious IoT devices from attacking other devices and also protect the IoT devices from being attacked by other devices. We are implementing this framework and trying to improve its security.

  18. Implementation of the Kyoto protocol

    International Nuclear Information System (INIS)

    2006-07-01

    The Rio Earth Summit in 1992 was the starting point of an international awareness of the global risk of climate change. On that occasion, the richest countries committed themselves to stabilizing their greenhouse gas emissions and to reaching, by the year 2000, an emissions level equivalent to that of 1990. The Kyoto Protocol in 1997 made it possible to convert this will into legally binding quantitative commitments. In 2005, Russia ratified the protocol, while in 2001 the USA refused to do so. Because the commitments signed are ambitious, flexibility mechanisms have been implemented: 'emission permits' (emissions trading), 'joint implementation', which allows investments abroad for greenhouse gas abatement in another developed country, and 'clean development mechanisms', when investments are made in a developing country. The Marrakech conference of December 2001 made it possible to fix the eligibility criteria of projects belonging to the joint implementation and clean development mechanisms. The effective implementation of these mechanisms still raises technical difficulties in evaluating and measuring the effective abatement of greenhouse gas emissions. (J.S.)

  19. Implementing Site-based Budgeting.

    Science.gov (United States)

    Sielke, Catherine C.

    2001-01-01

    Discusses five questions that must be answered before implementing site-based budgeting: Why are we doing this? What budgeting decisions will be devolved to the school site? How do dollars flow from the central office to the site? Who will be involved at the site? How will accountability be achieved? (Author/PKP)

  20. 340 Facility maintenance implementation plan

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    This Maintenance Implementation Plan (MIP) has been developed for maintenance functions associated with the 340 Facility. This plan is developed from the guidelines presented by Department of Energy (DOE) Order 4330.4B, Maintenance Management Program (DOE 1994), Chapter II. The objective of this plan is to provide baseline information for establishing and identifying Westinghouse Hanford Company (WHC) conformance programs and policies applicable to implementation of DOE order 4330.4B guidelines. In addition, this maintenance plan identifies the actions necessary to develop a cost-effective and efficient maintenance program at the 340 Facility. Primary responsibility for the performance and oversight of maintenance activities at the 340 Facility resides with Westinghouse Hanford Company (WHC). Maintenance at the 340 Facility is performed by ICF-Kaiser Hanford (ICF-KH) South Programmatic Services crafts persons. This 340 Facility MIP provides interface requirements and responsibilities as they apply specifically to the 340 Facility. This document provides an implementation schedule which has been developed for items considered to be deficient or in need of improvement. The discussion sections, as applied to implementation at the 340 Facility, have been developed from a review of programs and practices utilizing the graded approach. Biennial review and additional reviews are conducted as significant programmatic and mission changes are made. This document is revised as necessary to maintain compliance with DOE requirements.